Nov 28 06:42:47 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Nov 28 06:42:47 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 28 06:42:47 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 06:42:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 28 06:42:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 28 06:42:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 28 06:42:47 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 28 06:42:47 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Nov 28 06:42:47 localhost kernel: signal: max sigframe size: 1776
Nov 28 06:42:47 localhost kernel: BIOS-provided physical RAM map:
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 28 06:42:47 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Nov 28 06:42:47 localhost kernel: NX (Execute Disable) protection: active
Nov 28 06:42:47 localhost kernel: SMBIOS 2.8 present.
Nov 28 06:42:47 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 28 06:42:47 localhost kernel: Hypervisor detected: KVM
Nov 28 06:42:47 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 28 06:42:47 localhost kernel: kvm-clock: using sched offset of 1782068177 cycles
Nov 28 06:42:47 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 28 06:42:47 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 28 06:42:47 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 28 06:42:47 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 28 06:42:47 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Nov 28 06:42:47 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 28 06:42:47 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 28 06:42:47 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 28 06:42:47 localhost kernel: Using GB pages for direct mapping
Nov 28 06:42:47 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Nov 28 06:42:47 localhost kernel: ACPI: Early table checksum verification disabled
Nov 28 06:42:47 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 28 06:42:47 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:47 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:47 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:47 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 28 06:42:47 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:47 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:47 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 28 06:42:47 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 28 06:42:47 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 28 06:42:47 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 28 06:42:47 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 28 06:42:47 localhost kernel: No NUMA configuration found
Nov 28 06:42:47 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Nov 28 06:42:47 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Nov 28 06:42:47 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Nov 28 06:42:47 localhost kernel: Zone ranges:
Nov 28 06:42:47 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 28 06:42:47 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 28 06:42:47 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Nov 28 06:42:47 localhost kernel:   Device   empty
Nov 28 06:42:47 localhost kernel: Movable zone start for each node
Nov 28 06:42:47 localhost kernel: Early memory node ranges
Nov 28 06:42:47 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 28 06:42:47 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 28 06:42:47 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Nov 28 06:42:47 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Nov 28 06:42:47 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 28 06:42:47 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 28 06:42:47 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 28 06:42:47 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 28 06:42:47 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 28 06:42:47 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 28 06:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 28 06:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 28 06:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 28 06:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 28 06:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 28 06:42:47 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 28 06:42:47 localhost kernel: TSC deadline timer available
Nov 28 06:42:47 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 28 06:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 28 06:42:47 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 28 06:42:47 localhost kernel: Booting paravirtualized kernel on KVM
Nov 28 06:42:47 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 28 06:42:47 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 28 06:42:47 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Nov 28 06:42:47 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Nov 28 06:42:47 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 28 06:42:47 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 28 06:42:47 localhost kernel: Fallback order for Node 0: 0 
Nov 28 06:42:47 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Nov 28 06:42:47 localhost kernel: Policy zone: Normal
Nov 28 06:42:47 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 06:42:47 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Nov 28 06:42:47 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Nov 28 06:42:47 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 28 06:42:47 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 28 06:42:47 localhost kernel: software IO TLB: area num 8.
Nov 28 06:42:47 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Nov 28 06:42:47 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Nov 28 06:42:47 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 28 06:42:47 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Nov 28 06:42:47 localhost kernel: ftrace: allocated 176 pages with 3 groups
Nov 28 06:42:47 localhost kernel: Dynamic Preempt: voluntary
Nov 28 06:42:47 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 28 06:42:47 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 28 06:42:47 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 28 06:42:47 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 28 06:42:47 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 28 06:42:47 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 28 06:42:47 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 28 06:42:47 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 28 06:42:47 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 28 06:42:47 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 28 06:42:47 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Nov 28 06:42:47 localhost kernel: Console: colour VGA+ 80x25
Nov 28 06:42:47 localhost kernel: printk: console [tty0] enabled
Nov 28 06:42:47 localhost kernel: printk: console [ttyS0] enabled
Nov 28 06:42:47 localhost kernel: ACPI: Core revision 20211217
Nov 28 06:42:47 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 28 06:42:47 localhost kernel: x2apic enabled
Nov 28 06:42:47 localhost kernel: Switched APIC routing to physical x2apic.
Nov 28 06:42:47 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 28 06:42:47 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 28 06:42:47 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 28 06:42:47 localhost kernel: LSM: Security Framework initializing
Nov 28 06:42:47 localhost kernel: Yama: becoming mindful.
Nov 28 06:42:47 localhost kernel: SELinux:  Initializing.
Nov 28 06:42:47 localhost kernel: LSM support for eBPF active
Nov 28 06:42:47 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 28 06:42:47 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 28 06:42:47 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 28 06:42:47 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 28 06:42:47 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 28 06:42:47 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 28 06:42:47 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 28 06:42:47 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Nov 28 06:42:47 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Nov 28 06:42:47 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 28 06:42:47 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 28 06:42:47 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 28 06:42:47 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 28 06:42:47 localhost kernel: Freeing SMP alternatives memory: 36K
Nov 28 06:42:47 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 28 06:42:47 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Nov 28 06:42:47 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 06:42:47 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 06:42:47 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 06:42:47 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 28 06:42:47 localhost kernel: ... version:                0
Nov 28 06:42:47 localhost kernel: ... bit width:              48
Nov 28 06:42:47 localhost kernel: ... generic registers:      6
Nov 28 06:42:47 localhost kernel: ... value mask:             0000ffffffffffff
Nov 28 06:42:47 localhost kernel: ... max period:             00007fffffffffff
Nov 28 06:42:47 localhost kernel: ... fixed-purpose events:   0
Nov 28 06:42:47 localhost kernel: ... event mask:             000000000000003f
Nov 28 06:42:47 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 28 06:42:47 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 28 06:42:47 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 28 06:42:47 localhost kernel: x86: Booting SMP configuration:
Nov 28 06:42:47 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 28 06:42:47 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 28 06:42:47 localhost kernel: smpboot: Max logical packages: 8
Nov 28 06:42:47 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 28 06:42:47 localhost kernel: node 0 deferred pages initialised in 22ms
Nov 28 06:42:47 localhost kernel: devtmpfs: initialized
Nov 28 06:42:47 localhost kernel: x86/mm: Memory block size: 128MB
Nov 28 06:42:47 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 28 06:42:47 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 28 06:42:47 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 28 06:42:47 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 28 06:42:47 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Nov 28 06:42:47 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 28 06:42:47 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 28 06:42:47 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 28 06:42:47 localhost kernel: audit: type=2000 audit(1764312166.498:1): state=initialized audit_enabled=0 res=1
Nov 28 06:42:47 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 28 06:42:47 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 28 06:42:47 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 28 06:42:47 localhost kernel: cpuidle: using governor menu
Nov 28 06:42:47 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Nov 28 06:42:47 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 28 06:42:47 localhost kernel: PCI: Using configuration type 1 for base access
Nov 28 06:42:47 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 28 06:42:47 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 28 06:42:47 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Nov 28 06:42:47 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Nov 28 06:42:47 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Nov 28 06:42:47 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 28 06:42:47 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 28 06:42:47 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 28 06:42:47 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 28 06:42:47 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 28 06:42:47 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Nov 28 06:42:47 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Nov 28 06:42:47 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Nov 28 06:42:47 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 28 06:42:47 localhost kernel: ACPI: Interpreter enabled
Nov 28 06:42:47 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 28 06:42:47 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 28 06:42:47 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 28 06:42:47 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 28 06:42:47 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 28 06:42:47 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 28 06:42:47 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [3] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [4] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [5] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [6] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [7] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [8] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [9] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [10] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [11] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [12] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [13] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [14] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [15] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [16] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [17] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [18] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [19] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [20] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [21] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [22] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [23] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [24] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [25] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [26] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [27] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [28] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [29] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [30] registered
Nov 28 06:42:47 localhost kernel: acpiphp: Slot [31] registered
Nov 28 06:42:47 localhost kernel: PCI host bridge to bus 0000:00
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 28 06:42:47 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 28 06:42:47 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Nov 28 06:42:47 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Nov 28 06:42:47 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Nov 28 06:42:47 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 28 06:42:47 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Nov 28 06:42:47 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Nov 28 06:42:47 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Nov 28 06:42:47 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Nov 28 06:42:47 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 28 06:42:47 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Nov 28 06:42:47 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Nov 28 06:42:47 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 28 06:42:47 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Nov 28 06:42:47 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Nov 28 06:42:47 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 28 06:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 28 06:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 28 06:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 28 06:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 28 06:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 28 06:42:47 localhost kernel: iommu: Default domain type: Translated 
Nov 28 06:42:47 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Nov 28 06:42:47 localhost kernel: SCSI subsystem initialized
Nov 28 06:42:47 localhost kernel: ACPI: bus type USB registered
Nov 28 06:42:47 localhost kernel: usbcore: registered new interface driver usbfs
Nov 28 06:42:47 localhost kernel: usbcore: registered new interface driver hub
Nov 28 06:42:47 localhost kernel: usbcore: registered new device driver usb
Nov 28 06:42:47 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 28 06:42:47 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 28 06:42:47 localhost kernel: PTP clock support registered
Nov 28 06:42:47 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 28 06:42:47 localhost kernel: NetLabel: Initializing
Nov 28 06:42:47 localhost kernel: NetLabel:  domain hash size = 128
Nov 28 06:42:47 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 28 06:42:47 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 28 06:42:47 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 28 06:42:47 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 28 06:42:47 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 28 06:42:47 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 28 06:42:47 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 28 06:42:47 localhost kernel: vgaarb: loaded
Nov 28 06:42:47 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 28 06:42:47 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 28 06:42:47 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 28 06:42:47 localhost kernel: pnp: PnP ACPI init
Nov 28 06:42:47 localhost kernel: pnp 00:03: [dma 2]
Nov 28 06:42:47 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 28 06:42:47 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 28 06:42:47 localhost kernel: NET: Registered PF_INET protocol family
Nov 28 06:42:47 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Nov 28 06:42:47 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Nov 28 06:42:47 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 28 06:42:47 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 28 06:42:47 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 28 06:42:47 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Nov 28 06:42:47 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Nov 28 06:42:47 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 28 06:42:47 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 28 06:42:47 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 28 06:42:47 localhost kernel: NET: Registered PF_XDP protocol family
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 28 06:42:47 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 28 06:42:47 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 28 06:42:47 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 28 06:42:47 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 26823 usecs
Nov 28 06:42:47 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 28 06:42:47 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 28 06:42:47 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 28 06:42:47 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 28 06:42:47 localhost kernel: ACPI: bus type thunderbolt registered
Nov 28 06:42:47 localhost kernel: Initialise system trusted keyrings
Nov 28 06:42:47 localhost kernel: Key type blacklist registered
Nov 28 06:42:47 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Nov 28 06:42:47 localhost kernel: zbud: loaded
Nov 28 06:42:47 localhost kernel: integrity: Platform Keyring initialized
Nov 28 06:42:47 localhost kernel: NET: Registered PF_ALG protocol family
Nov 28 06:42:47 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 28 06:42:47 localhost kernel: Key type asymmetric registered
Nov 28 06:42:47 localhost kernel: Asymmetric key parser 'x509' registered
Nov 28 06:42:47 localhost kernel: Running certificate verification selftests
Nov 28 06:42:47 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 28 06:42:47 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 28 06:42:47 localhost kernel: io scheduler mq-deadline registered
Nov 28 06:42:47 localhost kernel: io scheduler kyber registered
Nov 28 06:42:47 localhost kernel: io scheduler bfq registered
Nov 28 06:42:47 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 28 06:42:47 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 28 06:42:47 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 28 06:42:47 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 28 06:42:47 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 28 06:42:47 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 28 06:42:47 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 28 06:42:47 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 28 06:42:47 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 28 06:42:47 localhost kernel: Non-volatile memory driver v1.3
Nov 28 06:42:47 localhost kernel: rdac: device handler registered
Nov 28 06:42:47 localhost kernel: hp_sw: device handler registered
Nov 28 06:42:47 localhost kernel: emc: device handler registered
Nov 28 06:42:47 localhost kernel: alua: device handler registered
Nov 28 06:42:47 localhost kernel: libphy: Fixed MDIO Bus: probed
Nov 28 06:42:47 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Nov 28 06:42:47 localhost kernel: ehci-pci: EHCI PCI platform driver
Nov 28 06:42:47 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Nov 28 06:42:47 localhost kernel: ohci-pci: OHCI PCI platform driver
Nov 28 06:42:47 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Nov 28 06:42:47 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 28 06:42:47 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 28 06:42:47 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 28 06:42:47 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 28 06:42:47 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 28 06:42:47 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 28 06:42:47 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 28 06:42:47 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Nov 28 06:42:47 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 28 06:42:47 localhost kernel: hub 1-0:1.0: USB hub found
Nov 28 06:42:47 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 28 06:42:47 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 28 06:42:47 localhost kernel: usbserial: USB Serial support registered for generic
Nov 28 06:42:47 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 28 06:42:47 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 28 06:42:47 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 28 06:42:47 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 28 06:42:47 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 28 06:42:47 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 28 06:42:47 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 28 06:42:47 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 28 06:42:47 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-28T06:42:46 UTC (1764312166)
Nov 28 06:42:47 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 28 06:42:47 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 28 06:42:47 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 28 06:42:47 localhost kernel: usbcore: registered new interface driver usbhid
Nov 28 06:42:47 localhost kernel: usbhid: USB HID core driver
Nov 28 06:42:47 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 28 06:42:47 localhost kernel: Initializing XFRM netlink socket
Nov 28 06:42:47 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 28 06:42:47 localhost kernel: Segment Routing with IPv6
Nov 28 06:42:47 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 28 06:42:47 localhost kernel: mpls_gso: MPLS GSO support
Nov 28 06:42:47 localhost kernel: IPI shorthand broadcast: enabled
Nov 28 06:42:47 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 28 06:42:47 localhost kernel: AES CTR mode by8 optimization enabled
Nov 28 06:42:47 localhost kernel: sched_clock: Marking stable (713034875, 173887832)->(1009444400, -122521693)
Nov 28 06:42:47 localhost kernel: registered taskstats version 1
Nov 28 06:42:47 localhost kernel: Loading compiled-in X.509 certificates
Nov 28 06:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 28 06:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 28 06:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 28 06:42:47 localhost kernel: zswap: loaded using pool lzo/zbud
Nov 28 06:42:47 localhost kernel: page_owner is disabled
Nov 28 06:42:47 localhost kernel: Key type big_key registered
Nov 28 06:42:47 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 28 06:42:47 localhost kernel: Freeing initrd memory: 74232K
Nov 28 06:42:47 localhost kernel: Key type encrypted registered
Nov 28 06:42:47 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 28 06:42:47 localhost kernel: Loading compiled-in module X.509 certificates
Nov 28 06:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 28 06:42:47 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 28 06:42:47 localhost kernel: ima: No architecture policies found
Nov 28 06:42:47 localhost kernel: evm: Initialising EVM extended attributes:
Nov 28 06:42:47 localhost kernel: evm: security.selinux
Nov 28 06:42:47 localhost kernel: evm: security.SMACK64 (disabled)
Nov 28 06:42:47 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 28 06:42:47 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 28 06:42:47 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 28 06:42:47 localhost kernel: evm: security.apparmor (disabled)
Nov 28 06:42:47 localhost kernel: evm: security.ima
Nov 28 06:42:47 localhost kernel: evm: security.capability
Nov 28 06:42:47 localhost kernel: evm: HMAC attrs: 0x1
Nov 28 06:42:47 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 28 06:42:47 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 28 06:42:47 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 28 06:42:47 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 28 06:42:47 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 28 06:42:47 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 28 06:42:47 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 28 06:42:47 localhost kernel: Freeing unused decrypted memory: 2036K
Nov 28 06:42:47 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Nov 28 06:42:47 localhost kernel: Write protecting the kernel read-only data: 26624k
Nov 28 06:42:47 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Nov 28 06:42:47 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Nov 28 06:42:47 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 28 06:42:47 localhost kernel: Run /init as init process
Nov 28 06:42:47 localhost kernel:   with arguments:
Nov 28 06:42:47 localhost kernel:     /init
Nov 28 06:42:47 localhost kernel:   with environment:
Nov 28 06:42:47 localhost kernel:     HOME=/
Nov 28 06:42:47 localhost kernel:     TERM=linux
Nov 28 06:42:47 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Nov 28 06:42:47 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 06:42:47 localhost systemd[1]: Detected virtualization kvm.
Nov 28 06:42:47 localhost systemd[1]: Detected architecture x86-64.
Nov 28 06:42:47 localhost systemd[1]: Running in initrd.
Nov 28 06:42:47 localhost systemd[1]: No hostname configured, using default hostname.
Nov 28 06:42:47 localhost systemd[1]: Hostname set to <localhost>.
Nov 28 06:42:47 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 28 06:42:47 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 28 06:42:47 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 06:42:47 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 28 06:42:47 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 28 06:42:47 localhost systemd[1]: Reached target Local File Systems.
Nov 28 06:42:47 localhost systemd[1]: Reached target Path Units.
Nov 28 06:42:47 localhost systemd[1]: Reached target Slice Units.
Nov 28 06:42:47 localhost systemd[1]: Reached target Swaps.
Nov 28 06:42:47 localhost systemd[1]: Reached target Timer Units.
Nov 28 06:42:47 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 06:42:47 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 28 06:42:47 localhost systemd[1]: Listening on Journal Socket.
Nov 28 06:42:47 localhost systemd[1]: Listening on udev Control Socket.
Nov 28 06:42:47 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 28 06:42:47 localhost systemd[1]: Reached target Socket Units.
Nov 28 06:42:47 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 28 06:42:47 localhost systemd[1]: Starting Journal Service...
Nov 28 06:42:47 localhost systemd[1]: Starting Load Kernel Modules...
Nov 28 06:42:47 localhost systemd[1]: Starting Create System Users...
Nov 28 06:42:47 localhost systemd[1]: Starting Setup Virtual Console...
Nov 28 06:42:47 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 28 06:42:47 localhost systemd[1]: Finished Load Kernel Modules.
Nov 28 06:42:47 localhost systemd-journald[284]: Journal started
Nov 28 06:42:47 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/eb468aede0e94528988f9267a3530b7a) is 8.0M, max 314.7M, 306.7M free.
Nov 28 06:42:47 localhost systemd-modules-load[285]: Module 'msr' is built in
Nov 28 06:42:47 localhost systemd[1]: Started Journal Service.
Nov 28 06:42:47 localhost systemd[1]: Finished Setup Virtual Console.
Nov 28 06:42:47 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 28 06:42:47 localhost systemd[1]: Starting dracut cmdline hook...
Nov 28 06:42:47 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 28 06:42:47 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Nov 28 06:42:47 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Nov 28 06:42:47 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Nov 28 06:42:47 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 28 06:42:47 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 28 06:42:47 localhost systemd[1]: Finished Create System Users.
Nov 28 06:42:47 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 06:42:47 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 06:42:47 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Nov 28 06:42:47 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 06:42:47 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 06:42:47 localhost dracut-cmdline[289]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 06:42:47 localhost systemd[1]: Finished dracut cmdline hook.
Nov 28 06:42:47 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 28 06:42:47 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 28 06:42:47 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 28 06:42:47 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Nov 28 06:42:47 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 28 06:42:47 localhost kernel: RPC: Registered udp transport module.
Nov 28 06:42:47 localhost kernel: RPC: Registered tcp transport module.
Nov 28 06:42:47 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 28 06:42:47 localhost rpc.statd[408]: Version 2.5.4 starting
Nov 28 06:42:47 localhost rpc.statd[408]: Initializing NSM state
Nov 28 06:42:47 localhost rpc.idmapd[413]: Setting log level to 0
Nov 28 06:42:47 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 28 06:42:47 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 06:42:47 localhost systemd-udevd[426]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 06:42:47 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 06:42:47 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 28 06:42:47 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 28 06:42:47 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 28 06:42:47 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 28 06:42:47 localhost systemd[1]: Reached target System Initialization.
Nov 28 06:42:47 localhost systemd[1]: Reached target Basic System.
Nov 28 06:42:47 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 06:42:47 localhost systemd[1]: Reached target Network.
Nov 28 06:42:47 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 06:42:47 localhost systemd[1]: Starting dracut initqueue hook...
Nov 28 06:42:47 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Nov 28 06:42:48 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Nov 28 06:42:48 localhost kernel: GPT:20971519 != 838860799
Nov 28 06:42:48 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Nov 28 06:42:48 localhost kernel: GPT:20971519 != 838860799
Nov 28 06:42:48 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Nov 28 06:42:48 localhost kernel:  vda: vda1 vda2 vda3 vda4
Nov 28 06:42:48 localhost kernel: libata version 3.00 loaded.
Nov 28 06:42:48 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 28 06:42:48 localhost kernel: scsi host0: ata_piix
Nov 28 06:42:48 localhost kernel: scsi host1: ata_piix
Nov 28 06:42:48 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Nov 28 06:42:48 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Nov 28 06:42:48 localhost systemd-udevd[471]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 06:42:48 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 28 06:42:48 localhost systemd[1]: Reached target Initrd Root Device.
Nov 28 06:42:48 localhost kernel: ata1: found unknown device (class 0)
Nov 28 06:42:48 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 28 06:42:48 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 28 06:42:48 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 28 06:42:48 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 28 06:42:48 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 28 06:42:48 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 28 06:42:48 localhost systemd[1]: Finished dracut initqueue hook.
Nov 28 06:42:48 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 06:42:48 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 28 06:42:48 localhost systemd[1]: Reached target Remote File Systems.
Nov 28 06:42:48 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 28 06:42:48 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 28 06:42:48 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Nov 28 06:42:48 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Nov 28 06:42:48 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 28 06:42:48 localhost systemd[1]: Mounting /sysroot...
Nov 28 06:42:48 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 28 06:42:48 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Nov 28 06:42:48 localhost kernel: XFS (vda4): Ending clean mount
Nov 28 06:42:48 localhost systemd[1]: Mounted /sysroot.
Nov 28 06:42:48 localhost systemd[1]: Reached target Initrd Root File System.
Nov 28 06:42:48 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 28 06:42:48 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 28 06:42:48 localhost systemd[1]: Reached target Initrd File Systems.
Nov 28 06:42:48 localhost systemd[1]: Reached target Initrd Default Target.
Nov 28 06:42:48 localhost systemd[1]: Starting dracut mount hook...
Nov 28 06:42:48 localhost systemd[1]: Finished dracut mount hook.
Nov 28 06:42:48 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 28 06:42:48 localhost rpc.idmapd[413]: exiting on signal 15
Nov 28 06:42:48 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 28 06:42:48 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 28 06:42:48 localhost systemd[1]: Stopped target Network.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Timer Units.
Nov 28 06:42:48 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 28 06:42:48 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Basic System.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Path Units.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Remote File Systems.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Slice Units.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Socket Units.
Nov 28 06:42:48 localhost systemd[1]: Stopped target System Initialization.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Local File Systems.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Swaps.
Nov 28 06:42:48 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped dracut mount hook.
Nov 28 06:42:48 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 28 06:42:48 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 28 06:42:48 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 28 06:42:48 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 28 06:42:48 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 28 06:42:48 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 28 06:42:48 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 28 06:42:48 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 28 06:42:48 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 28 06:42:48 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 06:42:48 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 28 06:42:48 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 06:42:48 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Closed udev Control Socket.
Nov 28 06:42:48 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Closed udev Kernel Socket.
Nov 28 06:42:48 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 28 06:42:48 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 28 06:42:48 localhost systemd[1]: Starting Cleanup udev Database...
Nov 28 06:42:48 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 28 06:42:48 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 28 06:42:49 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 28 06:42:49 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Stopped Create System Users.
Nov 28 06:42:49 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 28 06:42:49 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Finished Cleanup udev Database.
Nov 28 06:42:49 localhost systemd[1]: Reached target Switch Root.
Nov 28 06:42:49 localhost systemd[1]: Starting Switch Root...
Nov 28 06:42:49 localhost systemd[1]: Switching root.
Nov 28 06:42:49 localhost systemd-journald[284]: Journal stopped
Nov 28 06:42:49 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Nov 28 06:42:49 localhost kernel: audit: type=1404 audit(1764312169.104:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 28 06:42:49 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 06:42:49 localhost kernel: SELinux:  policy capability open_perms=1
Nov 28 06:42:49 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 06:42:49 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 28 06:42:49 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 06:42:49 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 06:42:49 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 06:42:49 localhost kernel: audit: type=1403 audit(1764312169.185:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 28 06:42:49 localhost systemd[1]: Successfully loaded SELinux policy in 83.239ms.
Nov 28 06:42:49 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.636ms.
Nov 28 06:42:49 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 06:42:49 localhost systemd[1]: Detected virtualization kvm.
Nov 28 06:42:49 localhost systemd[1]: Detected architecture x86-64.
Nov 28 06:42:49 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 06:42:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 06:42:49 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Stopped Switch Root.
Nov 28 06:42:49 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 28 06:42:49 localhost systemd[1]: Created slice Slice /system/getty.
Nov 28 06:42:49 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 28 06:42:49 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 28 06:42:49 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 28 06:42:49 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Nov 28 06:42:49 localhost systemd[1]: Created slice User and Session Slice.
Nov 28 06:42:49 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 06:42:49 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 28 06:42:49 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 28 06:42:49 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 28 06:42:49 localhost systemd[1]: Stopped target Switch Root.
Nov 28 06:42:49 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 28 06:42:49 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 28 06:42:49 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 28 06:42:49 localhost systemd[1]: Reached target Path Units.
Nov 28 06:42:49 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 28 06:42:49 localhost systemd[1]: Reached target Slice Units.
Nov 28 06:42:49 localhost systemd[1]: Reached target Swaps.
Nov 28 06:42:49 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 28 06:42:49 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 28 06:42:49 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 28 06:42:49 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 28 06:42:49 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 28 06:42:49 localhost systemd[1]: Listening on udev Control Socket.
Nov 28 06:42:49 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 28 06:42:49 localhost systemd[1]: Mounting Huge Pages File System...
Nov 28 06:42:49 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 28 06:42:49 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 28 06:42:49 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 28 06:42:49 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 06:42:49 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 28 06:42:49 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 28 06:42:49 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 28 06:42:49 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 28 06:42:49 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 28 06:42:49 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 28 06:42:49 localhost systemd[1]: Stopped Journal Service.
Nov 28 06:42:49 localhost kernel: fuse: init (API version 7.36)
Nov 28 06:42:49 localhost systemd[1]: Starting Journal Service...
Nov 28 06:42:49 localhost systemd[1]: Starting Load Kernel Modules...
Nov 28 06:42:49 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 28 06:42:49 localhost kernel: ACPI: bus type drm_connector registered
Nov 28 06:42:49 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 28 06:42:49 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 28 06:42:49 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 28 06:42:49 localhost systemd-journald[618]: Journal started
Nov 28 06:42:49 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 8.0M, max 314.7M, 306.7M free.
Nov 28 06:42:49 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 28 06:42:49 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 28 06:42:49 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd-modules-load[619]: Module 'msr' is built in
Nov 28 06:42:49 localhost systemd[1]: Started Journal Service.
Nov 28 06:42:49 localhost systemd[1]: Mounted Huge Pages File System.
Nov 28 06:42:49 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 28 06:42:49 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 28 06:42:49 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 28 06:42:49 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 28 06:42:49 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 28 06:42:49 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 28 06:42:49 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 28 06:42:49 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 28 06:42:49 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 28 06:42:49 localhost systemd[1]: Finished Load Kernel Modules.
Nov 28 06:42:49 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 28 06:42:49 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 28 06:42:49 localhost systemd[1]: Mounting FUSE Control File System...
Nov 28 06:42:49 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 28 06:42:49 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 06:42:49 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 28 06:42:49 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 28 06:42:49 localhost systemd[1]: Starting Load/Save Random Seed...
Nov 28 06:42:49 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 28 06:42:49 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 8.0M, max 314.7M, 306.7M free.
Nov 28 06:42:49 localhost systemd-journald[618]: Received client request to flush runtime journal.
Nov 28 06:42:49 localhost systemd[1]: Starting Create System Users...
Nov 28 06:42:49 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 28 06:42:49 localhost systemd[1]: Mounted FUSE Control File System.
Nov 28 06:42:49 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 28 06:42:49 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 28 06:42:49 localhost systemd[1]: Finished Load/Save Random Seed.
Nov 28 06:42:49 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 28 06:42:49 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 06:42:49 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989.
Nov 28 06:42:49 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988.
Nov 28 06:42:49 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Nov 28 06:42:49 localhost systemd[1]: Finished Create System Users.
Nov 28 06:42:49 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 06:42:49 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 06:42:49 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 28 06:42:49 localhost systemd[1]: Set up automount EFI System Partition Automount.
Nov 28 06:42:50 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 28 06:42:50 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 06:42:50 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 06:42:50 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 06:42:50 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 28 06:42:50 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 06:42:50 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 28 06:42:50 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 28 06:42:50 localhost systemd-udevd[651]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 06:42:50 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Nov 28 06:42:50 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Nov 28 06:42:50 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Nov 28 06:42:50 localhost systemd[1]: Mounting /boot...
Nov 28 06:42:50 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Nov 28 06:42:50 localhost systemd-fsck[686]: fsck.fat 4.2 (2021-01-31)
Nov 28 06:42:50 localhost systemd-fsck[686]: /dev/vda2: 12 files, 1782/51145 clusters
Nov 28 06:42:50 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Nov 28 06:42:50 localhost kernel: XFS (vda3): Ending clean mount
Nov 28 06:42:50 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Nov 28 06:42:50 localhost systemd[1]: Mounted /boot.
Nov 28 06:42:50 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 28 06:42:50 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 28 06:42:50 localhost kernel: SVM: TSC scaling supported
Nov 28 06:42:50 localhost kernel: kvm: Nested Virtualization enabled
Nov 28 06:42:50 localhost kernel: SVM: kvm: Nested Paging enabled
Nov 28 06:42:50 localhost kernel: SVM: LBR virtualization supported
Nov 28 06:42:50 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 28 06:42:50 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 28 06:42:50 localhost kernel: Console: switching to colour dummy device 80x25
Nov 28 06:42:50 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 28 06:42:50 localhost kernel: [drm] features: -context_init
Nov 28 06:42:50 localhost kernel: [drm] number of scanouts: 1
Nov 28 06:42:50 localhost kernel: [drm] number of cap sets: 0
Nov 28 06:42:50 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Nov 28 06:42:50 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Nov 28 06:42:50 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 28 06:42:50 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 28 06:42:50 localhost systemd[1]: Mounting /boot/efi...
Nov 28 06:42:50 localhost systemd[1]: Mounted /boot/efi.
Nov 28 06:42:50 localhost systemd[1]: Reached target Local File Systems.
Nov 28 06:42:50 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 28 06:42:50 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 28 06:42:50 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 28 06:42:50 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 06:42:50 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 28 06:42:50 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 28 06:42:50 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 06:42:50 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 712 (bootctl)
Nov 28 06:42:50 localhost systemd[1]: Starting File System Check on /dev/vda2...
Nov 28 06:42:50 localhost systemd[1]: Finished File System Check on /dev/vda2.
Nov 28 06:42:50 localhost systemd[1]: Mounting EFI System Partition Automount...
Nov 28 06:42:50 localhost systemd[1]: Mounted EFI System Partition Automount.
Nov 28 06:42:50 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 28 06:42:50 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 06:42:50 localhost systemd[1]: Starting Security Auditing Service...
Nov 28 06:42:50 localhost systemd[1]: Starting RPC Bind...
Nov 28 06:42:50 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 28 06:42:50 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 28 06:42:50 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Nov 28 06:42:50 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Nov 28 06:42:50 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 28 06:42:50 localhost systemd[1]: Starting Update is Completed...
Nov 28 06:42:50 localhost systemd[1]: Finished Update is Completed.
Nov 28 06:42:50 localhost systemd[1]: Started RPC Bind.
Nov 28 06:42:50 localhost augenrules[730]: /sbin/augenrules: No change
Nov 28 06:42:50 localhost augenrules[741]: No rules
Nov 28 06:42:50 localhost augenrules[741]: enabled 1
Nov 28 06:42:50 localhost augenrules[741]: failure 1
Nov 28 06:42:50 localhost augenrules[741]: pid 725
Nov 28 06:42:50 localhost augenrules[741]: rate_limit 0
Nov 28 06:42:50 localhost augenrules[741]: backlog_limit 8192
Nov 28 06:42:50 localhost augenrules[741]: lost 0
Nov 28 06:42:50 localhost augenrules[741]: backlog 2
Nov 28 06:42:50 localhost augenrules[741]: backlog_wait_time 60000
Nov 28 06:42:50 localhost augenrules[741]: backlog_wait_time_actual 0
Nov 28 06:42:50 localhost augenrules[741]: enabled 1
Nov 28 06:42:50 localhost augenrules[741]: failure 1
Nov 28 06:42:50 localhost augenrules[741]: pid 725
Nov 28 06:42:50 localhost augenrules[741]: rate_limit 0
Nov 28 06:42:50 localhost augenrules[741]: backlog_limit 8192
Nov 28 06:42:50 localhost augenrules[741]: lost 0
Nov 28 06:42:50 localhost augenrules[741]: backlog 0
Nov 28 06:42:50 localhost augenrules[741]: backlog_wait_time 60000
Nov 28 06:42:50 localhost augenrules[741]: backlog_wait_time_actual 0
Nov 28 06:42:50 localhost augenrules[741]: enabled 1
Nov 28 06:42:50 localhost augenrules[741]: failure 1
Nov 28 06:42:50 localhost augenrules[741]: pid 725
Nov 28 06:42:50 localhost augenrules[741]: rate_limit 0
Nov 28 06:42:50 localhost augenrules[741]: backlog_limit 8192
Nov 28 06:42:50 localhost augenrules[741]: lost 0
Nov 28 06:42:50 localhost augenrules[741]: backlog 0
Nov 28 06:42:50 localhost augenrules[741]: backlog_wait_time 60000
Nov 28 06:42:50 localhost augenrules[741]: backlog_wait_time_actual 0
Nov 28 06:42:50 localhost systemd[1]: Started Security Auditing Service.
Nov 28 06:42:50 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 28 06:42:50 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 28 06:42:50 localhost systemd[1]: Reached target System Initialization.
Nov 28 06:42:50 localhost systemd[1]: Started dnf makecache --timer.
Nov 28 06:42:50 localhost systemd[1]: Started Daily rotation of log files.
Nov 28 06:42:50 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 28 06:42:50 localhost systemd[1]: Reached target Timer Units.
Nov 28 06:42:50 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 06:42:50 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 28 06:42:50 localhost systemd[1]: Reached target Socket Units.
Nov 28 06:42:50 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Nov 28 06:42:51 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 28 06:42:51 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 06:42:51 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 28 06:42:51 localhost systemd[1]: Reached target Basic System.
Nov 28 06:42:51 localhost dbus-broker-lau[750]: Ready
Nov 28 06:42:51 localhost systemd[1]: Starting NTP client/server...
Nov 28 06:42:51 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 28 06:42:51 localhost systemd[1]: Started irqbalance daemon.
Nov 28 06:42:51 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 28 06:42:51 localhost systemd[1]: Starting System Logging Service...
Nov 28 06:42:51 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 06:42:51 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 06:42:51 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 06:42:51 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 28 06:42:51 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 28 06:42:51 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 28 06:42:51 localhost rsyslogd[759]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="759" x-info="https://www.rsyslog.com"] start
Nov 28 06:42:51 localhost rsyslogd[759]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Nov 28 06:42:51 localhost systemd[1]: Starting User Login Management...
Nov 28 06:42:51 localhost systemd[1]: Started System Logging Service.
Nov 28 06:42:51 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 28 06:42:51 localhost chronyd[766]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 06:42:51 localhost chronyd[766]: Using right/UTC timezone to obtain leap second data
Nov 28 06:42:51 localhost chronyd[766]: Loaded seccomp filter (level 2)
Nov 28 06:42:51 localhost systemd[1]: Started NTP client/server.
Nov 28 06:42:51 localhost systemd-logind[764]: New seat seat0.
Nov 28 06:42:51 localhost systemd-logind[764]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 06:42:51 localhost systemd-logind[764]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 06:42:51 localhost systemd[1]: Started User Login Management.
Nov 28 06:42:51 localhost rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 06:42:51 localhost cloud-init[770]: Cloud-init v. 22.1-9.el9 running 'init-local' at Fri, 28 Nov 2025 06:42:51 +0000. Up 5.50 seconds.
Nov 28 06:42:51 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 28 06:42:51 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 28 06:42:51 localhost systemd[1]: Starting Hostname Service...
Nov 28 06:42:51 localhost systemd[1]: Started Hostname Service.
Nov 28 06:42:51 np0005538513.novalocal systemd-hostnamed[784]: Hostname set to <np0005538513.novalocal> (static)
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: run-cloud\x2dinit-tmp-tmpi7f25n2f.mount: Deactivated successfully.
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Reached target Preparation for Network.
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Starting Network Manager...
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8124] NetworkManager (version 1.42.2-1.el9) is starting... (boot:590d17e7-bf7a-4d44-b812-a5de06abfb1f)
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8130] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Started Network Manager.
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8163] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Reached target Network.
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8222] manager[0x55dedc308020]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8272] hostname: hostname: using hostnamed
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8273] hostname: static hostname changed from (none) to "np0005538513.novalocal"
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8281] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8438] manager[0x55dedc308020]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8439] manager[0x55dedc308020]: rfkill: WWAN hardware radio set enabled
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8475] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8476] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8478] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8478] manager: Networking is enabled by state file
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8489] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8490] settings: Loaded settings plugin: keyfile (internal)
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8508] dhcp: init: Using DHCP client 'internal'
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8510] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8520] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8526] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8532] device (lo): Activation: starting connection 'lo' (dc22fba5-a55e-4101-8dc2-18071340ca35)
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8538] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8541] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8567] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8569] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8570] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8572] device (eth0): carrier: link connected
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8574] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8578] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8582] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8586] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8586] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8589] manager: NetworkManager state is now CONNECTING
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8590] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8595] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8597] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8657] dhcp4 (eth0): state changed new lease, address=38.102.83.64
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8659] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8673] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Reached target NFS client services.
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Reached target Remote File Systems.
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8911] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8914] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8916] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8925] device (lo): Activation: successful, device activated.
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8933] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8938] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8942] device (eth0): Activation: successful, device activated.
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8950] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 06:42:51 np0005538513.novalocal NetworkManager[789]: <info>  [1764312171.8956] manager: startup complete
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 28 06:42:51 np0005538513.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: Cloud-init v. 22.1-9.el9 running 'init' at Fri, 28 Nov 2025 06:42:52 +0000. Up 6.27 seconds.
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |  eth0  | True |         38.102.83.64         | 255.255.255.0 | global | fa:16:3e:b0:25:93 |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |  eth0  | True | fe80::f816:3eff:feb0:2593/64 |       .       |  link  | fa:16:3e:b0:25:93 |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 28 06:42:52 np0005538513.novalocal cloud-init[973]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 06:42:52 np0005538513.novalocal systemd[1]: Starting Authorization Manager...
Nov 28 06:42:52 np0005538513.novalocal polkitd[1036]: Started polkitd version 0.117
Nov 28 06:42:52 np0005538513.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 06:42:52 np0005538513.novalocal polkitd[1036]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 06:42:52 np0005538513.novalocal polkitd[1036]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 06:42:52 np0005538513.novalocal polkitd[1036]: Finished loading, compiling and executing 4 rules
Nov 28 06:42:52 np0005538513.novalocal systemd[1]: Started Authorization Manager.
Nov 28 06:42:52 np0005538513.novalocal polkitd[1036]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 28 06:42:53 np0005538513.novalocal useradd[1115]: new group: name=cloud-user, GID=1001
Nov 28 06:42:53 np0005538513.novalocal useradd[1115]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 28 06:42:53 np0005538513.novalocal useradd[1115]: add 'cloud-user' to group 'adm'
Nov 28 06:42:53 np0005538513.novalocal useradd[1115]: add 'cloud-user' to group 'systemd-journal'
Nov 28 06:42:53 np0005538513.novalocal useradd[1115]: add 'cloud-user' to shadow group 'adm'
Nov 28 06:42:53 np0005538513.novalocal useradd[1115]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Generating public/private rsa key pair.
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: The key fingerprint is:
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: SHA256:DeX7Gx9a/pmi8Yqbt882QwrPSlorkSrd97q5V7PdpFs root@np0005538513.novalocal
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: The key's randomart image is:
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: +---[RSA 3072]----+
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |          .      |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |         o       |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |        . .      |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |         o .     |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |        S o      |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |       o . . +  .|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |    . o .o+.* *oE|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |   . o o+.=*+&.++|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |    .  .o@X=O+B=.|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: +----[SHA256]-----+
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Generating public/private ecdsa key pair.
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: The key fingerprint is:
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: SHA256:fkEbeM8urwSNUDsbm5W9dv+B4EWJfWI0nQZ1cTyF0wk root@np0005538513.novalocal
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: The key's randomart image is:
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: +---[ECDSA 256]---+
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |        .    E=*O|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |       . o o+ =*=|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |      . = *..*.o.|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |       . % =o.o  |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |        S +.=..  |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |       . ..+o... |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |        . +... ..|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |         o o    o|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |          ...   .|
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: +----[SHA256]-----+
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Generating public/private ed25519 key pair.
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: The key fingerprint is:
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: SHA256:0SVodquVirdK6O2R8rTErbiu2+OzC15/7xgxqM68IjA root@np0005538513.novalocal
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: The key's randomart image is:
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: +--[ED25519 256]--+
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |         .. .    |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |        +..o     |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |       o...o     |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |       . .+      |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |      ..S+       |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |E    +.o+o       |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |... = B.o.       |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: |...OoX +oo       |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: | .+*#OOo.oo      |
Nov 28 06:42:54 np0005538513.novalocal cloud-init[973]: +----[SHA256]-----+
Nov 28 06:42:54 np0005538513.novalocal sm-notify[1128]: Version 2.5.4 starting
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Reached target Network is Online.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Starting Permit User Sessions...
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Finished Permit User Sessions.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Started Command Scheduler.
Nov 28 06:42:54 np0005538513.novalocal sshd[1129]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Started Getty on tty1.
Nov 28 06:42:54 np0005538513.novalocal crond[1132]: (CRON) STARTUP (1.5.7)
Nov 28 06:42:54 np0005538513.novalocal crond[1132]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 28 06:42:54 np0005538513.novalocal crond[1132]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 66% if used.)
Nov 28 06:42:54 np0005538513.novalocal crond[1132]: (CRON) INFO (running with inotify support)
Nov 28 06:42:54 np0005538513.novalocal sshd[1129]: Server listening on 0.0.0.0 port 22.
Nov 28 06:42:54 np0005538513.novalocal sshd[1129]: Server listening on :: port 22.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Reached target Login Prompts.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Reached target Multi-User System.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 28 06:42:54 np0005538513.novalocal sshd[1146]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 28 06:42:54 np0005538513.novalocal sshd[1165]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1165]: Unable to negotiate with 38.102.83.114 port 33834: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 28 06:42:54 np0005538513.novalocal sshd[1173]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1185]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1185]: Unable to negotiate with 38.102.83.114 port 33846: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 28 06:42:54 np0005538513.novalocal kdumpctl[1133]: kdump: No kdump initial ramdisk found.
Nov 28 06:42:54 np0005538513.novalocal kdumpctl[1133]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Nov 28 06:42:54 np0005538513.novalocal sshd[1190]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1190]: Unable to negotiate with 38.102.83.114 port 33850: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 28 06:42:54 np0005538513.novalocal sshd[1195]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1208]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1146]: Connection closed by 38.102.83.114 port 33818 [preauth]
Nov 28 06:42:54 np0005538513.novalocal sshd[1227]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1227]: fatal: mm_answer_sign: sign: error in libcrypto
Nov 28 06:42:54 np0005538513.novalocal sshd[1173]: Connection closed by 38.102.83.114 port 33838 [preauth]
Nov 28 06:42:54 np0005538513.novalocal sshd[1237]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:42:54 np0005538513.novalocal sshd[1237]: Unable to negotiate with 38.102.83.114 port 33898: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1266]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Fri, 28 Nov 2025 06:42:54 +0000. Up 8.49 seconds.
Nov 28 06:42:54 np0005538513.novalocal sshd[1195]: Connection closed by 38.102.83.114 port 33862 [preauth]
Nov 28 06:42:54 np0005538513.novalocal sshd[1208]: Connection closed by 38.102.83.114 port 33872 [preauth]
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Nov 28 06:42:54 np0005538513.novalocal dracut[1433]: dracut-057-21.git20230214.el9
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1437]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Fri, 28 Nov 2025 06:42:54 +0000. Up 8.85 seconds.
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1451]: #############################################################
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1453]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1457]: 256 SHA256:fkEbeM8urwSNUDsbm5W9dv+B4EWJfWI0nQZ1cTyF0wk root@np0005538513.novalocal (ECDSA)
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1461]: 256 SHA256:0SVodquVirdK6O2R8rTErbiu2+OzC15/7xgxqM68IjA root@np0005538513.novalocal (ED25519)
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1468]: 3072 SHA256:DeX7Gx9a/pmi8Yqbt882QwrPSlorkSrd97q5V7PdpFs root@np0005538513.novalocal (RSA)
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1472]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1475]: #############################################################
Nov 28 06:42:54 np0005538513.novalocal cloud-init[1437]: Cloud-init v. 22.1-9.el9 finished at Fri, 28 Nov 2025 06:42:54 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.08 seconds
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 06:42:54 np0005538513.novalocal systemd[1]: Reloading Network Manager...
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 06:42:54 np0005538513.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 06:42:55 np0005538513.novalocal NetworkManager[789]: <info>  [1764312175.0024] audit: op="reload" arg="0" pid=1589 uid=0 result="success"
Nov 28 06:42:55 np0005538513.novalocal NetworkManager[789]: <info>  [1764312175.0031] config: signal: SIGHUP (no changes from disk)
Nov 28 06:42:55 np0005538513.novalocal systemd[1]: Reloaded Network Manager.
Nov 28 06:42:55 np0005538513.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 06:42:55 np0005538513.novalocal systemd[1]: Reached target Cloud-init target.
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: memstrack is not available
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: memstrack is not available
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 06:42:55 np0005538513.novalocal dracut[1435]: *** Including module: systemd ***
Nov 28 06:42:56 np0005538513.novalocal dracut[1435]: *** Including module: systemd-initrd ***
Nov 28 06:42:56 np0005538513.novalocal dracut[1435]: *** Including module: i18n ***
Nov 28 06:42:56 np0005538513.novalocal dracut[1435]: No KEYMAP configured.
Nov 28 06:42:56 np0005538513.novalocal dracut[1435]: *** Including module: drm ***
Nov 28 06:42:56 np0005538513.novalocal dracut[1435]: *** Including module: prefixdevname ***
Nov 28 06:42:56 np0005538513.novalocal dracut[1435]: *** Including module: kernel-modules ***
Nov 28 06:42:56 np0005538513.novalocal chronyd[766]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Nov 28 06:42:56 np0005538513.novalocal chronyd[766]: System clock TAI offset set to 37 seconds
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: kernel-modules-extra ***
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: qemu ***
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: fstab-sys ***
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: rootfs-block ***
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: terminfo ***
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: udev-rules ***
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: Skipping udev rule: 91-permissions.rules
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: virtiofs ***
Nov 28 06:42:57 np0005538513.novalocal dracut[1435]: *** Including module: dracut-systemd ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including module: usrmount ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including module: base ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including module: fs-lib ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including module: kdumpbase ***
Nov 28 06:42:58 np0005538513.novalocal chronyd[766]: Selected source 206.108.0.131 (2.rhel.pool.ntp.org)
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:   microcode_ctl module: mangling fw_dir
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including module: shutdown ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including module: squash ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Including modules done ***
Nov 28 06:42:58 np0005538513.novalocal dracut[1435]: *** Installing kernel module dependencies ***
Nov 28 06:42:59 np0005538513.novalocal dracut[1435]: *** Installing kernel module dependencies done ***
Nov 28 06:42:59 np0005538513.novalocal dracut[1435]: *** Resolving executable dependencies ***
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: *** Resolving executable dependencies done ***
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: *** Hardlinking files ***
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Mode:           real
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Files:          1099
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Linked:         3 files
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Compared:       0 xattrs
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Compared:       373 files
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Saved:          61.04 KiB
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Duration:       0.024039 seconds
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: *** Hardlinking files done ***
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Could not find 'strip'. Not stripping the initramfs.
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: *** Generating early-microcode cpio image ***
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: *** Constructing AuthenticAMD.bin ***
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: *** Store current command line parameters ***
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: Stored kernel commandline:
Nov 28 06:43:00 np0005538513.novalocal dracut[1435]: No dracut internal kernel commandline stored in the initramfs
Nov 28 06:43:01 np0005538513.novalocal dracut[1435]: *** Install squash loader ***
Nov 28 06:43:01 np0005538513.novalocal dracut[1435]: *** Squashing the files inside the initramfs ***
Nov 28 06:43:02 np0005538513.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 06:43:02 np0005538513.novalocal dracut[1435]: *** Squashing the files inside the initramfs done ***
Nov 28 06:43:02 np0005538513.novalocal dracut[1435]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Nov 28 06:43:02 np0005538513.novalocal dracut[1435]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Nov 28 06:43:03 np0005538513.novalocal kdumpctl[1133]: kdump: kexec: loaded kdump kernel
Nov 28 06:43:03 np0005538513.novalocal kdumpctl[1133]: kdump: Starting kdump: [OK]
Nov 28 06:43:03 np0005538513.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 28 06:43:03 np0005538513.novalocal systemd[1]: Startup finished in 1.239s (kernel) + 2.034s (initrd) + 14.145s (userspace) = 17.419s.
Nov 28 06:43:21 np0005538513.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 06:43:40 np0005538513.novalocal sshd[4173]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:40 np0005538513.novalocal sshd[4173]: Accepted publickey for zuul from 38.102.83.114 port 39962 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 28 06:43:40 np0005538513.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 28 06:43:40 np0005538513.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 28 06:43:40 np0005538513.novalocal systemd-logind[764]: New session 1 of user zuul.
Nov 28 06:43:40 np0005538513.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 28 06:43:40 np0005538513.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Queued start job for default target Main User Target.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Created slice User Application Slice.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Reached target Paths.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Reached target Timers.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Starting D-Bus User Message Bus Socket...
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Starting Create User's Volatile Files and Directories...
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Listening on D-Bus User Message Bus Socket.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Reached target Sockets.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Finished Create User's Volatile Files and Directories.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Reached target Basic System.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Reached target Main User Target.
Nov 28 06:43:41 np0005538513.novalocal systemd[4177]: Startup finished in 115ms.
Nov 28 06:43:41 np0005538513.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 28 06:43:41 np0005538513.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 28 06:43:41 np0005538513.novalocal sshd[4173]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:43:41 np0005538513.novalocal python3[4229]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:43:49 np0005538513.novalocal python3[4247]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:43:56 np0005538513.novalocal python3[4300]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:43:58 np0005538513.novalocal python3[4330]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 28 06:44:01 np0005538513.novalocal python3[4346]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:01 np0005538513.novalocal python3[4360]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:02 np0005538513.novalocal python3[4419]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:03 np0005538513.novalocal python3[4460]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312242.7142704-389-233679186952224/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa follow=False checksum=47f6a2f8fa426c1f34aad346f88073a22928af4e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:04 np0005538513.novalocal python3[4533]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:04 np0005538513.novalocal python3[4574]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312244.3381405-485-201455792218243/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa.pub follow=False checksum=d1f12d852c72cfefab089d88337552962cfbc93d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:06 np0005538513.novalocal python3[4602]: ansible-ping Invoked with data=pong
Nov 28 06:44:09 np0005538513.novalocal python3[4616]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:44:13 np0005538513.novalocal python3[4670]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 28 06:44:15 np0005538513.novalocal python3[4692]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:16 np0005538513.novalocal python3[4706]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:16 np0005538513.novalocal python3[4720]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:17 np0005538513.novalocal python3[4734]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:17 np0005538513.novalocal python3[4748]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:17 np0005538513.novalocal python3[4762]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:20 np0005538513.novalocal sudo[4776]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkctrszocerznaifnwewrckydvckarsv ; /usr/bin/python3
Nov 28 06:44:20 np0005538513.novalocal sudo[4776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:20 np0005538513.novalocal python3[4778]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:20 np0005538513.novalocal sudo[4776]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:22 np0005538513.novalocal sudo[4824]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bycoxgcfdobrzpjhyeqxgwbwolukajlu ; /usr/bin/python3
Nov 28 06:44:22 np0005538513.novalocal sudo[4824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:22 np0005538513.novalocal python3[4826]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:22 np0005538513.novalocal sudo[4824]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:22 np0005538513.novalocal sudo[4867]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gojcuvqtxxoytmcjcucornrltoqeavwe ; /usr/bin/python3
Nov 28 06:44:22 np0005538513.novalocal sudo[4867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:22 np0005538513.novalocal python3[4869]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312262.0988226-99-122024860704302/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:22 np0005538513.novalocal sudo[4867]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:30 np0005538513.novalocal python3[4897]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:30 np0005538513.novalocal python3[4911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:31 np0005538513.novalocal python3[4925]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:31 np0005538513.novalocal python3[4939]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:31 np0005538513.novalocal python3[4953]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:31 np0005538513.novalocal python3[4967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538513.novalocal python3[4981]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538513.novalocal python3[4995]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538513.novalocal python3[5009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538513.novalocal python3[5023]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538513.novalocal python3[5037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538513.novalocal python3[5051]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538513.novalocal python3[5065]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538513.novalocal python3[5079]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538513.novalocal python3[5093]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538513.novalocal python3[5107]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538513.novalocal python3[5121]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538513.novalocal python3[5135]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538513.novalocal python3[5149]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538513.novalocal python3[5163]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538513.novalocal python3[5177]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538513.novalocal python3[5191]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538513.novalocal python3[5205]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538513.novalocal python3[5219]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538513.novalocal python3[5233]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538513.novalocal python3[5247]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:38 np0005538513.novalocal sudo[5261]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipsyiarbnpvtypsfwewwqvpqmtkfzlxa ; /usr/bin/python3
Nov 28 06:44:38 np0005538513.novalocal sudo[5261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:38 np0005538513.novalocal python3[5263]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 06:44:38 np0005538513.novalocal systemd[1]: Starting Time & Date Service...
Nov 28 06:44:38 np0005538513.novalocal systemd[1]: Started Time & Date Service.
Nov 28 06:44:39 np0005538513.novalocal systemd-timedated[5265]: Changed time zone to 'UTC' (UTC).
Nov 28 06:44:39 np0005538513.novalocal sudo[5261]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:39 np0005538513.novalocal sudo[5282]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilmogyxuauqyaaqbigqraztlrvvkmplx ; /usr/bin/python3
Nov 28 06:44:39 np0005538513.novalocal sudo[5282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:39 np0005538513.novalocal python3[5284]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:39 np0005538513.novalocal sudo[5282]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:40 np0005538513.novalocal python3[5330]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:41 np0005538513.novalocal python3[5371]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764312280.7328932-492-37995062897735/source _original_basename=tmp5eyxkp02 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:42 np0005538513.novalocal python3[5431]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:42 np0005538513.novalocal python3[5472]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764312282.215266-581-230248070919413/source _original_basename=tmp586k1ny0 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:44 np0005538513.novalocal sudo[5532]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrajloegzdvodbkbxgildkwbmihxvgvh ; /usr/bin/python3
Nov 28 06:44:44 np0005538513.novalocal sudo[5532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:44 np0005538513.novalocal python3[5534]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:44 np0005538513.novalocal sudo[5532]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:44 np0005538513.novalocal sudo[5575]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muwybbmlwzwlildzattvlaaftdqbtgoo ; /usr/bin/python3
Nov 28 06:44:44 np0005538513.novalocal sudo[5575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:44 np0005538513.novalocal python3[5577]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764312284.3016577-724-11816403856429/source _original_basename=tmpt5fb9mp7 follow=False checksum=d1fb5b4f9f73b8c84cf3b5af0e2af5367a435780 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:44 np0005538513.novalocal sudo[5575]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:46 np0005538513.novalocal python3[5605]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:44:46 np0005538513.novalocal python3[5621]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:44:47 np0005538513.novalocal sudo[5669]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcxhcieftuxusfodvsupiijddiqkopkx ; /usr/bin/python3
Nov 28 06:44:47 np0005538513.novalocal sudo[5669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:47 np0005538513.novalocal python3[5671]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:47 np0005538513.novalocal sudo[5669]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:47 np0005538513.novalocal sudo[5712]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxbofwvbpdgwfcuqdvbrueaocpesfvjd ; /usr/bin/python3
Nov 28 06:44:47 np0005538513.novalocal sudo[5712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:48 np0005538513.novalocal python3[5714]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312287.381199-852-275465695180306/source _original_basename=tmpfes8jtas follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:48 np0005538513.novalocal sudo[5712]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:49 np0005538513.novalocal sudo[5743]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ossatxbklttozpouzydoqnhzwpfbkgdb ; /usr/bin/python3
Nov 28 06:44:49 np0005538513.novalocal sudo[5743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:49 np0005538513.novalocal python3[5745]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-161e-20ee-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:44:49 np0005538513.novalocal sudo[5743]: pam_unix(sudo:session): session closed for user root
Nov 28 06:45:00 np0005538513.novalocal python3[5764]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-161e-20ee-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 28 06:45:09 np0005538513.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 06:45:12 np0005538513.novalocal python3[5784]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:45:30 np0005538513.novalocal sudo[5798]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enmwaxqzycpofflrxxfgbuythfsljmlp ; /usr/bin/python3
Nov 28 06:45:30 np0005538513.novalocal sudo[5798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:45:30 np0005538513.novalocal python3[5800]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:45:30 np0005538513.novalocal sudo[5798]: pam_unix(sudo:session): session closed for user root
Nov 28 06:46:05 np0005538513.novalocal systemd[4177]: Starting Mark boot as successful...
Nov 28 06:46:05 np0005538513.novalocal systemd[4177]: Finished Mark boot as successful.
Nov 28 06:46:30 np0005538513.novalocal sshd[4186]: Received disconnect from 38.102.83.114 port 39962:11: disconnected by user
Nov 28 06:46:30 np0005538513.novalocal sshd[4186]: Disconnected from user zuul 38.102.83.114 port 39962
Nov 28 06:46:30 np0005538513.novalocal sshd[4173]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:46:30 np0005538513.novalocal systemd-logind[764]: Session 1 logged out. Waiting for processes to exit.
Nov 28 06:46:52 np0005538513.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Nov 28 06:46:52 np0005538513.novalocal systemd[1]: efi.mount: Deactivated successfully.
Nov 28 06:46:52 np0005538513.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Nov 28 06:47:56 np0005538513.novalocal sshd[5806]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:47:57 np0005538513.novalocal sshd[5806]: Received disconnect from 80.94.93.233 port 36786:11:  [preauth]
Nov 28 06:47:57 np0005538513.novalocal sshd[5806]: Disconnected from authenticating user root 80.94.93.233 port 36786 [preauth]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Nov 28 06:48:52 np0005538513.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Nov 28 06:48:52 np0005538513.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1258] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 06:48:52 np0005538513.novalocal systemd-udevd[5810]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1378] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 28 06:48:52 np0005538513.novalocal systemd[4177]: Created slice User Background Tasks Slice.
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1411] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1415] device (eth1): carrier: link connected
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1417] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1422] policy: auto-activating connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65)
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1428] device (eth1): Activation: starting connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65)
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1429] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1432] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1437] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:48:52 np0005538513.novalocal systemd[4177]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 06:48:52 np0005538513.novalocal NetworkManager[789]: <info>  [1764312532.1440] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:48:52 np0005538513.novalocal systemd[4177]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 06:48:53 np0005538513.novalocal sshd[5814]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:48:53 np0005538513.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Nov 28 06:48:53 np0005538513.novalocal sshd[5814]: Accepted publickey for zuul from 38.102.83.114 port 48480 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 06:48:53 np0005538513.novalocal systemd-logind[764]: New session 3 of user zuul.
Nov 28 06:48:53 np0005538513.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 28 06:48:53 np0005538513.novalocal sshd[5814]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:48:53 np0005538513.novalocal python3[5831]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-b1a9-fc65-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:49:06 np0005538513.novalocal sudo[5880]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgwdnznedmhergqpqtvdvcuinelbjzzf ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:49:06 np0005538513.novalocal sudo[5880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:49:06 np0005538513.novalocal python3[5882]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:49:06 np0005538513.novalocal sudo[5880]: pam_unix(sudo:session): session closed for user root
Nov 28 06:49:06 np0005538513.novalocal sudo[5923]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uixdkssainrgbhsjctonsxyiwotimtzx ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:49:06 np0005538513.novalocal sudo[5923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:49:07 np0005538513.novalocal python3[5925]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312546.3786542-435-234083149484200/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=e5e10e3b8898b1550d26d78981826d3ea337ef09 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:49:07 np0005538513.novalocal sudo[5923]: pam_unix(sudo:session): session closed for user root
Nov 28 06:49:07 np0005538513.novalocal sudo[5953]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shtpmgbjmtxifhyocvvwgiqybnvfodtf ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:49:07 np0005538513.novalocal sudo[5953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:49:07 np0005538513.novalocal python3[5955]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Stopping Network Manager...
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6537] caught SIGTERM, shutting down normally.
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6642] dhcp4 (eth0): canceled DHCP transaction
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6643] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6643] dhcp4 (eth0): state changed no lease
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6647] manager: NetworkManager state is now CONNECTING
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6782] dhcp4 (eth1): canceled DHCP transaction
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6783] dhcp4 (eth1): state changed no lease
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[789]: <info>  [1764312547.6860] exiting (success)
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Stopped Network Manager.
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: NetworkManager.service: Consumed 2.301s CPU time.
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Starting Network Manager...
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.7431] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:590d17e7-bf7a-4d44-b812-a5de06abfb1f)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.7435] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Started Network Manager.
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.7467] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.7565] manager[0x55d3ec1f6090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Starting Hostname Service...
Nov 28 06:49:07 np0005538513.novalocal sudo[5953]: pam_unix(sudo:session): session closed for user root
Nov 28 06:49:07 np0005538513.novalocal systemd[1]: Started Hostname Service.
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8442] hostname: hostname: using hostnamed
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8443] hostname: static hostname changed from (none) to "np0005538513.novalocal"
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8450] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8458] manager[0x55d3ec1f6090]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8458] manager[0x55d3ec1f6090]: rfkill: WWAN hardware radio set enabled
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8504] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8505] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8506] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8506] manager: Networking is enabled by state file
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8518] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8521] settings: Loaded settings plugin: keyfile (internal)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8569] dhcp: init: Using DHCP client 'internal'
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8573] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8581] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8589] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8601] device (lo): Activation: starting connection 'lo' (dc22fba5-a55e-4101-8dc2-18071340ca35)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8609] device (eth0): carrier: link connected
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8616] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8623] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8624] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8635] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8649] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8658] device (eth1): carrier: link connected
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8664] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8671] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65) (indicated)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8671] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8678] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8689] device (eth1): Activation: starting connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8723] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8731] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8736] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8740] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8746] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8750] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8757] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8804] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8817] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8824] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8835] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8839] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8864] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8874] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8885] device (lo): Activation: successful, device activated.
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8964] dhcp4 (eth0): state changed new lease, address=38.102.83.64
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.8974] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.9079] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.9108] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.9111] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.9116] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.9120] device (eth0): Activation: successful, device activated.
Nov 28 06:49:07 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312547.9127] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 06:49:08 np0005538513.novalocal python3[6024]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-b1a9-fc65-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:49:17 np0005538513.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 06:49:37 np0005538513.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 06:49:52 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312592.8293] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:52 np0005538513.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 06:49:52 np0005538513.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 06:49:52 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312592.8526] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:52 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312592.8529] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 28 06:49:52 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312592.8534] device (eth1): Activation: successful, device activated.
Nov 28 06:49:52 np0005538513.novalocal NetworkManager[5967]: <info>  [1764312592.8539] manager: startup complete
Nov 28 06:49:52 np0005538513.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 28 06:50:02 np0005538513.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 06:50:08 np0005538513.novalocal sshd[5817]: Received disconnect from 38.102.83.114 port 48480:11: disconnected by user
Nov 28 06:50:08 np0005538513.novalocal sshd[5817]: Disconnected from user zuul 38.102.83.114 port 48480
Nov 28 06:50:08 np0005538513.novalocal sshd[5814]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:50:08 np0005538513.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 28 06:50:08 np0005538513.novalocal systemd[1]: session-3.scope: Consumed 1.481s CPU time.
Nov 28 06:50:08 np0005538513.novalocal systemd-logind[764]: Session 3 logged out. Waiting for processes to exit.
Nov 28 06:50:08 np0005538513.novalocal systemd-logind[764]: Removed session 3.
Nov 28 06:51:23 np0005538513.novalocal sshd[6056]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:51:23 np0005538513.novalocal sshd[6056]: Accepted publickey for zuul from 38.102.83.114 port 34220 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 06:51:23 np0005538513.novalocal systemd-logind[764]: New session 4 of user zuul.
Nov 28 06:51:23 np0005538513.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 28 06:51:23 np0005538513.novalocal sshd[6056]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:51:23 np0005538513.novalocal sudo[6105]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jouoqbxvttmkzvfmnrcqycgcvlmjmjwa ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:51:23 np0005538513.novalocal sudo[6105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:51:23 np0005538513.novalocal python3[6107]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:51:23 np0005538513.novalocal sudo[6105]: pam_unix(sudo:session): session closed for user root
Nov 28 06:51:23 np0005538513.novalocal sudo[6148]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfgaoswopsupebaawqrezjcnazmyoywu ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:51:23 np0005538513.novalocal sudo[6148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:51:23 np0005538513.novalocal python3[6150]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312683.3882318-628-62124278769557/source _original_basename=tmpk5fxeurt follow=False checksum=10225105ecbcb8380becb3ed8e03293c5f034347 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:51:24 np0005538513.novalocal sudo[6148]: pam_unix(sudo:session): session closed for user root
Nov 28 06:51:28 np0005538513.novalocal sshd[6056]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:51:28 np0005538513.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 28 06:51:28 np0005538513.novalocal systemd-logind[764]: Session 4 logged out. Waiting for processes to exit.
Nov 28 06:51:28 np0005538513.novalocal systemd-logind[764]: Removed session 4.
Nov 28 06:54:49 np0005538513.novalocal chronyd[766]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Nov 28 06:58:05 np0005538513.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Nov 28 06:58:05 np0005538513.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 28 06:58:05 np0005538513.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Nov 28 06:58:05 np0005538513.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 28 06:58:49 np0005538513.novalocal sshd[6170]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:58:49 np0005538513.novalocal sshd[6170]: Accepted publickey for zuul from 38.102.83.114 port 57408 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 06:58:49 np0005538513.novalocal systemd-logind[764]: New session 5 of user zuul.
Nov 28 06:58:49 np0005538513.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 28 06:58:49 np0005538513.novalocal sshd[6170]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:58:49 np0005538513.novalocal sudo[6187]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aroexuqibygpbggsmldqazrcuuhanaqg ; /usr/bin/python3
Nov 28 06:58:49 np0005538513.novalocal sudo[6187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:58:49 np0005538513.novalocal python3[6189]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-0218-9e4d-000000001d10-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:58:49 np0005538513.novalocal sudo[6187]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:00 np0005538513.novalocal sudo[6206]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukkbiqyncuqpoosmefumyeqjrrbskzjp ; /usr/bin/python3
Nov 28 06:59:00 np0005538513.novalocal sudo[6206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:00 np0005538513.novalocal python3[6208]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538513.novalocal sudo[6206]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:01 np0005538513.novalocal sudo[6222]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wolanmokatojhoehzxofvgoqkznbjtks ; /usr/bin/python3
Nov 28 06:59:01 np0005538513.novalocal sudo[6222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:01 np0005538513.novalocal python3[6224]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538513.novalocal sudo[6222]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:01 np0005538513.novalocal sudo[6238]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqjpxxzpzsmtubvwgfeymosoehfycutq ; /usr/bin/python3
Nov 28 06:59:01 np0005538513.novalocal sudo[6238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:01 np0005538513.novalocal python3[6240]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538513.novalocal sudo[6238]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:01 np0005538513.novalocal sudo[6254]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idurqlfakgxnhhopbvmfjaagcluzpsye ; /usr/bin/python3
Nov 28 06:59:01 np0005538513.novalocal sudo[6254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:01 np0005538513.novalocal python3[6256]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538513.novalocal sudo[6254]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:02 np0005538513.novalocal sudo[6270]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcgqpyzopeyikqtvzsipicupjfggiozv ; /usr/bin/python3
Nov 28 06:59:02 np0005538513.novalocal sudo[6270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:02 np0005538513.novalocal python3[6272]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:02 np0005538513.novalocal sudo[6270]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:03 np0005538513.novalocal sudo[6318]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-farsmzszrbusadnvpcsrelfylbmwcsjw ; /usr/bin/python3
Nov 28 06:59:03 np0005538513.novalocal sudo[6318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:03 np0005538513.novalocal python3[6320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:59:03 np0005538513.novalocal sudo[6318]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:03 np0005538513.novalocal sudo[6361]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkvngdhzpiztxkpxipzpkbyybxovlfjz ; /usr/bin/python3
Nov 28 06:59:03 np0005538513.novalocal sudo[6361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:04 np0005538513.novalocal python3[6363]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764313143.4514127-643-59764353429855/source _original_basename=tmpzu2o480j follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:04 np0005538513.novalocal sudo[6361]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:05 np0005538513.novalocal sudo[6391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hehwzrnerlbntqyddjagcemkimomabcs ; /usr/bin/python3
Nov 28 06:59:05 np0005538513.novalocal sudo[6391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:05 np0005538513.novalocal python3[6393]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 06:59:05 np0005538513.novalocal systemd[1]: Reloading.
Nov 28 06:59:05 np0005538513.novalocal systemd-rc-local-generator[6410]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 06:59:05 np0005538513.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 06:59:06 np0005538513.novalocal sudo[6391]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:07 np0005538513.novalocal sudo[6437]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okevqjzvbhdjiukqnfscpcfwhjwwxndu ; /usr/bin/python3
Nov 28 06:59:07 np0005538513.novalocal sudo[6437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:07 np0005538513.novalocal python3[6439]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 28 06:59:07 np0005538513.novalocal sudo[6437]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:08 np0005538513.novalocal sudo[6453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnzxnxonpdsfsttiknkypxbzjqetshyx ; /usr/bin/python3
Nov 28 06:59:08 np0005538513.novalocal sudo[6453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:08 np0005538513.novalocal python3[6455]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:08 np0005538513.novalocal sudo[6453]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:08 np0005538513.novalocal sudo[6471]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kccstiyichrtsvvbqoektivuhylvflqq ; /usr/bin/python3
Nov 28 06:59:08 np0005538513.novalocal sudo[6471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:08 np0005538513.novalocal python3[6473]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:09 np0005538513.novalocal sudo[6471]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:09 np0005538513.novalocal sudo[6489]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orctrspdcqevzbgtixwwiefdpjxlfccb ; /usr/bin/python3
Nov 28 06:59:09 np0005538513.novalocal sudo[6489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:09 np0005538513.novalocal python3[6491]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:09 np0005538513.novalocal sudo[6489]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:09 np0005538513.novalocal sudo[6507]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qslymlnfoxihnpjfvgsiksykjwhqwogu ; /usr/bin/python3
Nov 28 06:59:09 np0005538513.novalocal sudo[6507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:09 np0005538513.novalocal python3[6509]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:09 np0005538513.novalocal sudo[6507]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:10 np0005538513.novalocal python3[6526]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-0218-9e4d-000000001d17-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:21 np0005538513.novalocal python3[6547]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 06:59:24 np0005538513.novalocal sshd[6170]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:59:24 np0005538513.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 28 06:59:24 np0005538513.novalocal systemd[1]: session-5.scope: Consumed 4.015s CPU time.
Nov 28 06:59:24 np0005538513.novalocal systemd-logind[764]: Session 5 logged out. Waiting for processes to exit.
Nov 28 06:59:24 np0005538513.novalocal systemd-logind[764]: Removed session 5.
Nov 28 07:00:38 np0005538513.novalocal sshd[6554]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:00:39 np0005538513.novalocal sshd[6554]: Accepted publickey for zuul from 38.102.83.114 port 47618 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:00:39 np0005538513.novalocal systemd-logind[764]: New session 6 of user zuul.
Nov 28 07:00:39 np0005538513.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 28 07:00:39 np0005538513.novalocal sshd[6554]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:00:39 np0005538513.novalocal sudo[6571]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxcywxehtrmknjwgzemjyzucvwacuysv ; /usr/bin/python3
Nov 28 07:00:39 np0005538513.novalocal sudo[6571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:00:39 np0005538513.novalocal systemd[1]: Starting RHSM dbus service...
Nov 28 07:00:39 np0005538513.novalocal systemd[1]: Started RHSM dbus service.
Nov 28 07:00:39 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:39 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:39 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:39 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:42 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005538513.novalocal (f7b9b60d-6b81-4721-85a2-48be6d80ec8a)
Nov 28 07:00:42 np0005538513.novalocal subscription-manager[6578]: Registered system with identity: f7b9b60d-6b81-4721-85a2-48be6d80ec8a
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.entcertlib:131] certs updated:
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]: Total updates: 1
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]: Found (local) serial# []
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]: Expected (UEP) serial# [9132065098899233728]
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]: Added (new)
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]:   [sn:9132065098899233728 ( Content Access,) @ /etc/pki/entitlement/9132065098899233728.pem]
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]: Deleted (rogue):
Nov 28 07:00:44 np0005538513.novalocal rhsm-service[6578]:   <NONE>
Nov 28 07:00:44 np0005538513.novalocal subscription-manager[6578]: Added subscription for 'Content Access' contract 'None'
Nov 28 07:00:44 np0005538513.novalocal subscription-manager[6578]: Added subscription for product ' Content Access'
Nov 28 07:00:47 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:47 np0005538513.novalocal rhsm-service[6578]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:47 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:48 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:48 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:48 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:49 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:50 np0005538513.novalocal sudo[6571]: pam_unix(sudo:session): session closed for user root
Nov 28 07:00:59 np0005538513.novalocal python3[6669]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-cf29-7b10-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:01:00 np0005538513.novalocal sudo[6686]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bulewtuewawvnwekddpsopnyhzpghcwx ; /usr/bin/python3
Nov 28 07:01:00 np0005538513.novalocal sudo[6686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:01:00 np0005538513.novalocal python3[6688]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:01:01 np0005538513.novalocal CROND[6691]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 07:01:01 np0005538513.novalocal run-parts[6694]: (/etc/cron.hourly) starting 0anacron
Nov 28 07:01:01 np0005538513.novalocal anacron[6702]: Anacron started on 2025-11-28
Nov 28 07:01:01 np0005538513.novalocal anacron[6702]: Will run job `cron.daily' in 22 min.
Nov 28 07:01:01 np0005538513.novalocal anacron[6702]: Will run job `cron.weekly' in 42 min.
Nov 28 07:01:01 np0005538513.novalocal anacron[6702]: Will run job `cron.monthly' in 62 min.
Nov 28 07:01:01 np0005538513.novalocal anacron[6702]: Jobs will be executed sequentially
Nov 28 07:01:01 np0005538513.novalocal run-parts[6704]: (/etc/cron.hourly) finished 0anacron
Nov 28 07:01:01 np0005538513.novalocal CROND[6690]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 07:01:31 np0005538513.novalocal setsebool[6778]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 28 07:01:31 np0005538513.novalocal setsebool[6778]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  Converting 407 SID table entries...
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:01:42 np0005538513.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:01:54 np0005538513.novalocal dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Nov 28 07:01:54 np0005538513.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:01:54 np0005538513.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:01:54 np0005538513.novalocal systemd[1]: Reloading.
Nov 28 07:01:55 np0005538513.novalocal systemd-rc-local-generator[7654]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:01:55 np0005538513.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:01:55 np0005538513.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:01:56 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:01:56 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:01:56 np0005538513.novalocal sudo[6686]: pam_unix(sudo:session): session closed for user root
Nov 28 07:02:04 np0005538513.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:02:04 np0005538513.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:02:04 np0005538513.novalocal systemd[1]: man-db-cache-update.service: Consumed 11.174s CPU time.
Nov 28 07:02:04 np0005538513.novalocal systemd[1]: run-rae0a51aa065742f38ba6caf1fb97a20f.service: Deactivated successfully.
Nov 28 07:02:47 np0005538513.novalocal sudo[18371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfikdspjivqaowwvpggxbjdbsxsrlqbo ; /usr/bin/python3
Nov 28 07:02:47 np0005538513.novalocal sudo[18371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:02:47 np0005538513.novalocal podman[18374]: 2025-11-28 07:02:47.936305556 +0000 UTC m=+0.097922582 system refresh
Nov 28 07:02:48 np0005538513.novalocal sudo[18371]: pam_unix(sudo:session): session closed for user root
Nov 28 07:02:48 np0005538513.novalocal systemd[4177]: Starting D-Bus User Message Bus...
Nov 28 07:02:48 np0005538513.novalocal dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 07:02:48 np0005538513.novalocal dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 07:02:48 np0005538513.novalocal systemd[4177]: Started D-Bus User Message Bus.
Nov 28 07:02:48 np0005538513.novalocal dbus-broker-lau[18433]: Ready
Nov 28 07:02:48 np0005538513.novalocal systemd[4177]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Nov 28 07:02:48 np0005538513.novalocal systemd[4177]: Created slice Slice /user.
Nov 28 07:02:48 np0005538513.novalocal systemd[4177]: podman-18416.scope: unit configures an IP firewall, but not running as root.
Nov 28 07:02:48 np0005538513.novalocal systemd[4177]: (This warning is only shown for the first unit using IP firewalling.)
Nov 28 07:02:48 np0005538513.novalocal systemd[4177]: Started podman-18416.scope.
Nov 28 07:02:48 np0005538513.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:02:49 np0005538513.novalocal systemd[4177]: Started podman-pause-90600334.scope.
Nov 28 07:02:51 np0005538513.novalocal sshd[6554]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:02:51 np0005538513.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Nov 28 07:02:51 np0005538513.novalocal systemd[1]: session-6.scope: Consumed 53.073s CPU time.
Nov 28 07:02:51 np0005538513.novalocal systemd-logind[764]: Session 6 logged out. Waiting for processes to exit.
Nov 28 07:02:51 np0005538513.novalocal systemd-logind[764]: Removed session 6.
Nov 28 07:03:06 np0005538513.novalocal sshd[18436]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:06 np0005538513.novalocal sshd[18437]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:06 np0005538513.novalocal sshd[18440]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:06 np0005538513.novalocal sshd[18438]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:06 np0005538513.novalocal sshd[18439]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:06 np0005538513.novalocal sshd[18437]: Unable to negotiate with 38.102.83.32 port 52622: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 28 07:03:06 np0005538513.novalocal sshd[18440]: Connection closed by 38.102.83.32 port 52594 [preauth]
Nov 28 07:03:06 np0005538513.novalocal sshd[18438]: Connection closed by 38.102.83.32 port 52602 [preauth]
Nov 28 07:03:06 np0005538513.novalocal sshd[18439]: Unable to negotiate with 38.102.83.32 port 52608: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 28 07:03:06 np0005538513.novalocal sshd[18436]: Unable to negotiate with 38.102.83.32 port 52632: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 28 07:03:11 np0005538513.novalocal sshd[18446]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:11 np0005538513.novalocal sshd[18446]: Accepted publickey for zuul from 38.102.83.114 port 38344 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:03:11 np0005538513.novalocal systemd-logind[764]: New session 7 of user zuul.
Nov 28 07:03:11 np0005538513.novalocal systemd[1]: Started Session 7 of User zuul.
Nov 28 07:03:11 np0005538513.novalocal sshd[18446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:03:11 np0005538513.novalocal python3[18463]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCwbdeJV6VDDXadsSf2RG5X7kz/GTOF493/FPhPlXmY8LaEjIgaNVgahbrG06qkZx72vk0TqexyzHBymiNAuWIc= zuul@np0005538507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:03:12 np0005538513.novalocal sudo[18477]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwtbybrtdrplolycawmfyfstktjrlinr ; /usr/bin/python3
Nov 28 07:03:12 np0005538513.novalocal sudo[18477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:03:12 np0005538513.novalocal python3[18479]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCwbdeJV6VDDXadsSf2RG5X7kz/GTOF493/FPhPlXmY8LaEjIgaNVgahbrG06qkZx72vk0TqexyzHBymiNAuWIc= zuul@np0005538507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:03:12 np0005538513.novalocal sudo[18477]: pam_unix(sudo:session): session closed for user root
Nov 28 07:03:14 np0005538513.novalocal sshd[18446]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:03:14 np0005538513.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Nov 28 07:03:14 np0005538513.novalocal systemd-logind[764]: Session 7 logged out. Waiting for processes to exit.
Nov 28 07:03:14 np0005538513.novalocal systemd-logind[764]: Removed session 7.
Nov 28 07:04:48 np0005538513.novalocal sshd[18482]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:04:49 np0005538513.novalocal sshd[18482]: Accepted publickey for zuul from 38.102.83.114 port 39860 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:04:49 np0005538513.novalocal systemd-logind[764]: New session 8 of user zuul.
Nov 28 07:04:49 np0005538513.novalocal systemd[1]: Started Session 8 of User zuul.
Nov 28 07:04:49 np0005538513.novalocal sshd[18482]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:04:49 np0005538513.novalocal sudo[18499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcnyxbmoynjvvzgqelgmmjpaoqwtyxra ; /usr/bin/python3
Nov 28 07:04:49 np0005538513.novalocal sudo[18499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:49 np0005538513.novalocal python3[18501]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:04:49 np0005538513.novalocal sudo[18499]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:50 np0005538513.novalocal sudo[18515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yspqeowvpphclyzlqtdubgcversbmhek ; /usr/bin/python3
Nov 28 07:04:50 np0005538513.novalocal sudo[18515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:50 np0005538513.novalocal python3[18517]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 07:04:50 np0005538513.novalocal sudo[18515]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:51 np0005538513.novalocal sudo[18565]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezchytimckoddvspjwtywqsdtxihnevc ; /usr/bin/python3
Nov 28 07:04:51 np0005538513.novalocal sudo[18565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:52 np0005538513.novalocal python3[18567]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:52 np0005538513.novalocal sudo[18565]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:52 np0005538513.novalocal sudo[18608]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxxugxqewscvwqxjsiflxqjyaoijczvg ; /usr/bin/python3
Nov 28 07:04:52 np0005538513.novalocal sudo[18608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:52 np0005538513.novalocal python3[18610]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764313491.7259095-133-223867814579/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa follow=False checksum=47f6a2f8fa426c1f34aad346f88073a22928af4e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:52 np0005538513.novalocal sudo[18608]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:53 np0005538513.novalocal sudo[18670]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxnhporelahjndhoempwwhvyogmzoefu ; /usr/bin/python3
Nov 28 07:04:53 np0005538513.novalocal sudo[18670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:53 np0005538513.novalocal python3[18672]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:53 np0005538513.novalocal sudo[18670]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:53 np0005538513.novalocal sudo[18713]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdderlovwggchbnzamsrizkuvhixlmww ; /usr/bin/python3
Nov 28 07:04:53 np0005538513.novalocal sudo[18713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:53 np0005538513.novalocal python3[18715]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764313493.310652-219-167592154753351/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa.pub follow=False checksum=d1f12d852c72cfefab089d88337552962cfbc93d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:53 np0005538513.novalocal sudo[18713]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:55 np0005538513.novalocal sudo[18743]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fivynjslwbqppxpqcudbjfrslpzssrum ; /usr/bin/python3
Nov 28 07:04:55 np0005538513.novalocal sudo[18743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:56 np0005538513.novalocal python3[18745]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:56 np0005538513.novalocal sudo[18743]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:57 np0005538513.novalocal python3[18791]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:57 np0005538513.novalocal python3[18807]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpzn48ue4i recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:58 np0005538513.novalocal python3[18867]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:58 np0005538513.novalocal python3[18883]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpdld9nvxw recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:05:00 np0005538513.novalocal python3[18943]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:05:00 np0005538513.novalocal python3[18959]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpesfm0j2o recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:05:01 np0005538513.novalocal sshd[18482]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:05:01 np0005538513.novalocal systemd-logind[764]: Session 8 logged out. Waiting for processes to exit.
Nov 28 07:05:01 np0005538513.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Nov 28 07:05:01 np0005538513.novalocal systemd[1]: session-8.scope: Consumed 3.638s CPU time.
Nov 28 07:05:01 np0005538513.novalocal systemd-logind[764]: Removed session 8.
Nov 28 07:07:23 np0005538513.novalocal sshd[18975]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:07:23 np0005538513.novalocal sshd[18975]: Accepted publickey for zuul from 38.102.83.32 port 35816 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:07:23 np0005538513.novalocal systemd-logind[764]: New session 9 of user zuul.
Nov 28 07:07:23 np0005538513.novalocal systemd[1]: Started Session 9 of User zuul.
Nov 28 07:07:23 np0005538513.novalocal sshd[18975]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:07:23 np0005538513.novalocal python3[19021]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:10:35 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:10:35 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:12:08 np0005538513.novalocal sshd[19144]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:20 np0005538513.novalocal sshd[19144]: Connection closed by authenticating user root 193.32.162.157 port 47124 [preauth]
Nov 28 07:12:22 np0005538513.novalocal sshd[19146]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:23 np0005538513.novalocal sshd[18978]: Received disconnect from 38.102.83.32 port 35816:11: disconnected by user
Nov 28 07:12:23 np0005538513.novalocal sshd[18978]: Disconnected from user zuul 38.102.83.32 port 35816
Nov 28 07:12:23 np0005538513.novalocal sshd[18975]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:12:23 np0005538513.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Nov 28 07:12:23 np0005538513.novalocal systemd-logind[764]: Session 9 logged out. Waiting for processes to exit.
Nov 28 07:12:23 np0005538513.novalocal systemd-logind[764]: Removed session 9.
Nov 28 07:12:31 np0005538513.novalocal sshd[19146]: Invalid user laravel from 193.32.162.157 port 53308
Nov 28 07:12:34 np0005538513.novalocal sshd[19146]: Connection closed by invalid user laravel 193.32.162.157 port 53308 [preauth]
Nov 28 07:12:34 np0005538513.novalocal sshd[19151]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:46 np0005538513.novalocal sshd[19151]: Connection closed by authenticating user root 193.32.162.157 port 58128 [preauth]
Nov 28 07:12:46 np0005538513.novalocal sshd[19153]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:58 np0005538513.novalocal sshd[19153]: Connection closed by authenticating user root 193.32.162.157 port 46388 [preauth]
Nov 28 07:12:58 np0005538513.novalocal sshd[19155]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:10 np0005538513.novalocal sshd[19155]: Connection closed by authenticating user root 193.32.162.157 port 47376 [preauth]
Nov 28 07:13:10 np0005538513.novalocal sshd[19157]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:21 np0005538513.novalocal sshd[19157]: Connection closed by authenticating user root 193.32.162.157 port 51624 [preauth]
Nov 28 07:13:21 np0005538513.novalocal sshd[19159]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:33 np0005538513.novalocal sshd[19159]: Connection closed by authenticating user root 193.32.162.157 port 44828 [preauth]
Nov 28 07:13:34 np0005538513.novalocal sshd[19162]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:45 np0005538513.novalocal sshd[19162]: Connection closed by authenticating user root 193.32.162.157 port 45570 [preauth]
Nov 28 07:13:46 np0005538513.novalocal sshd[19164]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:57 np0005538513.novalocal sshd[19164]: Connection closed by authenticating user root 193.32.162.157 port 50100 [preauth]
Nov 28 07:13:58 np0005538513.novalocal sshd[19166]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:00 np0005538513.novalocal sshd[19168]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:01 np0005538513.novalocal sshd[19168]: Received disconnect from 193.46.255.244 port 28180:11:  [preauth]
Nov 28 07:14:01 np0005538513.novalocal sshd[19168]: Disconnected from authenticating user root 193.46.255.244 port 28180 [preauth]
Nov 28 07:14:09 np0005538513.novalocal sshd[19166]: Connection closed by authenticating user root 193.32.162.157 port 40336 [preauth]
Nov 28 07:14:09 np0005538513.novalocal sshd[19170]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:21 np0005538513.novalocal sshd[19170]: Connection closed by authenticating user root 193.32.162.157 port 45966 [preauth]
Nov 28 07:14:21 np0005538513.novalocal sshd[19172]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:33 np0005538513.novalocal sshd[19172]: Connection closed by authenticating user root 193.32.162.157 port 47196 [preauth]
Nov 28 07:14:33 np0005538513.novalocal sshd[19174]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:45 np0005538513.novalocal sshd[19174]: Connection closed by authenticating user root 193.32.162.157 port 51232 [preauth]
Nov 28 07:14:45 np0005538513.novalocal sshd[19176]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:56 np0005538513.novalocal sshd[19176]: Connection closed by authenticating user root 193.32.162.157 port 44202 [preauth]
Nov 28 07:14:57 np0005538513.novalocal sshd[19178]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:08 np0005538513.novalocal sshd[19178]: Connection closed by authenticating user root 193.32.162.157 port 59298 [preauth]
Nov 28 07:15:08 np0005538513.novalocal sshd[19180]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:20 np0005538513.novalocal sshd[19180]: Connection closed by authenticating user root 193.32.162.157 port 52046 [preauth]
Nov 28 07:15:21 np0005538513.novalocal sshd[19182]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:32 np0005538513.novalocal sshd[19182]: Connection closed by authenticating user root 193.32.162.157 port 55402 [preauth]
Nov 28 07:15:33 np0005538513.novalocal sshd[19184]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:44 np0005538513.novalocal sshd[19184]: Connection closed by authenticating user root 193.32.162.157 port 38874 [preauth]
Nov 28 07:15:44 np0005538513.novalocal sshd[19187]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:56 np0005538513.novalocal sshd[19187]: Connection closed by authenticating user root 193.32.162.157 port 55974 [preauth]
Nov 28 07:15:56 np0005538513.novalocal sshd[19189]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:08 np0005538513.novalocal sshd[19189]: Connection closed by authenticating user root 193.32.162.157 port 55778 [preauth]
Nov 28 07:16:08 np0005538513.novalocal sshd[19191]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:20 np0005538513.novalocal sshd[19191]: Connection closed by authenticating user root 193.32.162.157 port 51078 [preauth]
Nov 28 07:16:20 np0005538513.novalocal sshd[19193]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:32 np0005538513.novalocal sshd[19193]: Connection closed by authenticating user root 193.32.162.157 port 34628 [preauth]
Nov 28 07:16:32 np0005538513.novalocal sshd[19195]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:43 np0005538513.novalocal sshd[19195]: Connection closed by authenticating user root 193.32.162.157 port 54364 [preauth]
Nov 28 07:16:44 np0005538513.novalocal sshd[19197]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:53 np0005538513.novalocal sshd[19197]: Invalid user super from 193.32.162.157 port 51868
Nov 28 07:16:56 np0005538513.novalocal sshd[19197]: Connection closed by invalid user super 193.32.162.157 port 51868 [preauth]
Nov 28 07:16:56 np0005538513.novalocal sshd[19199]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:08 np0005538513.novalocal sshd[19199]: Connection closed by authenticating user root 193.32.162.157 port 52618 [preauth]
Nov 28 07:17:08 np0005538513.novalocal sshd[19201]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:20 np0005538513.novalocal sshd[19201]: Connection closed by authenticating user root 193.32.162.157 port 37442 [preauth]
Nov 28 07:17:20 np0005538513.novalocal sshd[19203]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:31 np0005538513.novalocal sshd[19203]: Connection closed by authenticating user root 193.32.162.157 port 40840 [preauth]
Nov 28 07:17:31 np0005538513.novalocal sshd[19205]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:43 np0005538513.novalocal sshd[19205]: Connection closed by authenticating user root 193.32.162.157 port 41450 [preauth]
Nov 28 07:17:44 np0005538513.novalocal sshd[19207]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:53 np0005538513.novalocal sshd[19207]: Invalid user cyber from 193.32.162.157 port 57738
Nov 28 07:17:55 np0005538513.novalocal sshd[19207]: Connection closed by invalid user cyber 193.32.162.157 port 57738 [preauth]
Nov 28 07:17:55 np0005538513.novalocal sshd[19209]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:08 np0005538513.novalocal sshd[19209]: Connection closed by authenticating user root 193.32.162.157 port 52778 [preauth]
Nov 28 07:18:08 np0005538513.novalocal sshd[19211]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:20 np0005538513.novalocal sshd[19211]: Connection closed by authenticating user root 193.32.162.157 port 42404 [preauth]
Nov 28 07:18:20 np0005538513.novalocal sshd[19214]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:30 np0005538513.novalocal sshd[19214]: Invalid user zabbix from 193.32.162.157 port 44570
Nov 28 07:18:32 np0005538513.novalocal sshd[19214]: Connection closed by invalid user zabbix 193.32.162.157 port 44570 [preauth]
Nov 28 07:18:32 np0005538513.novalocal sshd[19216]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:44 np0005538513.novalocal sshd[19216]: Connection closed by authenticating user root 193.32.162.157 port 43316 [preauth]
Nov 28 07:18:44 np0005538513.novalocal sshd[19218]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:54 np0005538513.novalocal sshd[19218]: Invalid user sam from 193.32.162.157 port 38940
Nov 28 07:18:56 np0005538513.novalocal sshd[19218]: Connection closed by invalid user sam 193.32.162.157 port 38940 [preauth]
Nov 28 07:18:56 np0005538513.novalocal sshd[19220]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:08 np0005538513.novalocal sshd[19220]: Connection closed by authenticating user root 193.32.162.157 port 35398 [preauth]
Nov 28 07:19:08 np0005538513.novalocal sshd[19222]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:20 np0005538513.novalocal sshd[19222]: Connection closed by authenticating user root 193.32.162.157 port 50386 [preauth]
Nov 28 07:19:21 np0005538513.novalocal sshd[19224]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:32 np0005538513.novalocal sshd[19224]: Connection closed by authenticating user root 193.32.162.157 port 46092 [preauth]
Nov 28 07:19:32 np0005538513.novalocal sshd[19226]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:39 np0005538513.novalocal sshd[19230]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:39 np0005538513.novalocal sshd[19230]: Accepted publickey for zuul from 38.102.83.114 port 33286 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:19:39 np0005538513.novalocal systemd-logind[764]: New session 10 of user zuul.
Nov 28 07:19:39 np0005538513.novalocal systemd[1]: Started Session 10 of User zuul.
Nov 28 07:19:39 np0005538513.novalocal sshd[19230]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:19:39 np0005538513.novalocal python3[19247]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:19:40 np0005538513.novalocal sudo[19265]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrlnsxeylxjnynkfbfsbhdburmkjdgip ; /usr/bin/python3
Nov 28 07:19:40 np0005538513.novalocal sudo[19265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:19:41 np0005538513.novalocal python3[19267]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:19:43 np0005538513.novalocal sudo[19265]: pam_unix(sudo:session): session closed for user root
Nov 28 07:19:44 np0005538513.novalocal sshd[19226]: Connection closed by authenticating user root 193.32.162.157 port 55116 [preauth]
Nov 28 07:19:44 np0005538513.novalocal sshd[19271]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:45 np0005538513.novalocal sudo[19285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xntlitiqbpylonplcganvinctfzpvaev ; /usr/bin/python3
Nov 28 07:19:45 np0005538513.novalocal sudo[19285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:19:46 np0005538513.novalocal python3[19287]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Nov 28 07:19:48 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:19:49 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:19:56 np0005538513.novalocal sshd[19271]: Connection closed by authenticating user root 193.32.162.157 port 33770 [preauth]
Nov 28 07:19:56 np0005538513.novalocal sshd[19481]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:08 np0005538513.novalocal sshd[19481]: Connection closed by authenticating user root 193.32.162.157 port 43798 [preauth]
Nov 28 07:20:08 np0005538513.novalocal sshd[19491]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:13 np0005538513.novalocal sudo[19285]: pam_unix(sudo:session): session closed for user root
Nov 28 07:20:20 np0005538513.novalocal sshd[19491]: Connection closed by authenticating user root 193.32.162.157 port 55574 [preauth]
Nov 28 07:20:20 np0005538513.novalocal sshd[19493]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:32 np0005538513.novalocal sshd[19493]: Connection closed by authenticating user root 193.32.162.157 port 42810 [preauth]
Nov 28 07:20:32 np0005538513.novalocal sshd[19495]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:43 np0005538513.novalocal sudo[19511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ampexddnbtvqdrenemaemggmgvkhgaqp ; /usr/bin/python3
Nov 28 07:20:43 np0005538513.novalocal sudo[19511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:20:43 np0005538513.novalocal python3[19513]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Nov 28 07:20:44 np0005538513.novalocal sshd[19495]: Connection closed by authenticating user root 193.32.162.157 port 39856 [preauth]
Nov 28 07:20:44 np0005538513.novalocal sshd[19515]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:46 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:20:46 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:20:48 np0005538513.novalocal sudo[19511]: pam_unix(sudo:session): session closed for user root
Nov 28 07:20:54 np0005538513.novalocal sudo[19654]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmwqdyxeuototrquoggvapdnohmzaogw ; /usr/bin/python3
Nov 28 07:20:54 np0005538513.novalocal sudo[19654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:20:54 np0005538513.novalocal python3[19656]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Nov 28 07:20:56 np0005538513.novalocal sshd[19515]: Connection closed by authenticating user root 193.32.162.157 port 49808 [preauth]
Nov 28 07:20:56 np0005538513.novalocal sshd[19659]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:57 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:20:57 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:02 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:02 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:08 np0005538513.novalocal sshd[19659]: Connection closed by authenticating user root 193.32.162.157 port 53320 [preauth]
Nov 28 07:21:08 np0005538513.novalocal sshd[19920]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:10 np0005538513.novalocal sudo[19654]: pam_unix(sudo:session): session closed for user root
Nov 28 07:21:17 np0005538513.novalocal sshd[19920]: Invalid user bot from 193.32.162.157 port 38612
Nov 28 07:21:20 np0005538513.novalocal sshd[19920]: Connection closed by invalid user bot 193.32.162.157 port 38612 [preauth]
Nov 28 07:21:20 np0005538513.novalocal sshd[19922]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:24 np0005538513.novalocal sudo[19937]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddvbbuokafrizgoseqxvogwoawilpijj ; /usr/bin/python3
Nov 28 07:21:24 np0005538513.novalocal sudo[19937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:21:25 np0005538513.novalocal python3[19939]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 28 07:21:28 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:28 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:32 np0005538513.novalocal sshd[19922]: Connection closed by authenticating user root 193.32.162.157 port 45758 [preauth]
Nov 28 07:21:32 np0005538513.novalocal sshd[20126]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:33 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:33 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:41 np0005538513.novalocal sudo[19937]: pam_unix(sudo:session): session closed for user root
Nov 28 07:21:44 np0005538513.novalocal sshd[20126]: Connection closed by authenticating user root 193.32.162.157 port 53284 [preauth]
Nov 28 07:21:44 np0005538513.novalocal sshd[20321]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:54 np0005538513.novalocal sshd[20321]: Invalid user nexus from 193.32.162.157 port 51624
Nov 28 07:21:56 np0005538513.novalocal sshd[20321]: Connection closed by invalid user nexus 193.32.162.157 port 51624 [preauth]
Nov 28 07:21:56 np0005538513.novalocal sshd[20323]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:57 np0005538513.novalocal sudo[20337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnfackjbpxbeilojlkdqcluaxbfwtdfj ; /usr/bin/python3
Nov 28 07:21:57 np0005538513.novalocal sudo[20337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:21:57 np0005538513.novalocal python3[20339]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 28 07:21:59 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:22:00 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:22:04 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:22:08 np0005538513.novalocal sshd[20323]: Connection closed by authenticating user root 193.32.162.157 port 35250 [preauth]
Nov 28 07:22:09 np0005538513.novalocal sshd[20597]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:12 np0005538513.novalocal sudo[20337]: pam_unix(sudo:session): session closed for user root
Nov 28 07:22:20 np0005538513.novalocal sshd[20597]: Connection closed by authenticating user root 193.32.162.157 port 44934 [preauth]
Nov 28 07:22:20 np0005538513.novalocal sshd[20605]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:30 np0005538513.novalocal sudo[20621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-meksfkodykofbhiuobemgjwodrbxafcv ; /usr/bin/python3
Nov 28 07:22:30 np0005538513.novalocal sudo[20621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:22:30 np0005538513.novalocal python3[20623]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:22:32 np0005538513.novalocal sshd[20605]: Connection closed by authenticating user root 193.32.162.157 port 59818 [preauth]
Nov 28 07:22:32 np0005538513.novalocal sudo[20621]: pam_unix(sudo:session): session closed for user root
Nov 28 07:22:32 np0005538513.novalocal sshd[20627]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:35 np0005538513.novalocal sudo[20642]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgbxynojvqumubnnmjhymovvegmcnpkl ; /usr/bin/python3
Nov 28 07:22:35 np0005538513.novalocal sudo[20642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:22:35 np0005538513.novalocal python3[20644]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:22:44 np0005538513.novalocal sshd[20627]: Connection closed by authenticating user root 193.32.162.157 port 55510 [preauth]
Nov 28 07:22:44 np0005538513.novalocal sshd[20691]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  Converting 486 SID table entries...
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:22:55 np0005538513.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:22:55 np0005538513.novalocal groupadd[20744]: group added to /etc/group: name=unbound, GID=987
Nov 28 07:22:55 np0005538513.novalocal groupadd[20744]: group added to /etc/gshadow: name=unbound
Nov 28 07:22:55 np0005538513.novalocal groupadd[20744]: new group: name=unbound, GID=987
Nov 28 07:22:55 np0005538513.novalocal useradd[20751]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Nov 28 07:22:55 np0005538513.novalocal dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Nov 28 07:22:55 np0005538513.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 28 07:22:55 np0005538513.novalocal groupadd[20764]: group added to /etc/group: name=openvswitch, GID=986
Nov 28 07:22:55 np0005538513.novalocal groupadd[20764]: group added to /etc/gshadow: name=openvswitch
Nov 28 07:22:55 np0005538513.novalocal groupadd[20764]: new group: name=openvswitch, GID=986
Nov 28 07:22:55 np0005538513.novalocal useradd[20771]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Nov 28 07:22:55 np0005538513.novalocal groupadd[20779]: group added to /etc/group: name=hugetlbfs, GID=985
Nov 28 07:22:55 np0005538513.novalocal groupadd[20779]: group added to /etc/gshadow: name=hugetlbfs
Nov 28 07:22:55 np0005538513.novalocal groupadd[20779]: new group: name=hugetlbfs, GID=985
Nov 28 07:22:56 np0005538513.novalocal usermod[20787]: add 'openvswitch' to group 'hugetlbfs'
Nov 28 07:22:56 np0005538513.novalocal usermod[20787]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 28 07:22:56 np0005538513.novalocal sshd[20691]: Connection closed by authenticating user root 193.32.162.157 port 56818 [preauth]
Nov 28 07:22:57 np0005538513.novalocal sshd[20806]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:59 np0005538513.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:22:59 np0005538513.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:22:59 np0005538513.novalocal systemd[1]: Reloading.
Nov 28 07:22:59 np0005538513.novalocal systemd-sysv-generator[21313]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:22:59 np0005538513.novalocal systemd-rc-local-generator[21307]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:22:59 np0005538513.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:22:59 np0005538513.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:23:00 np0005538513.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:23:00 np0005538513.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:23:00 np0005538513.novalocal systemd[1]: run-rff007e7a76b34c89b61f49a553509fca.service: Deactivated successfully.
Nov 28 07:23:01 np0005538513.novalocal anacron[6702]: Job `cron.daily' started
Nov 28 07:23:01 np0005538513.novalocal anacron[6702]: Job `cron.daily' terminated
Nov 28 07:23:01 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:23:01 np0005538513.novalocal sudo[20642]: pam_unix(sudo:session): session closed for user root
Nov 28 07:23:01 np0005538513.novalocal rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:23:09 np0005538513.novalocal sshd[20806]: Connection closed by authenticating user root 193.32.162.157 port 54030 [preauth]
Nov 28 07:23:09 np0005538513.novalocal sshd[21929]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:23:20 np0005538513.novalocal sshd[21929]: Connection closed by authenticating user root 193.32.162.157 port 40490 [preauth]
Nov 28 07:23:20 np0005538513.novalocal sshd[21931]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:23:27 np0005538513.novalocal sudo[21947]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyxndujctiviuyovtnqjmoyksiognmyr ; /usr/bin/python3
Nov 28 07:23:27 np0005538513.novalocal sudo[21947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:23:27 np0005538513.novalocal python3[21949]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:23:33 np0005538513.novalocal sshd[21931]: Connection closed by authenticating user root 193.32.162.157 port 42836 [preauth]
Nov 28 07:23:33 np0005538513.novalocal sshd[21953]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:23:41 np0005538513.novalocal sudo[21947]: pam_unix(sudo:session): session closed for user root
Nov 28 07:23:45 np0005538513.novalocal sshd[21953]: Connection closed by authenticating user root 193.32.162.157 port 55760 [preauth]
Nov 28 07:23:45 np0005538513.novalocal sshd[21956]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:23:57 np0005538513.novalocal sshd[21956]: Connection closed by authenticating user root 193.32.162.157 port 42788 [preauth]
Nov 28 07:23:57 np0005538513.novalocal sshd[21958]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:23:58 np0005538513.novalocal sudo[21972]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrtzrmtkowvyedznwrguhbwabskomqsh ; /usr/bin/python3
Nov 28 07:23:58 np0005538513.novalocal sudo[21972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:23:58 np0005538513.novalocal python3[21974]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:23:58 np0005538513.novalocal sudo[21972]: pam_unix(sudo:session): session closed for user root
Nov 28 07:23:59 np0005538513.novalocal sudo[22020]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vaqavjlsgmoniwcuxrqdofciisegjpud ; /usr/bin/python3
Nov 28 07:23:59 np0005538513.novalocal sudo[22020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:23:59 np0005538513.novalocal python3[22022]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:23:59 np0005538513.novalocal sudo[22020]: pam_unix(sudo:session): session closed for user root
Nov 28 07:23:59 np0005538513.novalocal sudo[22063]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttrcbptpyxzqvdbtrgoewxhkqwpnlbec ; /usr/bin/python3
Nov 28 07:23:59 np0005538513.novalocal sudo[22063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:23:59 np0005538513.novalocal python3[22065]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764314638.924288-290-148855971992213/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:23:59 np0005538513.novalocal sudo[22063]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:01 np0005538513.novalocal sudo[22094]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rodqfkskgumzkchcxwjbruvjyfkxzmwu ; /usr/bin/python3
Nov 28 07:24:01 np0005538513.novalocal sudo[22094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:01 np0005538513.novalocal python3[22096]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:01 np0005538513.novalocal sudo[22094]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:01 np0005538513.novalocal systemd-journald[618]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Nov 28 07:24:01 np0005538513.novalocal systemd-journald[618]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 07:24:01 np0005538513.novalocal rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:24:01 np0005538513.novalocal rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:24:01 np0005538513.novalocal sudo[22115]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvmvlfpqonrqincdssghhvdcftaoannw ; /usr/bin/python3
Nov 28 07:24:01 np0005538513.novalocal sudo[22115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:01 np0005538513.novalocal python3[22117]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:01 np0005538513.novalocal sudo[22115]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:01 np0005538513.novalocal sudo[22135]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iudkrkhxtpfacyacuhqpwfplxlpkgiqz ; /usr/bin/python3
Nov 28 07:24:01 np0005538513.novalocal sudo[22135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:01 np0005538513.novalocal python3[22137]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:01 np0005538513.novalocal sudo[22135]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:01 np0005538513.novalocal sudo[22155]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peyryrbjaobzfwaiqwigyemrwdavjrai ; /usr/bin/python3
Nov 28 07:24:02 np0005538513.novalocal sudo[22155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:02 np0005538513.novalocal python3[22157]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:03 np0005538513.novalocal sudo[22155]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:03 np0005538513.novalocal sudo[22175]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydscdyvrgvpwrvbdbrdjjjuqzkfnskch ; /usr/bin/python3
Nov 28 07:24:03 np0005538513.novalocal sudo[22175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:03 np0005538513.novalocal python3[22177]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:03 np0005538513.novalocal sudo[22175]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:05 np0005538513.novalocal sudo[22195]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iznunncjfmoyxlpejbzqbylkcsqxvhxn ; /usr/bin/python3
Nov 28 07:24:05 np0005538513.novalocal sudo[22195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:05 np0005538513.novalocal python3[22197]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:24:05 np0005538513.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Nov 28 07:24:05 np0005538513.novalocal network[22200]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:05 np0005538513.novalocal network[22211]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:05 np0005538513.novalocal network[22200]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:05 np0005538513.novalocal network[22212]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:05 np0005538513.novalocal network[22200]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 07:24:05 np0005538513.novalocal network[22213]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 07:24:05 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314645.8113] audit: op="connections-reload" pid=22241 uid=0 result="success"
Nov 28 07:24:05 np0005538513.novalocal network[22200]: Bringing up loopback interface:  [  OK  ]
Nov 28 07:24:06 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314646.0274] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22329 uid=0 result="success"
Nov 28 07:24:06 np0005538513.novalocal network[22200]: Bringing up interface eth0:  [  OK  ]
Nov 28 07:24:06 np0005538513.novalocal systemd[1]: Started LSB: Bring up/down networking.
Nov 28 07:24:06 np0005538513.novalocal sudo[22195]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:06 np0005538513.novalocal sudo[22368]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbqxnqenfgxwduqtsdrnjfflwiiajeky ; /usr/bin/python3
Nov 28 07:24:06 np0005538513.novalocal sudo[22368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:06 np0005538513.novalocal python3[22370]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:24:07 np0005538513.novalocal sshd[21958]: Invalid user ansadmin from 193.32.162.157 port 60318
Nov 28 07:24:07 np0005538513.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Nov 28 07:24:07 np0005538513.novalocal chown[22374]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 28 07:24:07 np0005538513.novalocal ovs-ctl[22379]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 28 07:24:07 np0005538513.novalocal ovs-ctl[22379]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 28 07:24:07 np0005538513.novalocal ovs-ctl[22379]: Starting ovsdb-server [  OK  ]
Nov 28 07:24:07 np0005538513.novalocal ovs-vsctl[22428]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 28 07:24:07 np0005538513.novalocal ovs-vsctl[22448]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"c85299c6-8e38-42c8-8509-2eaaf15c050c\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Nov 28 07:24:07 np0005538513.novalocal ovs-ctl[22379]: Configuring Open vSwitch system IDs [  OK  ]
Nov 28 07:24:07 np0005538513.novalocal ovs-ctl[22379]: Enabling remote OVSDB managers [  OK  ]
Nov 28 07:24:07 np0005538513.novalocal systemd[1]: Started Open vSwitch Database Unit.
Nov 28 07:24:07 np0005538513.novalocal ovs-vsctl[22454]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005538513.novalocal
Nov 28 07:24:07 np0005538513.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 28 07:24:07 np0005538513.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 28 07:24:07 np0005538513.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 28 07:24:08 np0005538513.novalocal kernel: openvswitch: Open vSwitch switching datapath
Nov 28 07:24:08 np0005538513.novalocal ovs-ctl[22498]: Inserting openvswitch module [  OK  ]
Nov 28 07:24:08 np0005538513.novalocal ovs-ctl[22467]: Starting ovs-vswitchd [  OK  ]
Nov 28 07:24:08 np0005538513.novalocal ovs-vsctl[22515]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005538513.novalocal
Nov 28 07:24:08 np0005538513.novalocal ovs-ctl[22467]: Enabling remote OVSDB managers [  OK  ]
Nov 28 07:24:08 np0005538513.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 28 07:24:08 np0005538513.novalocal systemd[1]: Starting Open vSwitch...
Nov 28 07:24:08 np0005538513.novalocal systemd[1]: Finished Open vSwitch.
Nov 28 07:24:08 np0005538513.novalocal sudo[22368]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:09 np0005538513.novalocal sudo[22533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpqmtimdlymkxsokxjdpgpkhhgeofxzp ; /usr/bin/python3
Nov 28 07:24:09 np0005538513.novalocal sudo[22533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:09 np0005538513.novalocal sshd[21958]: Connection closed by invalid user ansadmin 193.32.162.157 port 60318 [preauth]
Nov 28 07:24:09 np0005538513.novalocal sshd[22536]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:24:09 np0005538513.novalocal python3[22535]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:24:10 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314650.5745] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22694 uid=0 result="success"
Nov 28 07:24:10 np0005538513.novalocal ifup[22695]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:10 np0005538513.novalocal ifup[22696]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:10 np0005538513.novalocal ifup[22697]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:10 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314650.6071] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22703 uid=0 result="success"
Nov 28 07:24:10 np0005538513.novalocal ovs-vsctl[22705]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:11:ad:79 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Nov 28 07:24:10 np0005538513.novalocal kernel: device ovs-system entered promiscuous mode
Nov 28 07:24:10 np0005538513.novalocal systemd-udevd[22439]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:10 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314650.6362] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Nov 28 07:24:10 np0005538513.novalocal kernel: Timeout policy base is empty
Nov 28 07:24:10 np0005538513.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Nov 28 07:24:10 np0005538513.novalocal kernel: device br-ex entered promiscuous mode
Nov 28 07:24:10 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314650.6852] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Nov 28 07:24:10 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314650.7115] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22730 uid=0 result="success"
Nov 28 07:24:10 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314650.7316] device (br-ex): carrier: link connected
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.7930] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22760 uid=0 result="success"
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.8399] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22775 uid=0 result="success"
Nov 28 07:24:13 np0005538513.novalocal NET[22800]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.9280] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.9422] dhcp4 (eth1): canceled DHCP transaction
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.9423] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.9423] dhcp4 (eth1): state changed no lease
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.9470] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22809 uid=0 result="success"
Nov 28 07:24:13 np0005538513.novalocal ifup[22810]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:13 np0005538513.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 07:24:13 np0005538513.novalocal ifup[22811]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:13 np0005538513.novalocal ifup[22813]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:13 np0005538513.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 07:24:13 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314653.9860] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22828 uid=0 result="success"
Nov 28 07:24:14 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314654.0526] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22837 uid=0 result="success"
Nov 28 07:24:14 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314654.0592] device (eth1): carrier: link connected
Nov 28 07:24:14 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314654.0815] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22846 uid=0 result="success"
Nov 28 07:24:14 np0005538513.novalocal ipv6_wait_tentative[22858]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 28 07:24:15 np0005538513.novalocal ipv6_wait_tentative[22863]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.1526] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22872 uid=0 result="success"
Nov 28 07:24:16 np0005538513.novalocal ovs-vsctl[22887]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Nov 28 07:24:16 np0005538513.novalocal kernel: device eth1 entered promiscuous mode
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.2278] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22895 uid=0 result="success"
Nov 28 07:24:16 np0005538513.novalocal ifup[22896]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:16 np0005538513.novalocal ifup[22897]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:16 np0005538513.novalocal ifup[22898]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.2593] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22904 uid=0 result="success"
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.3010] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22914 uid=0 result="success"
Nov 28 07:24:16 np0005538513.novalocal ifup[22915]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:16 np0005538513.novalocal ifup[22916]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:16 np0005538513.novalocal ifup[22917]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.3321] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22923 uid=0 result="success"
Nov 28 07:24:16 np0005538513.novalocal ovs-vsctl[22926]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 28 07:24:16 np0005538513.novalocal kernel: device vlan20 entered promiscuous mode
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.3747] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Nov 28 07:24:16 np0005538513.novalocal systemd-udevd[22928]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.3990] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22937 uid=0 result="success"
Nov 28 07:24:16 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314656.4204] device (vlan20): carrier: link connected
Nov 28 07:24:19 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314659.4736] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22967 uid=0 result="success"
Nov 28 07:24:19 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314659.5240] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22982 uid=0 result="success"
Nov 28 07:24:19 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314659.5836] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23003 uid=0 result="success"
Nov 28 07:24:19 np0005538513.novalocal ifup[23004]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:19 np0005538513.novalocal ifup[23005]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:19 np0005538513.novalocal ifup[23006]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:19 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314659.6146] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23012 uid=0 result="success"
Nov 28 07:24:19 np0005538513.novalocal ovs-vsctl[23015]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 28 07:24:19 np0005538513.novalocal systemd-udevd[23017]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:19 np0005538513.novalocal kernel: device vlan23 entered promiscuous mode
Nov 28 07:24:19 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314659.6559] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Nov 28 07:24:19 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314659.6821] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23027 uid=0 result="success"
Nov 28 07:24:19 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314659.7039] device (vlan23): carrier: link connected
Nov 28 07:24:22 np0005538513.novalocal sshd[22536]: Connection closed by authenticating user root 193.32.162.157 port 47870 [preauth]
Nov 28 07:24:22 np0005538513.novalocal sshd[23048]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:24:22 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314662.7644] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23058 uid=0 result="success"
Nov 28 07:24:22 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314662.8115] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23073 uid=0 result="success"
Nov 28 07:24:22 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314662.8999] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23094 uid=0 result="success"
Nov 28 07:24:22 np0005538513.novalocal ifup[23095]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:22 np0005538513.novalocal ifup[23096]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:22 np0005538513.novalocal ifup[23097]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:22 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314662.9347] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23103 uid=0 result="success"
Nov 28 07:24:22 np0005538513.novalocal ovs-vsctl[23106]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 28 07:24:22 np0005538513.novalocal kernel: device vlan21 entered promiscuous mode
Nov 28 07:24:22 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314662.9676] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Nov 28 07:24:22 np0005538513.novalocal systemd-udevd[23108]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:22 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314662.9960] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23118 uid=0 result="success"
Nov 28 07:24:23 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314663.0188] device (vlan21): carrier: link connected
Nov 28 07:24:23 np0005538513.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 07:24:26 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314666.0709] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23149 uid=0 result="success"
Nov 28 07:24:26 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314666.1187] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23164 uid=0 result="success"
Nov 28 07:24:26 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314666.1835] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23185 uid=0 result="success"
Nov 28 07:24:26 np0005538513.novalocal ifup[23186]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:26 np0005538513.novalocal ifup[23187]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:26 np0005538513.novalocal ifup[23188]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:26 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314666.2222] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23194 uid=0 result="success"
Nov 28 07:24:26 np0005538513.novalocal ovs-vsctl[23197]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 28 07:24:26 np0005538513.novalocal kernel: device vlan22 entered promiscuous mode
Nov 28 07:24:26 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314666.2636] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Nov 28 07:24:26 np0005538513.novalocal systemd-udevd[23199]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:26 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314666.2933] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23209 uid=0 result="success"
Nov 28 07:24:26 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314666.3141] device (vlan22): carrier: link connected
Nov 28 07:24:29 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314669.3601] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23239 uid=0 result="success"
Nov 28 07:24:29 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314669.4064] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23254 uid=0 result="success"
Nov 28 07:24:29 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314669.4603] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23275 uid=0 result="success"
Nov 28 07:24:29 np0005538513.novalocal ifup[23276]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:29 np0005538513.novalocal ifup[23277]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:29 np0005538513.novalocal ifup[23278]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:29 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314669.4905] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23284 uid=0 result="success"
Nov 28 07:24:29 np0005538513.novalocal ovs-vsctl[23287]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 28 07:24:29 np0005538513.novalocal kernel: device vlan44 entered promiscuous mode
Nov 28 07:24:29 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314669.5199] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Nov 28 07:24:29 np0005538513.novalocal systemd-udevd[23290]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:29 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314669.5390] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23299 uid=0 result="success"
Nov 28 07:24:29 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314669.5556] device (vlan44): carrier: link connected
Nov 28 07:24:32 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314672.6068] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23329 uid=0 result="success"
Nov 28 07:24:32 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314672.6544] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23344 uid=0 result="success"
Nov 28 07:24:32 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314672.7155] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23365 uid=0 result="success"
Nov 28 07:24:32 np0005538513.novalocal ifup[23366]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:32 np0005538513.novalocal ifup[23367]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:32 np0005538513.novalocal ifup[23368]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:32 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314672.7470] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23374 uid=0 result="success"
Nov 28 07:24:32 np0005538513.novalocal ovs-vsctl[23377]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 28 07:24:32 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314672.8010] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23384 uid=0 result="success"
Nov 28 07:24:33 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314673.8562] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23411 uid=0 result="success"
Nov 28 07:24:33 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314673.9006] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23426 uid=0 result="success"
Nov 28 07:24:33 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314673.9591] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23447 uid=0 result="success"
Nov 28 07:24:33 np0005538513.novalocal ifup[23448]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:33 np0005538513.novalocal ifup[23449]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:33 np0005538513.novalocal ifup[23450]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:33 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314673.9906] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23456 uid=0 result="success"
Nov 28 07:24:34 np0005538513.novalocal ovs-vsctl[23459]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 28 07:24:34 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314674.0627] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23466 uid=0 result="success"
Nov 28 07:24:34 np0005538513.novalocal sshd[23048]: Connection closed by authenticating user root 193.32.162.157 port 55034 [preauth]
Nov 28 07:24:34 np0005538513.novalocal sshd[23484]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:24:35 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314675.1236] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23495 uid=0 result="success"
Nov 28 07:24:35 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314675.1693] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23510 uid=0 result="success"
Nov 28 07:24:35 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314675.2299] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23531 uid=0 result="success"
Nov 28 07:24:35 np0005538513.novalocal ifup[23532]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:35 np0005538513.novalocal ifup[23533]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:35 np0005538513.novalocal ifup[23534]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:35 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314675.2651] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23540 uid=0 result="success"
Nov 28 07:24:35 np0005538513.novalocal ovs-vsctl[23543]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 28 07:24:35 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314675.3239] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23550 uid=0 result="success"
Nov 28 07:24:36 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314676.3797] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23578 uid=0 result="success"
Nov 28 07:24:36 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314676.4269] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23593 uid=0 result="success"
Nov 28 07:24:36 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314676.4950] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23614 uid=0 result="success"
Nov 28 07:24:36 np0005538513.novalocal ifup[23616]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:36 np0005538513.novalocal ifup[23617]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:36 np0005538513.novalocal ifup[23618]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:36 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314676.5306] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23624 uid=0 result="success"
Nov 28 07:24:36 np0005538513.novalocal ovs-vsctl[23627]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 28 07:24:36 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314676.5889] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23634 uid=0 result="success"
Nov 28 07:24:37 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314677.6500] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23662 uid=0 result="success"
Nov 28 07:24:37 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314677.6979] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23677 uid=0 result="success"
Nov 28 07:24:37 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314677.7654] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23698 uid=0 result="success"
Nov 28 07:24:37 np0005538513.novalocal ifup[23699]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:37 np0005538513.novalocal ifup[23700]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:37 np0005538513.novalocal ifup[23701]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:37 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314677.7999] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23707 uid=0 result="success"
Nov 28 07:24:37 np0005538513.novalocal ovs-vsctl[23710]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 28 07:24:37 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314677.8609] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23717 uid=0 result="success"
Nov 28 07:24:38 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314678.9278] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23745 uid=0 result="success"
Nov 28 07:24:38 np0005538513.novalocal NetworkManager[5967]: <info>  [1764314678.9792] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23760 uid=0 result="success"
Nov 28 07:24:39 np0005538513.novalocal sudo[22533]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:46 np0005538513.novalocal sshd[23484]: Connection closed by authenticating user root 193.32.162.157 port 40776 [preauth]
Nov 28 07:24:46 np0005538513.novalocal sshd[23778]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:24:58 np0005538513.novalocal sshd[23778]: Connection closed by authenticating user root 193.32.162.157 port 45214 [preauth]
Nov 28 07:24:58 np0005538513.novalocal sshd[23780]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:25:10 np0005538513.novalocal sshd[23780]: Connection closed by authenticating user root 193.32.162.157 port 36190 [preauth]
Nov 28 07:25:10 np0005538513.novalocal sshd[23782]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:25:22 np0005538513.novalocal sshd[23782]: Connection closed by authenticating user root 193.32.162.157 port 52936 [preauth]
Nov 28 07:25:22 np0005538513.novalocal sshd[23784]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:25:32 np0005538513.novalocal python3[23800]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:25:35 np0005538513.novalocal sshd[23784]: Connection closed by authenticating user root 193.32.162.157 port 42208 [preauth]
Nov 28 07:25:35 np0005538513.novalocal sshd[23806]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:25:37 np0005538513.novalocal python3[23822]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:38 np0005538513.novalocal sudo[23836]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glpzpnkihvgktwsdqpmdfvomgjocnhpe ; /usr/bin/python3
Nov 28 07:25:38 np0005538513.novalocal sudo[23836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:25:38 np0005538513.novalocal python3[23838]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:38 np0005538513.novalocal sudo[23836]: pam_unix(sudo:session): session closed for user root
Nov 28 07:25:39 np0005538513.novalocal python3[23852]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:39 np0005538513.novalocal sudo[23866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvlzdigbknyeyayjaugsbazryfinqyrw ; /usr/bin/python3
Nov 28 07:25:39 np0005538513.novalocal sudo[23866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:25:40 np0005538513.novalocal python3[23868]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:40 np0005538513.novalocal sudo[23866]: pam_unix(sudo:session): session closed for user root
Nov 28 07:25:41 np0005538513.novalocal python3[23882]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Nov 28 07:25:41 np0005538513.novalocal python3[23897]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005538513.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:25:42 np0005538513.novalocal sudo[23915]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpdclolckwowlobdvuxlrqzejfkausme ; /usr/bin/python3
Nov 28 07:25:42 np0005538513.novalocal sudo[23915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:25:42 np0005538513.novalocal python3[23917]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:25:42 np0005538513.novalocal systemd[1]: Starting Hostname Service...
Nov 28 07:25:42 np0005538513.novalocal systemd[1]: Started Hostname Service.
Nov 28 07:25:42 np0005538513.localdomain systemd-hostnamed[23921]: Hostname set to <np0005538513.localdomain> (static)
Nov 28 07:25:42 np0005538513.localdomain NetworkManager[5967]: <info>  [1764314742.8166] hostname: static hostname changed from "np0005538513.novalocal" to "np0005538513.localdomain"
Nov 28 07:25:42 np0005538513.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 07:25:42 np0005538513.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 07:25:42 np0005538513.localdomain sudo[23915]: pam_unix(sudo:session): session closed for user root
Nov 28 07:25:44 np0005538513.localdomain sshd[19230]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:25:44 np0005538513.localdomain systemd[1]: session-10.scope: Deactivated successfully.
Nov 28 07:25:44 np0005538513.localdomain systemd[1]: session-10.scope: Consumed 1min 44.334s CPU time.
Nov 28 07:25:44 np0005538513.localdomain systemd-logind[764]: Session 10 logged out. Waiting for processes to exit.
Nov 28 07:25:44 np0005538513.localdomain systemd-logind[764]: Removed session 10.
Nov 28 07:25:46 np0005538513.localdomain sshd[23932]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:25:46 np0005538513.localdomain sshd[23932]: Accepted publickey for zuul from 38.102.83.114 port 34202 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:25:46 np0005538513.localdomain systemd-logind[764]: New session 11 of user zuul.
Nov 28 07:25:46 np0005538513.localdomain systemd[1]: Started Session 11 of User zuul.
Nov 28 07:25:46 np0005538513.localdomain sshd[23932]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:25:47 np0005538513.localdomain python3[23949]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 28 07:25:47 np0005538513.localdomain sshd[23806]: Connection closed by authenticating user root 193.32.162.157 port 36320 [preauth]
Nov 28 07:25:47 np0005538513.localdomain sshd[23950]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:25:49 np0005538513.localdomain sshd[23932]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:25:49 np0005538513.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Nov 28 07:25:49 np0005538513.localdomain systemd-logind[764]: Session 11 logged out. Waiting for processes to exit.
Nov 28 07:25:49 np0005538513.localdomain systemd-logind[764]: Removed session 11.
Nov 28 07:25:52 np0005538513.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 07:25:58 np0005538513.localdomain sshd[23950]: Connection closed by authenticating user root 193.32.162.157 port 38240 [preauth]
Nov 28 07:25:59 np0005538513.localdomain sshd[23953]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:08 np0005538513.localdomain sshd[23953]: Invalid user ubuntu from 193.32.162.157 port 42548
Nov 28 07:26:11 np0005538513.localdomain sshd[23953]: Connection closed by invalid user ubuntu 193.32.162.157 port 42548 [preauth]
Nov 28 07:26:11 np0005538513.localdomain sshd[23955]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:12 np0005538513.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 07:26:23 np0005538513.localdomain sshd[23955]: Connection closed by authenticating user root 193.32.162.157 port 36226 [preauth]
Nov 28 07:26:23 np0005538513.localdomain sshd[23959]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:26 np0005538513.localdomain sshd[23961]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:26 np0005538513.localdomain sshd[23961]: Accepted publickey for zuul from 38.102.83.114 port 47074 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:26:26 np0005538513.localdomain systemd-logind[764]: New session 12 of user zuul.
Nov 28 07:26:26 np0005538513.localdomain systemd[1]: Started Session 12 of User zuul.
Nov 28 07:26:26 np0005538513.localdomain sshd[23961]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:26:26 np0005538513.localdomain sudo[23978]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvxapcbmsirkvyuucmlndpcfuuuxookf ; /usr/bin/python3
Nov 28 07:26:26 np0005538513.localdomain sudo[23978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:26:26 np0005538513.localdomain python3[23980]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:26:28 np0005538513.localdomain sshd[23982]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:28 np0005538513.localdomain sshd[23983]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:28 np0005538513.localdomain sshd[23983]: error: kex_exchange_identification: read: Connection reset by peer
Nov 28 07:26:28 np0005538513.localdomain sshd[23983]: Connection reset by 45.140.17.97 port 23536
Nov 28 07:26:29 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:26:30 np0005538513.localdomain systemd-sysv-generator[24027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:30 np0005538513.localdomain systemd-rc-local-generator[24022]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: Starting dnf makecache...
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:26:30 np0005538513.localdomain dnf[24037]: Updating Subscription Management repositories.
Nov 28 07:26:30 np0005538513.localdomain systemd-sysv-generator[24070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:30 np0005538513.localdomain systemd-rc-local-generator[24067]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:26:30 np0005538513.localdomain systemd-rc-local-generator[24103]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:30 np0005538513.localdomain systemd-sysv-generator[24106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:30 np0005538513.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:26:31 np0005538513.localdomain systemd-sysv-generator[24171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:31 np0005538513.localdomain systemd-rc-local-generator[24165]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: run-r3770fc8bd18e41ec8a8f6bcf3d2a9a9c.service: Deactivated successfully.
Nov 28 07:26:31 np0005538513.localdomain systemd[1]: run-r3f41cd1c2eb74830b43e7fd95f8b8cc3.service: Deactivated successfully.
Nov 28 07:26:32 np0005538513.localdomain dnf[24037]: Failed determining last makecache time.
Nov 28 07:26:32 np0005538513.localdomain dnf[24037]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_  51 kB/s | 4.0 kB     00:00
Nov 28 07:26:32 np0005538513.localdomain dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   39 kB/s | 4.1 kB     00:00
Nov 28 07:26:32 np0005538513.localdomain sudo[23978]: pam_unix(sudo:session): session closed for user root
Nov 28 07:26:32 np0005538513.localdomain dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   41 kB/s | 4.1 kB     00:00
Nov 28 07:26:32 np0005538513.localdomain dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  46 kB/s | 4.5 kB     00:00
Nov 28 07:26:32 np0005538513.localdomain sshd[23959]: Invalid user sftp from 193.32.162.157 port 36718
Nov 28 07:26:32 np0005538513.localdomain dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  42 kB/s | 4.5 kB     00:00
Nov 28 07:26:33 np0005538513.localdomain dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - High Av  37 kB/s | 4.0 kB     00:00
Nov 28 07:26:33 np0005538513.localdomain dnf[24037]: Fast Datapath for RHEL 9 x86_64 (RPMs)           42 kB/s | 4.0 kB     00:00
Nov 28 07:26:33 np0005538513.localdomain dnf[24037]: Metadata cache created.
Nov 28 07:26:33 np0005538513.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 07:26:33 np0005538513.localdomain systemd[1]: Finished dnf makecache.
Nov 28 07:26:33 np0005538513.localdomain systemd[1]: dnf-makecache.service: Consumed 2.705s CPU time.
Nov 28 07:26:35 np0005538513.localdomain sshd[23959]: Connection closed by invalid user sftp 193.32.162.157 port 36718 [preauth]
Nov 28 07:26:35 np0005538513.localdomain sshd[24764]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:47 np0005538513.localdomain sshd[24764]: Connection closed by authenticating user root 193.32.162.157 port 59910 [preauth]
Nov 28 07:26:47 np0005538513.localdomain sshd[24766]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:59 np0005538513.localdomain sshd[24766]: Connection closed by authenticating user root 193.32.162.157 port 47628 [preauth]
Nov 28 07:26:59 np0005538513.localdomain sshd[24768]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:27:11 np0005538513.localdomain sshd[24768]: Connection closed by authenticating user root 193.32.162.157 port 49928 [preauth]
Nov 28 07:27:32 np0005538513.localdomain sshd[23964]: Received disconnect from 38.102.83.114 port 47074:11: disconnected by user
Nov 28 07:27:32 np0005538513.localdomain sshd[23964]: Disconnected from user zuul 38.102.83.114 port 47074
Nov 28 07:27:32 np0005538513.localdomain sshd[23961]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:27:32 np0005538513.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Nov 28 07:27:32 np0005538513.localdomain systemd[1]: session-12.scope: Consumed 4.535s CPU time.
Nov 28 07:27:32 np0005538513.localdomain systemd-logind[764]: Session 12 logged out. Waiting for processes to exit.
Nov 28 07:27:32 np0005538513.localdomain systemd-logind[764]: Removed session 12.
Nov 28 07:39:19 np0005538513.localdomain sshd[24774]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:19 np0005538513.localdomain sshd[24774]: error: kex_exchange_identification: banner line contains invalid characters
Nov 28 07:39:19 np0005538513.localdomain sshd[24774]: banner exchange: Connection from 152.32.131.245 port 60264: invalid format
Nov 28 07:39:20 np0005538513.localdomain sshd[24775]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:38 np0005538513.localdomain sshd[24775]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 07:39:38 np0005538513.localdomain sshd[24775]: Connection closed by 152.32.131.245 port 48866
Nov 28 07:39:38 np0005538513.localdomain sshd[24776]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:39 np0005538513.localdomain sshd[24776]: fatal: mm_answer_sign: sign: error in libcrypto
Nov 28 07:39:39 np0005538513.localdomain sshd[24778]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:39 np0005538513.localdomain sshd[24778]: error: Protocol major versions differ: 2 vs. 1
Nov 28 07:39:39 np0005538513.localdomain sshd[24778]: banner exchange: Connection from 152.32.131.245 port 52016: could not read protocol version
Nov 28 07:40:18 np0005538513.localdomain sshd[24779]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:40:19 np0005538513.localdomain sshd[24779]: Received disconnect from 193.46.255.244 port 34730:11:  [preauth]
Nov 28 07:40:19 np0005538513.localdomain sshd[24779]: Disconnected from authenticating user root 193.46.255.244 port 34730 [preauth]
Nov 28 07:43:01 np0005538513.localdomain anacron[6702]: Job `cron.weekly' started
Nov 28 07:43:01 np0005538513.localdomain anacron[6702]: Job `cron.weekly' terminated
Nov 28 07:43:42 np0005538513.localdomain sshd[24784]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:43:42 np0005538513.localdomain sshd[24784]: Accepted publickey for zuul from 192.168.122.100 port 55686 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:43:42 np0005538513.localdomain systemd-logind[764]: New session 13 of user zuul.
Nov 28 07:43:42 np0005538513.localdomain systemd[1]: Started Session 13 of User zuul.
Nov 28 07:43:42 np0005538513.localdomain sshd[24784]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:43:43 np0005538513.localdomain sudo[24830]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcrjwntgktgrwyhxomvcfuiodydjmbqm ; /usr/bin/python3
Nov 28 07:43:43 np0005538513.localdomain sudo[24830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:43 np0005538513.localdomain python3[24832]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:43:44 np0005538513.localdomain sudo[24830]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:45 np0005538513.localdomain sudo[24916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riyxusjpjnmcvwjdxnppfkmojwjvnziq ; /usr/bin/python3
Nov 28 07:43:45 np0005538513.localdomain sudo[24916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:45 np0005538513.localdomain python3[24918]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:43:47 np0005538513.localdomain sudo[24916]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:48 np0005538513.localdomain sudo[24933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpvnecqweuajlueeoqfiaqiozrfgaiie ; /usr/bin/python3
Nov 28 07:43:48 np0005538513.localdomain sudo[24933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:48 np0005538513.localdomain python3[24935]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:43:48 np0005538513.localdomain sudo[24933]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:48 np0005538513.localdomain sudo[24951]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnzqgwdzbnpeftbmnhkpzbjrnclphiov ; /usr/bin/python3
Nov 28 07:43:48 np0005538513.localdomain sudo[24951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:48 np0005538513.localdomain python3[24953]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:49 np0005538513.localdomain kernel: loop: module loaded
Nov 28 07:43:49 np0005538513.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Nov 28 07:43:49 np0005538513.localdomain sudo[24951]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:49 np0005538513.localdomain sudo[24976]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fllhbcifxbfyzqdtgosbvutfnpcsathv ; /usr/bin/python3
Nov 28 07:43:49 np0005538513.localdomain sudo[24976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:49 np0005538513.localdomain python3[24978]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:49 np0005538513.localdomain lvm[24981]: PV /dev/loop3 not used.
Nov 28 07:43:49 np0005538513.localdomain lvm[24983]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 07:43:49 np0005538513.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 28 07:43:49 np0005538513.localdomain lvm[24992]:   1 logical volume(s) in volume group "ceph_vg0" now active
Nov 28 07:43:49 np0005538513.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 28 07:43:49 np0005538513.localdomain sudo[24976]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:50 np0005538513.localdomain sudo[25039]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voclqzhdjklxcedqubmqglobxzdsdzer ; /usr/bin/python3
Nov 28 07:43:50 np0005538513.localdomain sudo[25039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:50 np0005538513.localdomain python3[25041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:43:50 np0005538513.localdomain sudo[25039]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:50 np0005538513.localdomain sudo[25082]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrpoixaywkcogxwlefhbgfvhbcyiudyt ; /usr/bin/python3
Nov 28 07:43:50 np0005538513.localdomain sudo[25082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:50 np0005538513.localdomain python3[25084]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315830.0177555-54708-186905177275905/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:43:50 np0005538513.localdomain sudo[25082]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:51 np0005538513.localdomain sudo[25112]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfdphddpwphndvbtpialcxxqdaekgqry ; /usr/bin/python3
Nov 28 07:43:51 np0005538513.localdomain sudo[25112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:51 np0005538513.localdomain python3[25114]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:43:51 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:43:51 np0005538513.localdomain systemd-rc-local-generator[25143]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:43:51 np0005538513.localdomain systemd-sysv-generator[25147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:43:51 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:43:51 np0005538513.localdomain systemd[1]: Starting Ceph OSD losetup...
Nov 28 07:43:51 np0005538513.localdomain bash[25155]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Nov 28 07:43:51 np0005538513.localdomain systemd[1]: Finished Ceph OSD losetup.
Nov 28 07:43:52 np0005538513.localdomain lvm[25156]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 07:43:52 np0005538513.localdomain lvm[25156]: VG ceph_vg0 finished
Nov 28 07:43:52 np0005538513.localdomain sudo[25112]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:52 np0005538513.localdomain sudo[25171]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzbepwhzpwveddunvnzydkendoftqbai ; /usr/bin/python3
Nov 28 07:43:52 np0005538513.localdomain sudo[25171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:52 np0005538513.localdomain python3[25173]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:43:55 np0005538513.localdomain sudo[25171]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:55 np0005538513.localdomain sudo[25188]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmsswmggqasbynonncvyuuutpqifgcrr ; /usr/bin/python3
Nov 28 07:43:55 np0005538513.localdomain sudo[25188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:55 np0005538513.localdomain python3[25190]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:43:55 np0005538513.localdomain sudo[25188]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:55 np0005538513.localdomain sudo[25204]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqweptgxacptmfelwboktsmvrzmatdln ; /usr/bin/python3
Nov 28 07:43:55 np0005538513.localdomain sudo[25204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:56 np0005538513.localdomain python3[25206]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:56 np0005538513.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Nov 28 07:43:56 np0005538513.localdomain sudo[25204]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:56 np0005538513.localdomain sudo[25226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljrhwnzinzvftdoflfheydlzfzlgzykr ; /usr/bin/python3
Nov 28 07:43:56 np0005538513.localdomain sudo[25226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:56 np0005538513.localdomain python3[25228]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:56 np0005538513.localdomain lvm[25231]: PV /dev/loop4 not used.
Nov 28 07:43:56 np0005538513.localdomain lvm[25233]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:43:56 np0005538513.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Nov 28 07:43:56 np0005538513.localdomain lvm[25239]:   1 logical volume(s) in volume group "ceph_vg1" now active
Nov 28 07:43:56 np0005538513.localdomain lvm[25244]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:43:56 np0005538513.localdomain lvm[25244]: VG ceph_vg1 finished
Nov 28 07:43:56 np0005538513.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Nov 28 07:43:56 np0005538513.localdomain sudo[25226]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:57 np0005538513.localdomain sudo[25290]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpvrmzysdldtwufvjmcaqxkakfjuuixc ; /usr/bin/python3
Nov 28 07:43:57 np0005538513.localdomain sudo[25290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:57 np0005538513.localdomain python3[25292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:43:57 np0005538513.localdomain sudo[25290]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:57 np0005538513.localdomain sudo[25333]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykjfmkqbmsmytddhbvijvdiszenewbom ; /usr/bin/python3
Nov 28 07:43:57 np0005538513.localdomain sudo[25333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:57 np0005538513.localdomain python3[25335]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315837.220584-54792-11099961487404/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:43:57 np0005538513.localdomain sudo[25333]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:58 np0005538513.localdomain sudo[25363]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvnwsqlcpajzqjhlduxionxsvoeclnzf ; /usr/bin/python3
Nov 28 07:43:58 np0005538513.localdomain sudo[25363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:58 np0005538513.localdomain python3[25365]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:43:58 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:43:58 np0005538513.localdomain systemd-rc-local-generator[25390]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:43:58 np0005538513.localdomain systemd-sysv-generator[25395]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:43:58 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:43:58 np0005538513.localdomain systemd[1]: Starting Ceph OSD losetup...
Nov 28 07:43:58 np0005538513.localdomain bash[25406]: /dev/loop4: [64516]:8401550 (/var/lib/ceph-osd-1.img)
Nov 28 07:43:58 np0005538513.localdomain systemd[1]: Finished Ceph OSD losetup.
Nov 28 07:43:58 np0005538513.localdomain sudo[25363]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:58 np0005538513.localdomain lvm[25407]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:43:58 np0005538513.localdomain lvm[25407]: VG ceph_vg1 finished
Nov 28 07:44:07 np0005538513.localdomain sudo[25451]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jubxwtdhwegdeohfqhzeuzmlwrmveswa ; /usr/bin/python3
Nov 28 07:44:07 np0005538513.localdomain sudo[25451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:07 np0005538513.localdomain python3[25453]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:44:07 np0005538513.localdomain sudo[25451]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:08 np0005538513.localdomain sudo[25471]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txtmusrminiwvuhkcnidkzonhhgcuwgs ; /usr/bin/python3
Nov 28 07:44:08 np0005538513.localdomain sudo[25471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:08 np0005538513.localdomain python3[25473]: ansible-hostname Invoked with name=np0005538513.localdomain use=None
Nov 28 07:44:08 np0005538513.localdomain systemd[1]: Starting Hostname Service...
Nov 28 07:44:08 np0005538513.localdomain systemd[1]: Started Hostname Service.
Nov 28 07:44:08 np0005538513.localdomain sudo[25471]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:11 np0005538513.localdomain sudo[25494]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eubsdpfauhkzodihandswxmrfprgmlmy ; /usr/bin/python3
Nov 28 07:44:11 np0005538513.localdomain sudo[25494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:11 np0005538513.localdomain python3[25496]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 28 07:44:11 np0005538513.localdomain sudo[25494]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:12 np0005538513.localdomain sudo[25542]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-holgaxifapmedzxqfpwnragjharlojdz ; /usr/bin/python3
Nov 28 07:44:12 np0005538513.localdomain sudo[25542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:12 np0005538513.localdomain python3[25544]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.l5sb_fgttmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:12 np0005538513.localdomain sudo[25542]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:12 np0005538513.localdomain sudo[25572]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqktvbnemnqbavgpdpqfyndnpxvzavul ; /usr/bin/python3
Nov 28 07:44:12 np0005538513.localdomain sudo[25572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:13 np0005538513.localdomain python3[25574]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.l5sb_fgttmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:13 np0005538513.localdomain sudo[25572]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:13 np0005538513.localdomain sudo[25588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apeixupjqkmnbafxrljkmnocrqxdjdmv ; /usr/bin/python3
Nov 28 07:44:13 np0005538513.localdomain sudo[25588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:13 np0005538513.localdomain python3[25590]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.l5sb_fgttmphosts insertbefore=BOF block=192.168.122.106 np0005538513.localdomain np0005538513
                                                         192.168.122.106 np0005538513.ctlplane.localdomain np0005538513.ctlplane
                                                         192.168.122.107 np0005538514.localdomain np0005538514
                                                         192.168.122.107 np0005538514.ctlplane.localdomain np0005538514.ctlplane
                                                         192.168.122.108 np0005538515.localdomain np0005538515
                                                         192.168.122.108 np0005538515.ctlplane.localdomain np0005538515.ctlplane
                                                         192.168.122.103 np0005538510.localdomain np0005538510
                                                         192.168.122.103 np0005538510.ctlplane.localdomain np0005538510.ctlplane
                                                         192.168.122.104 np0005538511.localdomain np0005538511
                                                         192.168.122.104 np0005538511.ctlplane.localdomain np0005538511.ctlplane
                                                         192.168.122.105 np0005538512.localdomain np0005538512
                                                         192.168.122.105 np0005538512.ctlplane.localdomain np0005538512.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:13 np0005538513.localdomain sudo[25588]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:13 np0005538513.localdomain sudo[25604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pphbbhzvjzoznlsjizpaaetcobeewwdw ; /usr/bin/python3
Nov 28 07:44:13 np0005538513.localdomain sudo[25604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:14 np0005538513.localdomain python3[25606]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.l5sb_fgttmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:14 np0005538513.localdomain sudo[25604]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:14 np0005538513.localdomain sudo[25621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaaswqvjtehpiljzwvwjtnanjltektmf ; /usr/bin/python3
Nov 28 07:44:14 np0005538513.localdomain sudo[25621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:14 np0005538513.localdomain python3[25623]: ansible-file Invoked with path=/tmp/ansible.l5sb_fgttmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:14 np0005538513.localdomain sudo[25621]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:16 np0005538513.localdomain sudo[25637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eddxsgsfqwcgofhmjvugzodcwapunjqk ; /usr/bin/python3
Nov 28 07:44:16 np0005538513.localdomain sudo[25637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:16 np0005538513.localdomain python3[25639]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:16 np0005538513.localdomain sudo[25637]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:18 np0005538513.localdomain sudo[25655]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofcvalipvyavggnpgmzzaqjmohppznpl ; /usr/bin/python3
Nov 28 07:44:18 np0005538513.localdomain sudo[25655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:18 np0005538513.localdomain python3[25657]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:44:20 np0005538513.localdomain sudo[25655]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:22 np0005538513.localdomain sudo[25704]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvrcnohtotkghpklqggtprdtotzibtyc ; /usr/bin/python3
Nov 28 07:44:22 np0005538513.localdomain sudo[25704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:22 np0005538513.localdomain python3[25706]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:44:22 np0005538513.localdomain sudo[25704]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:22 np0005538513.localdomain sudo[25749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upzbrnjcmrqugsplealkbtkqdndkjtfg ; /usr/bin/python3
Nov 28 07:44:22 np0005538513.localdomain sudo[25749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:22 np0005538513.localdomain python3[25751]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315861.8813846-55736-173232186215563/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:22 np0005538513.localdomain sudo[25749]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:23 np0005538513.localdomain sudo[25779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbppuptqkpshxwkjqvmrpvonftnshjog ; /usr/bin/python3
Nov 28 07:44:23 np0005538513.localdomain sudo[25779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:24 np0005538513.localdomain python3[25781]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:44:24 np0005538513.localdomain sudo[25779]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:24 np0005538513.localdomain sudo[25797]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmokfqwfcncfmsrephrhvjaztoylenfu ; /usr/bin/python3
Nov 28 07:44:24 np0005538513.localdomain sudo[25797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:24 np0005538513.localdomain python3[25799]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:44:24 np0005538513.localdomain chronyd[766]: chronyd exiting
Nov 28 07:44:24 np0005538513.localdomain systemd[1]: Stopping NTP client/server...
Nov 28 07:44:24 np0005538513.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 07:44:24 np0005538513.localdomain systemd[1]: Stopped NTP client/server.
Nov 28 07:44:24 np0005538513.localdomain systemd[1]: chronyd.service: Consumed 119ms CPU time, read 1.9M from disk, written 0B to disk.
Nov 28 07:44:24 np0005538513.localdomain systemd[1]: Starting NTP client/server...
Nov 28 07:44:24 np0005538513.localdomain chronyd[25806]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 07:44:24 np0005538513.localdomain chronyd[25806]: Frequency -30.600 +/- 0.236 ppm read from /var/lib/chrony/drift
Nov 28 07:44:24 np0005538513.localdomain chronyd[25806]: Loaded seccomp filter (level 2)
Nov 28 07:44:24 np0005538513.localdomain systemd[1]: Started NTP client/server.
Nov 28 07:44:24 np0005538513.localdomain sudo[25797]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:26 np0005538513.localdomain sudo[25853]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jumidzojzwrkfyblqboktwhujahbbakp ; /usr/bin/python3
Nov 28 07:44:26 np0005538513.localdomain sudo[25853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:26 np0005538513.localdomain python3[25855]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:44:26 np0005538513.localdomain sudo[25853]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:26 np0005538513.localdomain sudo[25896]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmstpmftjdljhkbssiluzdoxqntkroeh ; /usr/bin/python3
Nov 28 07:44:26 np0005538513.localdomain sudo[25896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:26 np0005538513.localdomain python3[25898]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315866.2825744-55884-269850070627229/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:26 np0005538513.localdomain sudo[25896]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:27 np0005538513.localdomain sudo[25926]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmagigebdenehijducpkavzevunjubvo ; /usr/bin/python3
Nov 28 07:44:27 np0005538513.localdomain sudo[25926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:27 np0005538513.localdomain python3[25928]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:44:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:44:27 np0005538513.localdomain systemd-rc-local-generator[25953]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:44:27 np0005538513.localdomain systemd-sysv-generator[25958]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:44:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:44:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:44:27 np0005538513.localdomain systemd-rc-local-generator[25990]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:44:27 np0005538513.localdomain systemd-sysv-generator[25995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:44:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:44:28 np0005538513.localdomain systemd[1]: Starting chronyd online sources service...
Nov 28 07:44:28 np0005538513.localdomain chronyc[26004]: 200 OK
Nov 28 07:44:28 np0005538513.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Nov 28 07:44:28 np0005538513.localdomain systemd[1]: Finished chronyd online sources service.
Nov 28 07:44:28 np0005538513.localdomain sudo[25926]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:28 np0005538513.localdomain sudo[26019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwuzkesplppfxtisgiqbnjoevxlqqefg ; /usr/bin/python3
Nov 28 07:44:28 np0005538513.localdomain sudo[26019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:28 np0005538513.localdomain python3[26021]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:28 np0005538513.localdomain chronyd[25806]: System clock was stepped by 0.000000 seconds
Nov 28 07:44:28 np0005538513.localdomain sudo[26019]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:29 np0005538513.localdomain sudo[26036]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugtekbnwdklbtgzoaxarbprqocseahsi ; /usr/bin/python3
Nov 28 07:44:29 np0005538513.localdomain chronyd[25806]: Selected source 162.159.200.1 (pool.ntp.org)
Nov 28 07:44:29 np0005538513.localdomain sudo[26036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:29 np0005538513.localdomain python3[26038]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:29 np0005538513.localdomain sudo[26036]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:38 np0005538513.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 07:44:39 np0005538513.localdomain sudo[26056]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycbezonzneppxixyrkdzqiadskvuirmf ; /usr/bin/python3
Nov 28 07:44:39 np0005538513.localdomain sudo[26056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:39 np0005538513.localdomain python3[26058]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 28 07:44:39 np0005538513.localdomain systemd[1]: Starting Time & Date Service...
Nov 28 07:44:39 np0005538513.localdomain systemd[1]: Started Time & Date Service.
Nov 28 07:44:39 np0005538513.localdomain sudo[26056]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:41 np0005538513.localdomain sudo[26076]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjwrbzvbrrhzizblfpokzpddxjlstcgr ; /usr/bin/python3
Nov 28 07:44:41 np0005538513.localdomain sudo[26076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:41 np0005538513.localdomain python3[26078]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:44:41 np0005538513.localdomain chronyd[25806]: chronyd exiting
Nov 28 07:44:41 np0005538513.localdomain systemd[1]: Stopping NTP client/server...
Nov 28 07:44:41 np0005538513.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 07:44:41 np0005538513.localdomain systemd[1]: Stopped NTP client/server.
Nov 28 07:44:41 np0005538513.localdomain systemd[1]: Starting NTP client/server...
Nov 28 07:44:41 np0005538513.localdomain chronyd[26085]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 07:44:41 np0005538513.localdomain chronyd[26085]: Frequency -30.600 +/- 0.249 ppm read from /var/lib/chrony/drift
Nov 28 07:44:41 np0005538513.localdomain chronyd[26085]: Loaded seccomp filter (level 2)
Nov 28 07:44:41 np0005538513.localdomain systemd[1]: Started NTP client/server.
Nov 28 07:44:41 np0005538513.localdomain sudo[26076]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:46 np0005538513.localdomain chronyd[26085]: Selected source 174.138.193.90 (pool.ntp.org)
Nov 28 07:44:57 np0005538513.localdomain sudo[26100]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zboziitaipypdlvlmorfmxsunpeuinha ; /usr/bin/python3
Nov 28 07:44:57 np0005538513.localdomain sudo[26100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:58 np0005538513.localdomain useradd[26104]: new group: name=ceph-admin, GID=1002
Nov 28 07:44:58 np0005538513.localdomain useradd[26104]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Nov 28 07:44:58 np0005538513.localdomain sudo[26100]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:58 np0005538513.localdomain sudo[26156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpejprzbakbtkscltzckyynhuhtbetnb ; /usr/bin/python3
Nov 28 07:44:58 np0005538513.localdomain sudo[26156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:58 np0005538513.localdomain sudo[26156]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:58 np0005538513.localdomain sudo[26199]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdkelcqobgxeitezzxmsdallmzvdqarz ; /usr/bin/python3
Nov 28 07:44:58 np0005538513.localdomain sudo[26199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:59 np0005538513.localdomain sudo[26199]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:59 np0005538513.localdomain sudo[26229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pojxzswsbreznfvysllyssmnsktseybh ; /usr/bin/python3
Nov 28 07:44:59 np0005538513.localdomain sudo[26229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:59 np0005538513.localdomain sudo[26229]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:59 np0005538513.localdomain sudo[26245]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eejctlmqayfecyzbgssuhbkyflflqtrw ; /usr/bin/python3
Nov 28 07:44:59 np0005538513.localdomain sudo[26245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:59 np0005538513.localdomain sudo[26245]: pam_unix(sudo:session): session closed for user root
Nov 28 07:45:00 np0005538513.localdomain sudo[26261]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usiiroxtchxymoajfylfvclwrahsncua ; /usr/bin/python3
Nov 28 07:45:00 np0005538513.localdomain sudo[26261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:45:00 np0005538513.localdomain sudo[26261]: pam_unix(sudo:session): session closed for user root
Nov 28 07:45:00 np0005538513.localdomain sudo[26277]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnrhcywlkoelrwbfuzjlppcjvyaltxno ; /usr/bin/python3
Nov 28 07:45:00 np0005538513.localdomain sudo[26277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:45:01 np0005538513.localdomain sudo[26277]: pam_unix(sudo:session): session closed for user root
Nov 28 07:45:09 np0005538513.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 07:46:39 np0005538513.localdomain sshd[26282]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:39 np0005538513.localdomain sshd[26282]: Accepted publickey for ceph-admin from 192.168.122.103 port 55874 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:39 np0005538513.localdomain systemd-logind[764]: New session 14 of user ceph-admin.
Nov 28 07:46:39 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 1002.
Nov 28 07:46:39 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 28 07:46:40 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 28 07:46:40 np0005538513.localdomain systemd[1]: Starting User Manager for UID 1002...
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Queued start job for default target Main User Target.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Created slice User Application Slice.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Reached target Paths.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Reached target Timers.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Starting D-Bus User Message Bus Socket...
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Starting Create User's Volatile Files and Directories...
Nov 28 07:46:40 np0005538513.localdomain sshd[26299]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Listening on D-Bus User Message Bus Socket.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Reached target Sockets.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Finished Create User's Volatile Files and Directories.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Reached target Basic System.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Reached target Main User Target.
Nov 28 07:46:40 np0005538513.localdomain systemd[26286]: Startup finished in 118ms.
Nov 28 07:46:40 np0005538513.localdomain systemd[1]: Started User Manager for UID 1002.
Nov 28 07:46:40 np0005538513.localdomain systemd[1]: Started Session 14 of User ceph-admin.
Nov 28 07:46:40 np0005538513.localdomain sshd[26282]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:40 np0005538513.localdomain sshd[26299]: Accepted publickey for ceph-admin from 192.168.122.103 port 55886 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:40 np0005538513.localdomain systemd-logind[764]: New session 16 of user ceph-admin.
Nov 28 07:46:40 np0005538513.localdomain systemd[1]: Started Session 16 of User ceph-admin.
Nov 28 07:46:40 np0005538513.localdomain sshd[26299]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:40 np0005538513.localdomain sudo[26306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:46:40 np0005538513.localdomain sudo[26306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:40 np0005538513.localdomain sudo[26306]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:40 np0005538513.localdomain sshd[26321]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:40 np0005538513.localdomain sshd[26321]: Accepted publickey for ceph-admin from 192.168.122.103 port 55892 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:40 np0005538513.localdomain systemd-logind[764]: New session 17 of user ceph-admin.
Nov 28 07:46:40 np0005538513.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Nov 28 07:46:40 np0005538513.localdomain sshd[26321]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:40 np0005538513.localdomain sudo[26325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005538513.localdomain
Nov 28 07:46:40 np0005538513.localdomain sudo[26325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:40 np0005538513.localdomain sudo[26325]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:40 np0005538513.localdomain sshd[26340]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:41 np0005538513.localdomain sshd[26340]: Accepted publickey for ceph-admin from 192.168.122.103 port 55902 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:41 np0005538513.localdomain systemd-logind[764]: New session 18 of user ceph-admin.
Nov 28 07:46:41 np0005538513.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Nov 28 07:46:41 np0005538513.localdomain sshd[26340]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:41 np0005538513.localdomain sudo[26344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Nov 28 07:46:41 np0005538513.localdomain sudo[26344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:41 np0005538513.localdomain sudo[26344]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:41 np0005538513.localdomain sshd[26359]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:41 np0005538513.localdomain sshd[26359]: Accepted publickey for ceph-admin from 192.168.122.103 port 55908 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:41 np0005538513.localdomain systemd-logind[764]: New session 19 of user ceph-admin.
Nov 28 07:46:41 np0005538513.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Nov 28 07:46:41 np0005538513.localdomain sshd[26359]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:41 np0005538513.localdomain sudo[26363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:46:41 np0005538513.localdomain sudo[26363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:41 np0005538513.localdomain sudo[26363]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:41 np0005538513.localdomain sshd[26378]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:41 np0005538513.localdomain sshd[26378]: Accepted publickey for ceph-admin from 192.168.122.103 port 55912 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:41 np0005538513.localdomain systemd-logind[764]: New session 20 of user ceph-admin.
Nov 28 07:46:41 np0005538513.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Nov 28 07:46:41 np0005538513.localdomain sshd[26378]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:41 np0005538513.localdomain sudo[26382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:46:41 np0005538513.localdomain sudo[26382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:41 np0005538513.localdomain sudo[26382]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:42 np0005538513.localdomain sshd[26397]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:42 np0005538513.localdomain sshd[26397]: Accepted publickey for ceph-admin from 192.168.122.103 port 55926 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:42 np0005538513.localdomain systemd-logind[764]: New session 21 of user ceph-admin.
Nov 28 07:46:42 np0005538513.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Nov 28 07:46:42 np0005538513.localdomain sshd[26397]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:42 np0005538513.localdomain sudo[26401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Nov 28 07:46:42 np0005538513.localdomain sudo[26401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:42 np0005538513.localdomain sudo[26401]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:42 np0005538513.localdomain sshd[26416]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:42 np0005538513.localdomain sshd[26416]: Accepted publickey for ceph-admin from 192.168.122.103 port 55940 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:42 np0005538513.localdomain systemd-logind[764]: New session 22 of user ceph-admin.
Nov 28 07:46:42 np0005538513.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Nov 28 07:46:42 np0005538513.localdomain sshd[26416]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:42 np0005538513.localdomain sudo[26420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:46:42 np0005538513.localdomain sudo[26420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:42 np0005538513.localdomain sudo[26420]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:42 np0005538513.localdomain sshd[26435]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:42 np0005538513.localdomain sshd[26435]: Accepted publickey for ceph-admin from 192.168.122.103 port 55942 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:42 np0005538513.localdomain systemd-logind[764]: New session 23 of user ceph-admin.
Nov 28 07:46:42 np0005538513.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Nov 28 07:46:42 np0005538513.localdomain sshd[26435]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:43 np0005538513.localdomain sudo[26439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Nov 28 07:46:43 np0005538513.localdomain sudo[26439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:43 np0005538513.localdomain sudo[26439]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:43 np0005538513.localdomain sshd[26454]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:43 np0005538513.localdomain sshd[26454]: Accepted publickey for ceph-admin from 192.168.122.103 port 55952 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:43 np0005538513.localdomain systemd-logind[764]: New session 24 of user ceph-admin.
Nov 28 07:46:43 np0005538513.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Nov 28 07:46:43 np0005538513.localdomain sshd[26454]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:43 np0005538513.localdomain sshd[26471]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:43 np0005538513.localdomain sshd[26471]: Accepted publickey for ceph-admin from 192.168.122.103 port 41656 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:43 np0005538513.localdomain systemd-logind[764]: New session 25 of user ceph-admin.
Nov 28 07:46:43 np0005538513.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Nov 28 07:46:43 np0005538513.localdomain sshd[26471]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:44 np0005538513.localdomain sudo[26475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Nov 28 07:46:44 np0005538513.localdomain sudo[26475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:44 np0005538513.localdomain sudo[26475]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:44 np0005538513.localdomain sshd[26490]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:44 np0005538513.localdomain sshd[26490]: Accepted publickey for ceph-admin from 192.168.122.103 port 41666 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:44 np0005538513.localdomain systemd-logind[764]: New session 26 of user ceph-admin.
Nov 28 07:46:44 np0005538513.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Nov 28 07:46:44 np0005538513.localdomain sshd[26490]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:44 np0005538513.localdomain sudo[26494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005538513.localdomain
Nov 28 07:46:44 np0005538513.localdomain sudo[26494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:46:44 np0005538513.localdomain sudo[26494]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:09 np0005538513.localdomain sudo[26529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:47:09 np0005538513.localdomain sudo[26529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:09 np0005538513.localdomain sudo[26529]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:09 np0005538513.localdomain sudo[26544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:09 np0005538513.localdomain sudo[26544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:09 np0005538513.localdomain sudo[26544]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:10 np0005538513.localdomain sudo[26559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 07:47:10 np0005538513.localdomain sudo[26559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:10 np0005538513.localdomain sudo[26559]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:10 np0005538513.localdomain sudo[26595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:10 np0005538513.localdomain sudo[26595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:10 np0005538513.localdomain sudo[26595]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:10 np0005538513.localdomain sudo[26610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:47:10 np0005538513.localdomain sudo[26610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:11 np0005538513.localdomain sudo[26610]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:11 np0005538513.localdomain sudo[26663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:11 np0005538513.localdomain sudo[26663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:11 np0005538513.localdomain sudo[26663]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:11 np0005538513.localdomain sudo[26678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:47:11 np0005538513.localdomain sudo[26678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:11 np0005538513.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26705 (sysctl)
Nov 28 07:47:11 np0005538513.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 28 07:47:11 np0005538513.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 28 07:47:11 np0005538513.localdomain sudo[26678]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538513.localdomain sudo[26727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:12 np0005538513.localdomain sudo[26727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538513.localdomain sudo[26727]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538513.localdomain sudo[26742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 07:47:12 np0005538513.localdomain sudo[26742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:12 np0005538513.localdomain sudo[26742]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538513.localdomain sudo[26775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:12 np0005538513.localdomain sudo[26775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538513.localdomain sudo[26775]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538513.localdomain sudo[26790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 07:47:12 np0005538513.localdomain sudo[26790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:16 np0005538513.localdomain kernel: VFS: idmapped mount is not enabled.
Nov 28 07:47:35 np0005538513.localdomain podman[26843]: 
Nov 28 07:47:35 np0005538513.localdomain podman[26843]: 2025-11-28 07:47:35.544459656 +0000 UTC m=+22.458551621 container create 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, build-date=2025-09-24T08:57:55, version=7, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:47:35 np0005538513.localdomain systemd[1]: Created slice Slice /machine.
Nov 28 07:47:35 np0005538513.localdomain systemd[1]: Started libpod-conmon-49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060.scope.
Nov 28 07:47:35 np0005538513.localdomain podman[26843]: 2025-11-28 07:47:13.132295829 +0000 UTC m=+0.046387814 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:47:35 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:47:35 np0005538513.localdomain podman[26843]: 2025-11-28 07:47:35.650666804 +0000 UTC m=+22.564758769 container init 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 07:47:35 np0005538513.localdomain podman[26843]: 2025-11-28 07:47:35.66270331 +0000 UTC m=+22.576795275 container start 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, release=553, architecture=x86_64, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.33.12, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:47:35 np0005538513.localdomain podman[26843]: 2025-11-28 07:47:35.662954098 +0000 UTC m=+22.577046063 container attach 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:47:35 np0005538513.localdomain intelligent_proskuriakova[26983]: 167 167
Nov 28 07:47:35 np0005538513.localdomain systemd[1]: libpod-49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060.scope: Deactivated successfully.
Nov 28 07:47:35 np0005538513.localdomain podman[26843]: 2025-11-28 07:47:35.668791891 +0000 UTC m=+22.582883866 container died 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:47:35 np0005538513.localdomain podman[26988]: 2025-11-28 07:47:35.76719957 +0000 UTC m=+0.083021189 container remove 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12)
Nov 28 07:47:35 np0005538513.localdomain systemd[1]: libpod-conmon-49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060.scope: Deactivated successfully.
Nov 28 07:47:35 np0005538513.localdomain podman[27009]: 
Nov 28 07:47:35 np0005538513.localdomain podman[27009]: 2025-11-28 07:47:35.99565664 +0000 UTC m=+0.074397750 container create e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, RELEASE=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Nov 28 07:47:36 np0005538513.localdomain systemd[1]: Started libpod-conmon-e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6.scope.
Nov 28 07:47:36 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:47:36 np0005538513.localdomain podman[27009]: 2025-11-28 07:47:35.965747478 +0000 UTC m=+0.044488588 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:47:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29f259534901738a5f4b97b654de74250a449d775673a37bdc56d10d23dcba1c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:47:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29f259534901738a5f4b97b654de74250a449d775673a37bdc56d10d23dcba1c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:47:36 np0005538513.localdomain podman[27009]: 2025-11-28 07:47:36.085374039 +0000 UTC m=+0.164115159 container init e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, GIT_BRANCH=main, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, release=553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=)
Nov 28 07:47:36 np0005538513.localdomain podman[27009]: 2025-11-28 07:47:36.094157412 +0000 UTC m=+0.172898532 container start e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, version=7, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 07:47:36 np0005538513.localdomain podman[27009]: 2025-11-28 07:47:36.094492345 +0000 UTC m=+0.173233455 container attach e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph)
Nov 28 07:47:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-6422b05a9753c3f1dbb0caf61c729a218996922d76eb2d3d412dd44a585ed3b6-merged.mount: Deactivated successfully.
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]: [
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:     {
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "available": false,
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "ceph_device": false,
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "lsm_data": {},
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "lvs": [],
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "path": "/dev/sr0",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "rejected_reasons": [
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "Has a FileSystem",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "Insufficient space (<5GB)"
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         ],
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         "sys_api": {
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "actuators": null,
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "device_nodes": "sr0",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "human_readable_size": "482.00 KB",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "id_bus": "ata",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "model": "QEMU DVD-ROM",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "nr_requests": "2",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "partitions": {},
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "path": "/dev/sr0",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "removable": "1",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "rev": "2.5+",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "ro": "0",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "rotational": "1",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "sas_address": "",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "sas_device_handle": "",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "scheduler_mode": "mq-deadline",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "sectors": 0,
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "sectorsize": "2048",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "size": 493568.0,
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "support_discard": "0",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "type": "disk",
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:             "vendor": "QEMU"
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:         }
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]:     }
Nov 28 07:47:36 np0005538513.localdomain cranky_williams[27024]: ]
Nov 28 07:47:36 np0005538513.localdomain systemd[1]: libpod-e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6.scope: Deactivated successfully.
Nov 28 07:47:36 np0005538513.localdomain podman[27009]: 2025-11-28 07:47:36.889843116 +0000 UTC m=+0.968584216 container died e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 07:47:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-29f259534901738a5f4b97b654de74250a449d775673a37bdc56d10d23dcba1c-merged.mount: Deactivated successfully.
Nov 28 07:47:36 np0005538513.localdomain podman[28234]: 2025-11-28 07:47:36.989760737 +0000 UTC m=+0.084803410 container remove e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, version=7, vendor=Red Hat, Inc.)
Nov 28 07:47:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:36 np0005538513.localdomain systemd[1]: libpod-conmon-e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6.scope: Deactivated successfully.
Nov 28 07:47:37 np0005538513.localdomain sudo[26790]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:37 np0005538513.localdomain sudo[28247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:37 np0005538513.localdomain sudo[28247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:37 np0005538513.localdomain sudo[28247]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:37 np0005538513.localdomain sudo[28262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 _orch set-coredump-overrides --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 --coredump-max-size=32G
Nov 28 07:47:37 np0005538513.localdomain sudo[28262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: Closed Process Core Dump Socket.
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: Stopping Process Core Dump Socket...
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: Listening on Process Core Dump Socket.
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:47:37 np0005538513.localdomain systemd-rc-local-generator[28313]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:47:37 np0005538513.localdomain systemd-sysv-generator[28317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:47:37 np0005538513.localdomain systemd-rc-local-generator[28352]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:47:37 np0005538513.localdomain systemd-sysv-generator[28355]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:47:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:47:38 np0005538513.localdomain sudo[28262]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:01 np0005538513.localdomain sudo[28364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:01 np0005538513.localdomain sudo[28364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:01 np0005538513.localdomain sudo[28364]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:01 np0005538513.localdomain sudo[28379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:48:01 np0005538513.localdomain sudo[28379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:02 np0005538513.localdomain podman[28438]: 
Nov 28 07:48:02 np0005538513.localdomain podman[28438]: 2025-11-28 07:48:02.209461019 +0000 UTC m=+0.038100620 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:05 np0005538513.localdomain podman[28438]: 2025-11-28 07:48:05.376875744 +0000 UTC m=+3.205515285 container create f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.33.12, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph)
Nov 28 07:48:05 np0005538513.localdomain systemd[1]: Started libpod-conmon-f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569.scope.
Nov 28 07:48:05 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:05 np0005538513.localdomain podman[28438]: 2025-11-28 07:48:05.745392148 +0000 UTC m=+3.574031669 container init f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:05 np0005538513.localdomain podman[28438]: 2025-11-28 07:48:05.753685619 +0000 UTC m=+3.582325170 container start f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Nov 28 07:48:05 np0005538513.localdomain podman[28438]: 2025-11-28 07:48:05.753884254 +0000 UTC m=+3.582523805 container attach f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, ceph=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True)
Nov 28 07:48:05 np0005538513.localdomain youthful_tharp[28543]: 167 167
Nov 28 07:48:05 np0005538513.localdomain systemd[1]: libpod-f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569.scope: Deactivated successfully.
Nov 28 07:48:05 np0005538513.localdomain podman[28438]: 2025-11-28 07:48:05.756840529 +0000 UTC m=+3.585480100 container died f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main)
Nov 28 07:48:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2a8d697ae47ba7fd5fc07a53e3854d38b8540e57427d94fd8afb44ee011fa460-merged.mount: Deactivated successfully.
Nov 28 07:48:05 np0005538513.localdomain podman[28548]: 2025-11-28 07:48:05.861005845 +0000 UTC m=+0.089195998 container remove f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55)
Nov 28 07:48:05 np0005538513.localdomain systemd[1]: libpod-conmon-f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569.scope: Deactivated successfully.
Nov 28 07:48:05 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:06 np0005538513.localdomain systemd-rc-local-generator[28587]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:06 np0005538513.localdomain systemd-sysv-generator[28594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:06 np0005538513.localdomain systemd-rc-local-generator[28627]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:06 np0005538513.localdomain systemd-sysv-generator[28632]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: Reached target All Ceph clusters and services.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:06 np0005538513.localdomain systemd-rc-local-generator[28665]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:06 np0005538513.localdomain systemd-sysv-generator[28671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: Reached target Ceph cluster 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:06 np0005538513.localdomain systemd-rc-local-generator[28706]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:06 np0005538513.localdomain systemd-sysv-generator[28709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:06 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:06 np0005538513.localdomain systemd-sysv-generator[28750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:06 np0005538513.localdomain systemd-rc-local-generator[28747]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: Created slice Slice /system/ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: Reached target System Time Set.
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: Reached target System Time Synchronized.
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: Starting Ceph crash.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:07 np0005538513.localdomain podman[28808]: 
Nov 28 07:48:07 np0005538513.localdomain podman[28808]: 2025-11-28 07:48:07.518027622 +0000 UTC m=+0.075893630 container create bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph)
Nov 28 07:48:07 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86da5d0fe3bd11ba723935f9d08933124c41be8b02c1ee8a86209bcb5d7ec477/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:07 np0005538513.localdomain podman[28808]: 2025-11-28 07:48:07.486474469 +0000 UTC m=+0.044340517 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:07 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86da5d0fe3bd11ba723935f9d08933124c41be8b02c1ee8a86209bcb5d7ec477/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:07 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86da5d0fe3bd11ba723935f9d08933124c41be8b02c1ee8a86209bcb5d7ec477/merged/etc/ceph/ceph.client.crash.np0005538513.keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:07 np0005538513.localdomain podman[28808]: 2025-11-28 07:48:07.624286402 +0000 UTC m=+0.182152400 container init bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph)
Nov 28 07:48:07 np0005538513.localdomain podman[28808]: 2025-11-28 07:48:07.634977673 +0000 UTC m=+0.192843671 container start bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, name=rhceph, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 07:48:07 np0005538513.localdomain bash[28808]: bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491
Nov 28 07:48:07 np0005538513.localdomain systemd[1]: Started Ceph crash.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:07 np0005538513.localdomain sudo[28379]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.813+0000 7fe5f3232640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.813+0000 7fe5f3232640 -1 AuthRegistry(0x7fe5ec0680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.814+0000 7fe5f3232640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.814+0000 7fe5f3232640 -1 AuthRegistry(0x7fe5f3231000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.821+0000 7fe5f0fa7640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.822+0000 7fe5ebfff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.823+0000 7fe5f17a8640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.823+0000 7fe5f3232640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 28 07:48:07 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 28 07:48:16 np0005538513.localdomain sudo[28839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:16 np0005538513.localdomain sudo[28839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:16 np0005538513.localdomain sudo[28839]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:16 np0005538513.localdomain sudo[28854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Nov 28 07:48:16 np0005538513.localdomain sudo[28854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:17 np0005538513.localdomain podman[28908]: 
Nov 28 07:48:17 np0005538513.localdomain podman[28908]: 2025-11-28 07:48:17.481342182 +0000 UTC m=+0.107784940 container create bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 07:48:17 np0005538513.localdomain podman[28908]: 2025-11-28 07:48:17.417785946 +0000 UTC m=+0.044228714 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:17 np0005538513.localdomain systemd[1]: Started libpod-conmon-bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49.scope.
Nov 28 07:48:17 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:17 np0005538513.localdomain podman[28908]: 2025-11-28 07:48:17.557556178 +0000 UTC m=+0.183998946 container init bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, architecture=x86_64, vcs-type=git)
Nov 28 07:48:17 np0005538513.localdomain podman[28908]: 2025-11-28 07:48:17.569085032 +0000 UTC m=+0.195527790 container start bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 07:48:17 np0005538513.localdomain podman[28908]: 2025-11-28 07:48:17.569382299 +0000 UTC m=+0.195825057 container attach bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, release=553, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Nov 28 07:48:17 np0005538513.localdomain systemd[1]: libpod-bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49.scope: Deactivated successfully.
Nov 28 07:48:17 np0005538513.localdomain hungry_merkle[28923]: 167 167
Nov 28 07:48:17 np0005538513.localdomain podman[28908]: 2025-11-28 07:48:17.575253658 +0000 UTC m=+0.201696416 container died bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, architecture=x86_64)
Nov 28 07:48:17 np0005538513.localdomain podman[28928]: 2025-11-28 07:48:17.706828222 +0000 UTC m=+0.123324015 container remove bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:17 np0005538513.localdomain systemd[1]: libpod-conmon-bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49.scope: Deactivated successfully.
Nov 28 07:48:17 np0005538513.localdomain podman[28949]: 
Nov 28 07:48:17 np0005538513.localdomain podman[28949]: 2025-11-28 07:48:17.927381325 +0000 UTC m=+0.076431532 container create 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public)
Nov 28 07:48:17 np0005538513.localdomain systemd[1]: Started libpod-conmon-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope.
Nov 28 07:48:17 np0005538513.localdomain podman[28949]: 2025-11-28 07:48:17.896734427 +0000 UTC m=+0.045784634 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:18 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:18 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:18 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:18 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:18 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:18 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:18 np0005538513.localdomain podman[28949]: 2025-11-28 07:48:18.074886194 +0000 UTC m=+0.223936391 container init 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:18 np0005538513.localdomain podman[28949]: 2025-11-28 07:48:18.084524478 +0000 UTC m=+0.233574675 container start 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, release=553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 07:48:18 np0005538513.localdomain podman[28949]: 2025-11-28 07:48:18.084799396 +0000 UTC m=+0.233849593 container attach 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public)
Nov 28 07:48:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-73afdf7678a16c8a3a284b3ca5dd53ad89410e36908734ee3b3355b4b2b859bd-merged.mount: Deactivated successfully.
Nov 28 07:48:18 np0005538513.localdomain elated_herschel[28964]: --> passed data devices: 0 physical, 2 LVM
Nov 28 07:48:18 np0005538513.localdomain elated_herschel[28964]: --> relative data size: 1.0
Nov 28 07:48:18 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:18 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new d7af0c01-7a1e-4708-8e50-081c55d3ecd3
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:19 np0005538513.localdomain lvm[29018]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 07:48:19 np0005538513.localdomain lvm[29018]: VG ceph_vg0 finished
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]:  stderr: got monmap epoch 3
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: --> Creating keyring file for osd.2
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Nov 28 07:48:19 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid d7af0c01-7a1e-4708-8e50-081c55d3ecd3 --setuser ceph --setgroup ceph
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]:  stderr: 2025-11-28T07:48:19.783+0000 7fe0c21e3a80 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]:  stderr: 2025-11-28T07:48:19.783+0000 7fe0c21e3a80 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: --> ceph-volume lvm activate successful for osd ID: 2
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:22 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 59ca4283-ae21-42b7-993b-9e0e69e2fb94
Nov 28 07:48:23 np0005538513.localdomain lvm[29963]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:48:23 np0005538513.localdomain lvm[29963]: VG ceph_vg1 finished
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-5
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-5/activate.monmap
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]:  stderr: got monmap epoch 3
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: --> Creating keyring file for osd.5
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/keyring
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/
Nov 28 07:48:23 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 5 --monmap /var/lib/ceph/osd/ceph-5/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-5/ --osd-uuid 59ca4283-ae21-42b7-993b-9e0e69e2fb94 --setuser ceph --setgroup ceph
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]:  stderr: 2025-11-28T07:48:23.739+0000 7fc91224da80 -1 bluestore(/var/lib/ceph/osd/ceph-5//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]:  stderr: 2025-11-28T07:48:23.739+0000 7fc91224da80 -1 bluestore(/var/lib/ceph/osd/ceph-5/) _read_fsid unparsable uuid
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-5 --no-mon-config
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: --> ceph-volume lvm activate successful for osd ID: 5
Nov 28 07:48:26 np0005538513.localdomain elated_herschel[28964]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Nov 28 07:48:26 np0005538513.localdomain systemd[1]: libpod-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope: Deactivated successfully.
Nov 28 07:48:26 np0005538513.localdomain systemd[1]: libpod-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope: Consumed 3.937s CPU time.
Nov 28 07:48:26 np0005538513.localdomain podman[30878]: 2025-11-28 07:48:26.465871441 +0000 UTC m=+0.055648145 container died 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f-merged.mount: Deactivated successfully.
Nov 28 07:48:26 np0005538513.localdomain podman[30878]: 2025-11-28 07:48:26.505555341 +0000 UTC m=+0.095332015 container remove 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container)
Nov 28 07:48:26 np0005538513.localdomain systemd[1]: libpod-conmon-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope: Deactivated successfully.
Nov 28 07:48:26 np0005538513.localdomain sudo[28854]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:26 np0005538513.localdomain sudo[30892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:26 np0005538513.localdomain sudo[30892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:26 np0005538513.localdomain sudo[30892]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:26 np0005538513.localdomain sudo[30907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- lvm list --format json
Nov 28 07:48:26 np0005538513.localdomain sudo[30907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:27 np0005538513.localdomain podman[30959]: 
Nov 28 07:48:27 np0005538513.localdomain podman[30959]: 2025-11-28 07:48:27.321009271 +0000 UTC m=+0.077496500 container create bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.expose-services=)
Nov 28 07:48:27 np0005538513.localdomain systemd[1]: Started libpod-conmon-bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830.scope.
Nov 28 07:48:27 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:27 np0005538513.localdomain podman[30959]: 2025-11-28 07:48:27.383796116 +0000 UTC m=+0.140283345 container init bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 07:48:27 np0005538513.localdomain podman[30959]: 2025-11-28 07:48:27.291132542 +0000 UTC m=+0.047619771 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:27 np0005538513.localdomain podman[30959]: 2025-11-28 07:48:27.394120789 +0000 UTC m=+0.150607978 container start bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-type=git, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Nov 28 07:48:27 np0005538513.localdomain podman[30959]: 2025-11-28 07:48:27.394325384 +0000 UTC m=+0.150812623 container attach bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, release=553)
Nov 28 07:48:27 np0005538513.localdomain vigilant_feynman[30974]: 167 167
Nov 28 07:48:27 np0005538513.localdomain systemd[1]: libpod-bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830.scope: Deactivated successfully.
Nov 28 07:48:27 np0005538513.localdomain podman[30959]: 2025-11-28 07:48:27.397543346 +0000 UTC m=+0.154030615 container died bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.)
Nov 28 07:48:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f71a6cc3ce445fa13a64367be4316221fb9076df268ac8fa76587aa61b9ba1b1-merged.mount: Deactivated successfully.
Nov 28 07:48:27 np0005538513.localdomain podman[30979]: 2025-11-28 07:48:27.497150006 +0000 UTC m=+0.088474288 container remove bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.33.12, version=7, release=553)
Nov 28 07:48:27 np0005538513.localdomain systemd[1]: libpod-conmon-bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830.scope: Deactivated successfully.
Nov 28 07:48:27 np0005538513.localdomain podman[31001]: 
Nov 28 07:48:27 np0005538513.localdomain podman[31001]: 2025-11-28 07:48:27.690626314 +0000 UTC m=+0.080293542 container create 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 07:48:27 np0005538513.localdomain systemd[1]: Started libpod-conmon-14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc.scope.
Nov 28 07:48:27 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:27 np0005538513.localdomain podman[31001]: 2025-11-28 07:48:27.657266976 +0000 UTC m=+0.046934214 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:27 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:27 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:27 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:27 np0005538513.localdomain podman[31001]: 2025-11-28 07:48:27.830322012 +0000 UTC m=+0.219989230 container init 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:27 np0005538513.localdomain podman[31001]: 2025-11-28 07:48:27.841436865 +0000 UTC m=+0.231104093 container start 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Nov 28 07:48:27 np0005538513.localdomain podman[31001]: 2025-11-28 07:48:27.841784974 +0000 UTC m=+0.231452202 container attach 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]: {
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:     "2": [
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:         {
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "devices": [
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "/dev/loop3"
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             ],
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_name": "ceph_lv0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_size": "7511998464",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PBfNQH-dddB-qx5q-D9S0-20Aj-jIGQ-Q1V216,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2c5417c9-00eb-57d5-a565-ddecbc7995c1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=d7af0c01-7a1e-4708-8e50-081c55d3ecd3,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_uuid": "PBfNQH-dddB-qx5q-D9S0-20Aj-jIGQ-Q1V216",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "name": "ceph_lv0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "path": "/dev/ceph_vg0/ceph_lv0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "tags": {
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.block_uuid": "PBfNQH-dddB-qx5q-D9S0-20Aj-jIGQ-Q1V216",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.cephx_lockbox_secret": "",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.cluster_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.cluster_name": "ceph",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.crush_device_class": "",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.encrypted": "0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.osd_fsid": "d7af0c01-7a1e-4708-8e50-081c55d3ecd3",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.osd_id": "2",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.type": "block",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.vdo": "0"
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             },
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "type": "block",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "vg_name": "ceph_vg0"
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:         }
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:     ],
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:     "5": [
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:         {
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "devices": [
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "/dev/loop4"
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             ],
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_name": "ceph_lv1",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_size": "7511998464",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=L6wfVH-D1iV-h6TJ-Yiyz-6IQS-j2Vx-XDZY4Q,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2c5417c9-00eb-57d5-a565-ddecbc7995c1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=59ca4283-ae21-42b7-993b-9e0e69e2fb94,ceph.osd_id=5,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "lv_uuid": "L6wfVH-D1iV-h6TJ-Yiyz-6IQS-j2Vx-XDZY4Q",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "name": "ceph_lv1",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "path": "/dev/ceph_vg1/ceph_lv1",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "tags": {
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.block_uuid": "L6wfVH-D1iV-h6TJ-Yiyz-6IQS-j2Vx-XDZY4Q",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.cephx_lockbox_secret": "",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.cluster_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.cluster_name": "ceph",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.crush_device_class": "",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.encrypted": "0",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.osd_fsid": "59ca4283-ae21-42b7-993b-9e0e69e2fb94",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.osd_id": "5",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.type": "block",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:                 "ceph.vdo": "0"
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             },
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "type": "block",
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:             "vg_name": "ceph_vg1"
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:         }
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]:     ]
Nov 28 07:48:28 np0005538513.localdomain wizardly_blackwell[31017]: }
Nov 28 07:48:28 np0005538513.localdomain systemd[1]: libpod-14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc.scope: Deactivated successfully.
Nov 28 07:48:28 np0005538513.localdomain podman[31001]: 2025-11-28 07:48:28.188403682 +0000 UTC m=+0.578070940 container died 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc.)
Nov 28 07:48:28 np0005538513.localdomain podman[31026]: 2025-11-28 07:48:28.270769335 +0000 UTC m=+0.069241190 container remove 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main)
Nov 28 07:48:28 np0005538513.localdomain systemd[1]: libpod-conmon-14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc.scope: Deactivated successfully.
Nov 28 07:48:28 np0005538513.localdomain sudo[30907]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:28 np0005538513.localdomain sudo[31041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:28 np0005538513.localdomain sudo[31041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:28 np0005538513.localdomain sudo[31041]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:28 np0005538513.localdomain sudo[31056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:48:28 np0005538513.localdomain sudo[31056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6-merged.mount: Deactivated successfully.
Nov 28 07:48:28 np0005538513.localdomain podman[31113]: 
Nov 28 07:48:29 np0005538513.localdomain podman[31113]: 2025-11-28 07:48:29.010997485 +0000 UTC m=+0.056484726 container create c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Nov 28 07:48:29 np0005538513.localdomain podman[31113]: 2025-11-28 07:48:28.988843182 +0000 UTC m=+0.034330423 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:29 np0005538513.localdomain systemd[1]: Started libpod-conmon-c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6.scope.
Nov 28 07:48:29 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:29 np0005538513.localdomain podman[31113]: 2025-11-28 07:48:29.671637482 +0000 UTC m=+0.717124693 container init c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:29 np0005538513.localdomain podman[31113]: 2025-11-28 07:48:29.679230784 +0000 UTC m=+0.724717995 container start c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, GIT_BRANCH=main, release=553)
Nov 28 07:48:29 np0005538513.localdomain podman[31113]: 2025-11-28 07:48:29.680055426 +0000 UTC m=+0.725542657 container attach c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, version=7, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Nov 28 07:48:29 np0005538513.localdomain determined_sammet[31128]: 167 167
Nov 28 07:48:29 np0005538513.localdomain systemd[1]: libpod-c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6.scope: Deactivated successfully.
Nov 28 07:48:29 np0005538513.localdomain podman[31113]: 2025-11-28 07:48:29.681448671 +0000 UTC m=+0.726935882 container died c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b8b7eae0f1e25cf32e37a90502876f4c83c1623cbe43e94f894b54f9e908f8a0-merged.mount: Deactivated successfully.
Nov 28 07:48:29 np0005538513.localdomain podman[31133]: 2025-11-28 07:48:29.744519764 +0000 UTC m=+0.055042810 container remove c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:48:29 np0005538513.localdomain systemd[1]: libpod-conmon-c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6.scope: Deactivated successfully.
Nov 28 07:48:30 np0005538513.localdomain podman[31162]: 
Nov 28 07:48:30 np0005538513.localdomain podman[31162]: 2025-11-28 07:48:30.014864763 +0000 UTC m=+0.060567619 container create c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Nov 28 07:48:30 np0005538513.localdomain systemd[1]: Started libpod-conmon-c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9.scope.
Nov 28 07:48:30 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:30 np0005538513.localdomain podman[31162]: 2025-11-28 07:48:29.992563676 +0000 UTC m=+0.038266552 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:30 np0005538513.localdomain podman[31162]: 2025-11-28 07:48:30.108709298 +0000 UTC m=+0.154412154 container init c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_BRANCH=main, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=)
Nov 28 07:48:30 np0005538513.localdomain podman[31162]: 2025-11-28 07:48:30.117666636 +0000 UTC m=+0.163369502 container start c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git)
Nov 28 07:48:30 np0005538513.localdomain podman[31162]: 2025-11-28 07:48:30.117900792 +0000 UTC m=+0.163603668 container attach c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, release=553, description=Red Hat Ceph Storage 7)
Nov 28 07:48:30 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test[31177]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 28 07:48:30 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test[31177]:                             [--no-systemd] [--no-tmpfs]
Nov 28 07:48:30 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test[31177]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 28 07:48:30 np0005538513.localdomain systemd[1]: libpod-c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9.scope: Deactivated successfully.
Nov 28 07:48:30 np0005538513.localdomain podman[31162]: 2025-11-28 07:48:30.385156493 +0000 UTC m=+0.430859429 container died c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 07:48:30 np0005538513.localdomain podman[31182]: 2025-11-28 07:48:30.468147572 +0000 UTC m=+0.070457482 container remove c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git)
Nov 28 07:48:30 np0005538513.localdomain systemd-journald[618]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 28 07:48:30 np0005538513.localdomain systemd-journald[618]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 07:48:30 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:48:30 np0005538513.localdomain systemd[1]: libpod-conmon-c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9.scope: Deactivated successfully.
Nov 28 07:48:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744-merged.mount: Deactivated successfully.
Nov 28 07:48:30 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:30 np0005538513.localdomain systemd-rc-local-generator[31238]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:30 np0005538513.localdomain systemd-sysv-generator[31241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:31 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:31 np0005538513.localdomain systemd-rc-local-generator[31275]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:31 np0005538513.localdomain systemd-sysv-generator[31282]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:31 np0005538513.localdomain systemd[1]: Starting Ceph osd.2 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 07:48:31 np0005538513.localdomain podman[31340]: 
Nov 28 07:48:31 np0005538513.localdomain podman[31340]: 2025-11-28 07:48:31.650900596 +0000 UTC m=+0.076143186 container create efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:31 np0005538513.localdomain systemd[1]: tmp-crun.XlD2MG.mount: Deactivated successfully.
Nov 28 07:48:31 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:31 np0005538513.localdomain podman[31340]: 2025-11-28 07:48:31.619866057 +0000 UTC m=+0.045108637 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538513.localdomain podman[31340]: 2025-11-28 07:48:31.759328251 +0000 UTC m=+0.184570841 container init efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, name=rhceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Nov 28 07:48:31 np0005538513.localdomain systemd[1]: tmp-crun.5XYLON.mount: Deactivated successfully.
Nov 28 07:48:31 np0005538513.localdomain podman[31340]: 2025-11-28 07:48:31.769355776 +0000 UTC m=+0.194598326 container start efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=7, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc.)
Nov 28 07:48:31 np0005538513.localdomain podman[31340]: 2025-11-28 07:48:31.769597282 +0000 UTC m=+0.194839922 container attach efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_CLEAN=True, release=553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Nov 28 07:48:32 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 07:48:32 np0005538513.localdomain bash[31340]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 07:48:32 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:32 np0005538513.localdomain bash[31340]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:32 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:32 np0005538513.localdomain bash[31340]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:32 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:32 np0005538513.localdomain bash[31340]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:32 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:32 np0005538513.localdomain bash[31340]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:32 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 07:48:32 np0005538513.localdomain bash[31340]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 07:48:32 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: --> ceph-volume raw activate successful for osd ID: 2
Nov 28 07:48:32 np0005538513.localdomain bash[31340]: --> ceph-volume raw activate successful for osd ID: 2
Nov 28 07:48:32 np0005538513.localdomain systemd[1]: libpod-efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd.scope: Deactivated successfully.
Nov 28 07:48:32 np0005538513.localdomain podman[31340]: 2025-11-28 07:48:32.473802396 +0000 UTC m=+0.899045006 container died efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, name=rhceph, release=553, io.openshift.expose-services=)
Nov 28 07:48:32 np0005538513.localdomain podman[31479]: 2025-11-28 07:48:32.574404833 +0000 UTC m=+0.085153265 container remove efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 07:48:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874-merged.mount: Deactivated successfully.
Nov 28 07:48:32 np0005538513.localdomain podman[31539]: 
Nov 28 07:48:32 np0005538513.localdomain podman[31539]: 2025-11-28 07:48:32.951047323 +0000 UTC m=+0.117148737 container create 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64)
Nov 28 07:48:33 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:33 np0005538513.localdomain podman[31539]: 2025-11-28 07:48:32.917388428 +0000 UTC m=+0.083489912 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:33 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:33 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:33 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:33 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:33 np0005538513.localdomain podman[31539]: 2025-11-28 07:48:33.070126459 +0000 UTC m=+0.236227883 container init 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7)
Nov 28 07:48:33 np0005538513.localdomain podman[31539]: 2025-11-28 07:48:33.07840634 +0000 UTC m=+0.244507774 container start 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2, version=7, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 07:48:33 np0005538513.localdomain bash[31539]: 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9
Nov 28 07:48:33 np0005538513.localdomain systemd[1]: Started Ceph osd.2 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:33 np0005538513.localdomain sudo[31056]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: pidfile_write: ignore empty --pid-file
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) close
Nov 28 07:48:33 np0005538513.localdomain sudo[31570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:33 np0005538513.localdomain sudo[31570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:33 np0005538513.localdomain sudo[31570]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:33 np0005538513.localdomain sudo[31585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:48:33 np0005538513.localdomain sudo[31585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) close
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: load: jerasure load: lrc 
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) close
Nov 28 07:48:33 np0005538513.localdomain podman[31648]: 
Nov 28 07:48:33 np0005538513.localdomain podman[31648]: 2025-11-28 07:48:33.937098999 +0000 UTC m=+0.066684025 container create 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:33 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) close
Nov 28 07:48:33 np0005538513.localdomain systemd[1]: Started libpod-conmon-91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0.scope.
Nov 28 07:48:33 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:34 np0005538513.localdomain podman[31648]: 2025-11-28 07:48:33.907684952 +0000 UTC m=+0.037269978 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:34 np0005538513.localdomain podman[31648]: 2025-11-28 07:48:34.013653195 +0000 UTC m=+0.143238231 container init 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, distribution-scope=public, name=rhceph)
Nov 28 07:48:34 np0005538513.localdomain podman[31648]: 2025-11-28 07:48:34.022358956 +0000 UTC m=+0.151943992 container start 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 07:48:34 np0005538513.localdomain podman[31648]: 2025-11-28 07:48:34.022562431 +0000 UTC m=+0.152147467 container attach 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, io.openshift.expose-services=, ceph=True, name=rhceph, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Nov 28 07:48:34 np0005538513.localdomain relaxed_ganguly[31668]: 167 167
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: libpod-91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0.scope: Deactivated successfully.
Nov 28 07:48:34 np0005538513.localdomain podman[31648]: 2025-11-28 07:48:34.026599633 +0000 UTC m=+0.156184689 container died 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, release=553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:34 np0005538513.localdomain podman[31673]: 2025-11-28 07:48:34.127160838 +0000 UTC m=+0.090467639 container remove 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, io.buildah.version=1.33.12, version=7, distribution-scope=public, io.openshift.expose-services=, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: libpod-conmon-91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0.scope: Deactivated successfully.
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs mount
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs mount shared_bdev_used = 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Git sha 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: DB SUMMARY
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: DB Session ID:  QQ88DS7L6ZLLRUDA5NZ1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                     Options.env: 0x55ff510c6bd0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                Options.info_log: 0x55ff51dba400
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.write_buffer_manager: 0x55ff50e1c140
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Compression algorithms supported:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba7e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba7e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba7e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 525bbb97-839b-44e9-be72-9b5ac24ac615
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114267517, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114267809, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: freelist init
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: freelist _read_cfg
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs umount
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) close
Nov 28 07:48:34 np0005538513.localdomain podman[31897]: 
Nov 28 07:48:34 np0005538513.localdomain podman[31897]: 2025-11-28 07:48:34.483484343 +0000 UTC m=+0.087869984 container create 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55, ceph=True, name=rhceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs mount
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluefs mount shared_bdev_used = 4718592
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Git sha 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: DB SUMMARY
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: DB Session ID:  QQ88DS7L6ZLLRUDA5NZ0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                     Options.env: 0x55ff510c7c00
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                Options.info_log: 0x55ff51f6e820
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.write_buffer_manager: 0x55ff50e1d540
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Compression algorithms supported:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: Started libpod-conmon-3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a.scope.
Nov 28 07:48:34 np0005538513.localdomain podman[31897]: 2025-11-28 07:48:34.447608392 +0000 UTC m=+0.051994033 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dbab40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dbab40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dbab40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ff50e0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 525bbb97-839b-44e9-be72-9b5ac24ac615
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114539056, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114544544, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316114, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "525bbb97-839b-44e9-be72-9b5ac24ac615", "db_session_id": "QQ88DS7L6ZLLRUDA5NZ0", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114548180, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316114, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "525bbb97-839b-44e9-be72-9b5ac24ac615", "db_session_id": "QQ88DS7L6ZLLRUDA5NZ0", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114552059, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316114, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "525bbb97-839b-44e9-be72-9b5ac24ac615", "db_session_id": "QQ88DS7L6ZLLRUDA5NZ0", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114555822, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 28 07:48:34 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55ff51f82380
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: DB pointer 0x55ff51d11a00
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:48:34 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: _get_class not permitted to load lua
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: _get_class not permitted to load sdk
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: _get_class not permitted to load test_remote_reads
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2 0 load_pgs
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2 0 load_pgs opened 0 pgs
Nov 28 07:48:34 np0005538513.localdomain ceph-osd[31557]: osd.2 0 log_to_monitors true
Nov 28 07:48:34 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2[31553]: 2025-11-28T07:48:34.613+0000 7fd094bd2a80 -1 osd.2 0 log_to_monitors true
Nov 28 07:48:34 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538513.localdomain podman[31897]: 2025-11-28 07:48:34.636321607 +0000 UTC m=+0.240707248 container init 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Nov 28 07:48:34 np0005538513.localdomain podman[31897]: 2025-11-28 07:48:34.647532372 +0000 UTC m=+0.251917993 container start 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, version=7, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph)
Nov 28 07:48:34 np0005538513.localdomain podman[31897]: 2025-11-28 07:48:34.647696936 +0000 UTC m=+0.252082597 container attach 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, release=553, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 07:48:34 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test[32062]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 28 07:48:34 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test[32062]:                             [--no-systemd] [--no-tmpfs]
Nov 28 07:48:34 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test[32062]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: libpod-3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a.scope: Deactivated successfully.
Nov 28 07:48:34 np0005538513.localdomain podman[31897]: 2025-11-28 07:48:34.873587966 +0000 UTC m=+0.477973597 container died 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: tmp-crun.9ZWNwq.mount: Deactivated successfully.
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e1c35257d5e551a4d40add5184be9cafdc10e96519277913cde544a854cc14fa-merged.mount: Deactivated successfully.
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158-merged.mount: Deactivated successfully.
Nov 28 07:48:34 np0005538513.localdomain podman[32132]: 2025-11-28 07:48:34.978266476 +0000 UTC m=+0.094066072 container remove 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:34 np0005538513.localdomain systemd[1]: libpod-conmon-3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a.scope: Deactivated successfully.
Nov 28 07:48:35 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:35 np0005538513.localdomain systemd-sysv-generator[32193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:35 np0005538513.localdomain systemd-rc-local-generator[32190]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:35 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:35 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:48:35 np0005538513.localdomain systemd-rc-local-generator[32228]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:35 np0005538513.localdomain systemd-sysv-generator[32233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:35 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 28 07:48:35 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 28 07:48:35 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:35 np0005538513.localdomain systemd[1]: Starting Ceph osd.5 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 07:48:36 np0005538513.localdomain podman[32294]: 
Nov 28 07:48:36 np0005538513.localdomain podman[32294]: 2025-11-28 07:48:36.203881459 +0000 UTC m=+0.074460523 container create 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55)
Nov 28 07:48:36 np0005538513.localdomain systemd[1]: tmp-crun.BFDco7.mount: Deactivated successfully.
Nov 28 07:48:36 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:36 np0005538513.localdomain podman[32294]: 2025-11-28 07:48:36.171493986 +0000 UTC m=+0.042073100 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:36 np0005538513.localdomain podman[32294]: 2025-11-28 07:48:36.334562209 +0000 UTC m=+0.205141273 container init 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:36 np0005538513.localdomain podman[32294]: 2025-11-28 07:48:36.347529239 +0000 UTC m=+0.218108333 container start 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Nov 28 07:48:36 np0005538513.localdomain podman[32294]: 2025-11-28 07:48:36.347841367 +0000 UTC m=+0.218420441 container attach 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:36 np0005538513.localdomain ceph-osd[31557]: osd.2 0 done with init, starting boot process
Nov 28 07:48:36 np0005538513.localdomain ceph-osd[31557]: osd.2 0 start_boot
Nov 28 07:48:36 np0005538513.localdomain ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 28 07:48:36 np0005538513.localdomain ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 28 07:48:36 np0005538513.localdomain ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 28 07:48:36 np0005538513.localdomain ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 28 07:48:36 np0005538513.localdomain ceph-osd[31557]: osd.2 0  bench count 12288000 bsize 4 KiB
Nov 28 07:48:37 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 07:48:37 np0005538513.localdomain bash[32294]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 07:48:37 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:37 np0005538513.localdomain bash[32294]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:37 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:37 np0005538513.localdomain bash[32294]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:37 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:37 np0005538513.localdomain bash[32294]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:37 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:37 np0005538513.localdomain bash[32294]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:37 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 07:48:37 np0005538513.localdomain bash[32294]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 07:48:37 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: --> ceph-volume raw activate successful for osd ID: 5
Nov 28 07:48:37 np0005538513.localdomain bash[32294]: --> ceph-volume raw activate successful for osd ID: 5
Nov 28 07:48:37 np0005538513.localdomain systemd[1]: libpod-6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013.scope: Deactivated successfully.
Nov 28 07:48:37 np0005538513.localdomain podman[32294]: 2025-11-28 07:48:37.167772322 +0000 UTC m=+1.038351396 container died 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:48:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56-merged.mount: Deactivated successfully.
Nov 28 07:48:37 np0005538513.localdomain podman[32429]: 2025-11-28 07:48:37.292386158 +0000 UTC m=+0.110337504 container remove 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, release=553, io.openshift.expose-services=, ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Nov 28 07:48:37 np0005538513.localdomain podman[32488]: 
Nov 28 07:48:37 np0005538513.localdomain podman[32488]: 2025-11-28 07:48:37.669800569 +0000 UTC m=+0.087395282 container create 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5, ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 07:48:37 np0005538513.localdomain podman[32488]: 2025-11-28 07:48:37.620839054 +0000 UTC m=+0.038433817 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538513.localdomain podman[32488]: 2025-11-28 07:48:37.815111681 +0000 UTC m=+0.232706414 container init 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, architecture=x86_64)
Nov 28 07:48:37 np0005538513.localdomain podman[32488]: 2025-11-28 07:48:37.848016117 +0000 UTC m=+0.265610860 container start 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, release=553, distribution-scope=public, RELEASE=main, io.openshift.tags=rhceph ceph)
Nov 28 07:48:37 np0005538513.localdomain bash[32488]: 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6
Nov 28 07:48:37 np0005538513.localdomain systemd[1]: Started Ceph osd.5 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: pidfile_write: ignore empty --pid-file
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Nov 28 07:48:37 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) close
Nov 28 07:48:37 np0005538513.localdomain sudo[31585]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:38 np0005538513.localdomain sudo[32519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:38 np0005538513.localdomain sudo[32519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:38 np0005538513.localdomain sudo[32519]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:38 np0005538513.localdomain sudo[32534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- raw list --format json
Nov 28 07:48:38 np0005538513.localdomain sudo[32534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) close
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: starting osd.5 osd_data /var/lib/ceph/osd/ceph-5 /var/lib/ceph/osd/ceph-5/journal
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: load: jerasure load: lrc 
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) close
Nov 28 07:48:38 np0005538513.localdomain podman[32597]: 
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) close
Nov 28 07:48:38 np0005538513.localdomain podman[32597]: 2025-11-28 07:48:38.718857025 +0000 UTC m=+0.097902989 container create 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, version=7, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Nov 28 07:48:38 np0005538513.localdomain podman[32597]: 2025-11-28 07:48:38.666616998 +0000 UTC m=+0.045662962 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:38 np0005538513.localdomain systemd[1]: Started libpod-conmon-60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918.scope.
Nov 28 07:48:38 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:38 np0005538513.localdomain podman[32597]: 2025-11-28 07:48:38.849757902 +0000 UTC m=+0.228803846 container init 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:38 np0005538513.localdomain podman[32597]: 2025-11-28 07:48:38.861084989 +0000 UTC m=+0.240130933 container start 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, name=rhceph, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, release=553)
Nov 28 07:48:38 np0005538513.localdomain podman[32597]: 2025-11-28 07:48:38.861393087 +0000 UTC m=+0.240439101 container attach 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container)
Nov 28 07:48:38 np0005538513.localdomain gracious_gauss[32616]: 167 167
Nov 28 07:48:38 np0005538513.localdomain systemd[1]: libpod-60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918.scope: Deactivated successfully.
Nov 28 07:48:38 np0005538513.localdomain podman[32597]: 2025-11-28 07:48:38.870119779 +0000 UTC m=+0.249165783 container died 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, ceph=True, GIT_BRANCH=main, RELEASE=main, build-date=2025-09-24T08:57:55, release=553, GIT_CLEAN=True, vcs-type=git, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: osd.5:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluefs mount
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluefs mount shared_bdev_used = 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Git sha 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: DB SUMMARY
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: DB Session ID:  QIG9JK7FL3F9WYN4LANW
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                     Options.env: 0x558439404cb0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                Options.info_log: 0x55843a102740
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.write_buffer_manager: 0x55843915a140
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Compression algorithms supported:
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439148850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439148850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439148850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439148850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439148850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439148850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439148850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5584391482d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5584391482d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5584391482d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2c33d069-07f2-43c6-8b70-8b37a70b2431
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119009345, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119009673, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old nid_max 1025
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old blobid_max 10240
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta min_alloc_size 0x1000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: freelist init
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: freelist _read_cfg
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluefs umount
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) close
Nov 28 07:48:39 np0005538513.localdomain podman[32621]: 2025-11-28 07:48:39.026529603 +0000 UTC m=+0.138978302 container remove 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, vendor=Red Hat, Inc., version=7, name=rhceph, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:39 np0005538513.localdomain systemd[1]: libpod-conmon-60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918.scope: Deactivated successfully.
Nov 28 07:48:39 np0005538513.localdomain podman[32834]: 
Nov 28 07:48:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f369eaa6673c2c86c889c1ddac2e0d5d6a7af4d65af83ecf2e6fce15337e7f5e-merged.mount: Deactivated successfully.
Nov 28 07:48:39 np0005538513.localdomain podman[32834]: 2025-11-28 07:48:39.257337268 +0000 UTC m=+0.101004498 container create e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluefs mount
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluefs mount shared_bdev_used = 4718592
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Git sha 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: DB SUMMARY
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: DB Session ID:  QIG9JK7FL3F9WYN4LANX
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                     Options.env: 0x558439405ea0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                Options.info_log: 0x558439210a40
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.write_buffer_manager: 0x55843915b540
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Compression algorithms supported:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439149610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439149610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439149610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439149610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439149610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439149610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x558439149610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x558439210f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5584391482d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x558439210f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5584391482d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain podman[32834]: 2025-11-28 07:48:39.204096655 +0000 UTC m=+0.047763895 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x558439210f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5584391482d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2c33d069-07f2-43c6-8b70-8b37a70b2431
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119306088, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119314181, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316119, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2c33d069-07f2-43c6-8b70-8b37a70b2431", "db_session_id": "QIG9JK7FL3F9WYN4LANX", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119319536, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316119, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2c33d069-07f2-43c6-8b70-8b37a70b2431", "db_session_id": "QIG9JK7FL3F9WYN4LANX", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119323907, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316119, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2c33d069-07f2-43c6-8b70-8b37a70b2431", "db_session_id": "QIG9JK7FL3F9WYN4LANX", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:39 np0005538513.localdomain systemd[1]: Started libpod-conmon-e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6.scope.
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119356289, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 28 07:48:39 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:39 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:39 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:39 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:39 np0005538513.localdomain podman[32834]: 2025-11-28 07:48:39.41206646 +0000 UTC m=+0.255733700 container init e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph)
Nov 28 07:48:39 np0005538513.localdomain systemd[1]: tmp-crun.Jmx6dQ.mount: Deactivated successfully.
Nov 28 07:48:39 np0005538513.localdomain podman[32834]: 2025-11-28 07:48:39.436608773 +0000 UTC m=+0.280275973 container start e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Nov 28 07:48:39 np0005538513.localdomain podman[32834]: 2025-11-28 07:48:39.436959353 +0000 UTC m=+0.280626583 container attach e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, version=7, RELEASE=main, name=rhceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, release=553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55843a2c6380
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: DB pointer 0x55843a059a00
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super from 4, latest 4
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super done
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.3 total, 0.3 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: _get_class not permitted to load lua
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: _get_class not permitted to load sdk
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: _get_class not permitted to load test_remote_reads
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: osd.5 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: osd.5 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: osd.5 0 load_pgs
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: osd.5 0 load_pgs opened 0 pgs
Nov 28 07:48:39 np0005538513.localdomain ceph-osd[32506]: osd.5 0 log_to_monitors true
Nov 28 07:48:39 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5[32502]: 2025-11-28T07:48:39.615+0000 7f777c525a80 -1 osd.5 0 log_to_monitors true
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]: {
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:     "59ca4283-ae21-42b7-993b-9e0e69e2fb94": {
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "ceph_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "osd_id": 5,
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "osd_uuid": "59ca4283-ae21-42b7-993b-9e0e69e2fb94",
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "type": "bluestore"
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:     },
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:     "d7af0c01-7a1e-4708-8e50-081c55d3ecd3": {
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "ceph_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "osd_id": 2,
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "osd_uuid": "d7af0c01-7a1e-4708-8e50-081c55d3ecd3",
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:         "type": "bluestore"
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]:     }
Nov 28 07:48:39 np0005538513.localdomain hungry_lederberg[33031]: }
Nov 28 07:48:40 np0005538513.localdomain systemd[1]: libpod-e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6.scope: Deactivated successfully.
Nov 28 07:48:40 np0005538513.localdomain podman[32834]: 2025-11-28 07:48:40.020145202 +0000 UTC m=+0.863812422 container died e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Nov 28 07:48:40 np0005538513.localdomain podman[33100]: 2025-11-28 07:48:40.11690095 +0000 UTC m=+0.088740285 container remove e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main)
Nov 28 07:48:40 np0005538513.localdomain systemd[1]: libpod-conmon-e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6.scope: Deactivated successfully.
Nov 28 07:48:40 np0005538513.localdomain sudo[32534]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70-merged.mount: Deactivated successfully.
Nov 28 07:48:40 np0005538513.localdomain systemd[26286]: Starting Mark boot as successful...
Nov 28 07:48:40 np0005538513.localdomain systemd[26286]: Finished Mark boot as successful.
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.279 iops: 4935.372 elapsed_sec: 0.608
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [WRN] : OSD bench result of 4935.372076 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 0 waiting for initial osdmap
Nov 28 07:48:40 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2[31553]: 2025-11-28T07:48:40.255+0000 7fd091366640 -1 osd.2 0 waiting for initial osdmap
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 13 crush map has features 288514050185494528, adjusting msgr requires for clients
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 13 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 13 crush map has features 3314932999778484224, adjusting msgr requires for osds
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 13 check_osdmap_features require_osd_release unknown -> reef
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 13 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 13 set_numa_affinity not setting numa affinity
Nov 28 07:48:40 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2[31553]: 2025-11-28T07:48:40.273+0000 7fd08c17b640 -1 osd.2 13 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[31557]: osd.2 13 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Nov 28 07:48:40 np0005538513.localdomain sudo[33115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:48:40 np0005538513.localdomain sudo[33115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:40 np0005538513.localdomain sudo[33115]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 28 07:48:40 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 28 07:48:40 np0005538513.localdomain sudo[33130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:40 np0005538513.localdomain sudo[33130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:40 np0005538513.localdomain sudo[33130]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:40 np0005538513.localdomain sudo[33145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:48:40 np0005538513.localdomain sudo[33145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[32506]: osd.5 0 done with init, starting boot process
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[32506]: osd.5 0 start_boot
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[32506]: osd.5 0  bench count 12288000 bsize 4 KiB
Nov 28 07:48:41 np0005538513.localdomain ceph-osd[31557]: osd.2 14 state: booting -> active
Nov 28 07:48:41 np0005538513.localdomain systemd[1]: tmp-crun.E1rhRd.mount: Deactivated successfully.
Nov 28 07:48:41 np0005538513.localdomain podman[33228]: 2025-11-28 07:48:41.703583889 +0000 UTC m=+0.119283962 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph)
Nov 28 07:48:41 np0005538513.localdomain podman[33228]: 2025-11-28 07:48:41.837646746 +0000 UTC m=+0.253346859 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, build-date=2025-09-24T08:57:55, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, name=rhceph, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:42 np0005538513.localdomain sudo[33145]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:42 np0005538513.localdomain sudo[33296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:42 np0005538513.localdomain sudo[33296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:42 np0005538513.localdomain sudo[33296]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:42 np0005538513.localdomain sudo[33311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:48:42 np0005538513.localdomain sudo[33311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:43 np0005538513.localdomain sudo[33311]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:43 np0005538513.localdomain ceph-osd[31557]: osd.2 16 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 28 07:48:43 np0005538513.localdomain ceph-osd[31557]: osd.2 16 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Nov 28 07:48:43 np0005538513.localdomain ceph-osd[31557]: osd.2 16 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 28 07:48:43 np0005538513.localdomain sudo[33358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:43 np0005538513.localdomain sudo[33358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:43 np0005538513.localdomain sudo[33358]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:43 np0005538513.localdomain sudo[33373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 07:48:43 np0005538513.localdomain sudo[33373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:43 np0005538513.localdomain podman[33426]: 
Nov 28 07:48:43 np0005538513.localdomain podman[33426]: 2025-11-28 07:48:43.888439757 +0000 UTC m=+0.100480954 container create 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, ceph=True, release=553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:48:43 np0005538513.localdomain systemd[1]: Started libpod-conmon-718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60.scope.
Nov 28 07:48:43 np0005538513.localdomain podman[33426]: 2025-11-28 07:48:43.838476957 +0000 UTC m=+0.050518184 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:43 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:43 np0005538513.localdomain podman[33426]: 2025-11-28 07:48:43.962768605 +0000 UTC m=+0.174809812 container init 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, build-date=2025-09-24T08:57:55)
Nov 28 07:48:43 np0005538513.localdomain agitated_sutherland[33442]: 167 167
Nov 28 07:48:43 np0005538513.localdomain systemd[1]: libpod-718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60.scope: Deactivated successfully.
Nov 28 07:48:43 np0005538513.localdomain podman[33426]: 2025-11-28 07:48:43.990723416 +0000 UTC m=+0.202764613 container start 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, release=553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:48:43 np0005538513.localdomain podman[33426]: 2025-11-28 07:48:43.991129256 +0000 UTC m=+0.203170463 container attach 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, vendor=Red Hat, Inc., version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, name=rhceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:48:43 np0005538513.localdomain podman[33426]: 2025-11-28 07:48:43.996109373 +0000 UTC m=+0.208150610 container died 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:44 np0005538513.localdomain podman[33447]: 2025-11-28 07:48:44.135112145 +0000 UTC m=+0.134910069 container remove 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, name=rhceph, vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7)
Nov 28 07:48:44 np0005538513.localdomain systemd[1]: libpod-conmon-718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60.scope: Deactivated successfully.
Nov 28 07:48:44 np0005538513.localdomain podman[33468]: 
Nov 28 07:48:44 np0005538513.localdomain podman[33468]: 2025-11-28 07:48:44.367667044 +0000 UTC m=+0.089801462 container create e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, version=7, name=rhceph, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Nov 28 07:48:44 np0005538513.localdomain systemd[1]: Started libpod-conmon-e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db.scope.
Nov 28 07:48:44 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:44 np0005538513.localdomain podman[33468]: 2025-11-28 07:48:44.324187059 +0000 UTC m=+0.046321477 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:44 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:44 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:44 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:44 np0005538513.localdomain podman[33468]: 2025-11-28 07:48:44.466555167 +0000 UTC m=+0.188689575 container init e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55)
Nov 28 07:48:44 np0005538513.localdomain podman[33468]: 2025-11-28 07:48:44.478934051 +0000 UTC m=+0.201068459 container start e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, version=7, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12)
Nov 28 07:48:44 np0005538513.localdomain podman[33468]: 2025-11-28 07:48:44.47923619 +0000 UTC m=+0.201370598 container attach e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc.)
Nov 28 07:48:44 np0005538513.localdomain systemd[1]: tmp-crun.Dx4AKv.mount: Deactivated successfully.
Nov 28 07:48:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4e4fef968abce8499352901c38db19c10e77328d673d6da31eb225f2f4312b58-merged.mount: Deactivated successfully.
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.941 iops: 5104.910 elapsed_sec: 0.588
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [WRN] : OSD bench result of 5104.909958 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.5. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 0 waiting for initial osdmap
Nov 28 07:48:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5[32502]: 2025-11-28T07:48:45.016+0000 7f7778cb9640 -1 osd.5 0 waiting for initial osdmap
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 17 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 17 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 17 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 17 check_osdmap_features require_osd_release unknown -> reef
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5[32502]: 2025-11-28T07:48:45.039+0000 7f7773ace640 -1 osd.5 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 17 set_numa_affinity not setting numa affinity
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 17 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Nov 28 07:48:45 np0005538513.localdomain ceph-osd[32506]: osd.5 18 state: booting -> active
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]: [
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:     {
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "available": false,
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "ceph_device": false,
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "lsm_data": {},
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "lvs": [],
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "path": "/dev/sr0",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "rejected_reasons": [
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "Has a FileSystem",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "Insufficient space (<5GB)"
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         ],
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         "sys_api": {
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "actuators": null,
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "device_nodes": "sr0",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "human_readable_size": "482.00 KB",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "id_bus": "ata",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "model": "QEMU DVD-ROM",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "nr_requests": "2",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "partitions": {},
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "path": "/dev/sr0",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "removable": "1",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "rev": "2.5+",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "ro": "0",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "rotational": "1",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "sas_address": "",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "sas_device_handle": "",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "scheduler_mode": "mq-deadline",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "sectors": 0,
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "sectorsize": "2048",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "size": 493568.0,
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "support_discard": "0",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "type": "disk",
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:             "vendor": "QEMU"
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:         }
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]:     }
Nov 28 07:48:45 np0005538513.localdomain eloquent_dewdney[33484]: ]
Nov 28 07:48:45 np0005538513.localdomain systemd[1]: libpod-e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db.scope: Deactivated successfully.
Nov 28 07:48:45 np0005538513.localdomain podman[33468]: 2025-11-28 07:48:45.382683626 +0000 UTC m=+1.104818014 container died e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, architecture=x86_64, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12)
Nov 28 07:48:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4-merged.mount: Deactivated successfully.
Nov 28 07:48:45 np0005538513.localdomain podman[34703]: 2025-11-28 07:48:45.520372765 +0000 UTC m=+0.125188623 container remove e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Nov 28 07:48:45 np0005538513.localdomain systemd[1]: libpod-conmon-e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db.scope: Deactivated successfully.
Nov 28 07:48:45 np0005538513.localdomain sudo[33373]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:46 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 18 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=18) [1,5,3] r=1 lpr=18 pi=[15,18)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 07:48:48 np0005538513.localdomain sudo[34715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:48:48 np0005538513.localdomain sudo[34715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:48 np0005538513.localdomain sudo[34715]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:54 np0005538513.localdomain sudo[34730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:54 np0005538513.localdomain sudo[34730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:54 np0005538513.localdomain sudo[34730]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:54 np0005538513.localdomain sudo[34745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:48:54 np0005538513.localdomain sudo[34745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:55 np0005538513.localdomain podman[34829]: 2025-11-28 07:48:55.235066178 +0000 UTC m=+0.088917000 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:48:55 np0005538513.localdomain podman[34829]: 2025-11-28 07:48:55.368897149 +0000 UTC m=+0.222747971 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64)
Nov 28 07:48:55 np0005538513.localdomain sudo[34745]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:56 np0005538513.localdomain sudo[34896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:48:56 np0005538513.localdomain sudo[34896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:56 np0005538513.localdomain sudo[34896]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:37 np0005538513.localdomain sshd[34911]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:49:52 np0005538513.localdomain sshd[34911]: Connection closed by 167.94.138.43 port 19750 [preauth]
Nov 28 07:49:56 np0005538513.localdomain sudo[34913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:49:56 np0005538513.localdomain sudo[34913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:56 np0005538513.localdomain sudo[34913]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:56 np0005538513.localdomain sudo[34928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:49:56 np0005538513.localdomain sudo[34928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:57 np0005538513.localdomain podman[35011]: 2025-11-28 07:49:57.203469945 +0000 UTC m=+0.096831665 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git)
Nov 28 07:49:57 np0005538513.localdomain podman[35011]: 2025-11-28 07:49:57.310061951 +0000 UTC m=+0.203423691 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Nov 28 07:49:57 np0005538513.localdomain sudo[34928]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:57 np0005538513.localdomain sudo[35077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:49:57 np0005538513.localdomain sudo[35077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:57 np0005538513.localdomain sudo[35077]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:57 np0005538513.localdomain sudo[35092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:49:57 np0005538513.localdomain sudo[35092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:58 np0005538513.localdomain sudo[35092]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:58 np0005538513.localdomain sudo[35138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:49:58 np0005538513.localdomain sudo[35138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:58 np0005538513.localdomain sudo[35138]: pam_unix(sudo:session): session closed for user root
Nov 28 07:50:01 np0005538513.localdomain sshd[24787]: Received disconnect from 192.168.122.100 port 55686:11: disconnected by user
Nov 28 07:50:01 np0005538513.localdomain sshd[24787]: Disconnected from user zuul 192.168.122.100 port 55686
Nov 28 07:50:01 np0005538513.localdomain sshd[24784]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:50:01 np0005538513.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Nov 28 07:50:01 np0005538513.localdomain systemd[1]: session-13.scope: Consumed 20.957s CPU time.
Nov 28 07:50:01 np0005538513.localdomain systemd-logind[764]: Session 13 logged out. Waiting for processes to exit.
Nov 28 07:50:01 np0005538513.localdomain systemd-logind[764]: Removed session 13.
Nov 28 07:50:59 np0005538513.localdomain sudo[35153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:50:59 np0005538513.localdomain sudo[35153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:50:59 np0005538513.localdomain sudo[35153]: pam_unix(sudo:session): session closed for user root
Nov 28 07:50:59 np0005538513.localdomain sudo[35168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:50:59 np0005538513.localdomain sudo[35168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:50:59 np0005538513.localdomain sudo[35168]: pam_unix(sudo:session): session closed for user root
Nov 28 07:51:00 np0005538513.localdomain sudo[35216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:51:00 np0005538513.localdomain sudo[35216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:51:00 np0005538513.localdomain sudo[35216]: pam_unix(sudo:session): session closed for user root
Nov 28 07:52:00 np0005538513.localdomain sudo[35231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:52:00 np0005538513.localdomain sudo[35231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:52:00 np0005538513.localdomain sudo[35231]: pam_unix(sudo:session): session closed for user root
Nov 28 07:52:00 np0005538513.localdomain sudo[35246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:52:00 np0005538513.localdomain sudo[35246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:52:01 np0005538513.localdomain sudo[35246]: pam_unix(sudo:session): session closed for user root
Nov 28 07:52:01 np0005538513.localdomain sudo[35292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:52:01 np0005538513.localdomain sudo[35292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:52:01 np0005538513.localdomain sudo[35292]: pam_unix(sudo:session): session closed for user root
Nov 28 07:52:05 np0005538513.localdomain systemd[26286]: Created slice User Background Tasks Slice.
Nov 28 07:52:05 np0005538513.localdomain systemd[26286]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 07:52:05 np0005538513.localdomain systemd[26286]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 07:53:02 np0005538513.localdomain sudo[35309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:53:02 np0005538513.localdomain sudo[35309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:53:02 np0005538513.localdomain sudo[35309]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:02 np0005538513.localdomain sudo[35324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:53:02 np0005538513.localdomain sudo[35324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:53:02 np0005538513.localdomain sudo[35324]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:03 np0005538513.localdomain sudo[35370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:53:03 np0005538513.localdomain sudo[35370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:53:03 np0005538513.localdomain sudo[35370]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:40 np0005538513.localdomain sshd[35385]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:53:40 np0005538513.localdomain sshd[35385]: Accepted publickey for zuul from 192.168.122.100 port 53432 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:53:40 np0005538513.localdomain systemd-logind[764]: New session 27 of user zuul.
Nov 28 07:53:40 np0005538513.localdomain systemd[1]: Started Session 27 of User zuul.
Nov 28 07:53:40 np0005538513.localdomain sshd[35385]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:53:41 np0005538513.localdomain sudo[35431]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erwceduugkylwhhghlvcjpcpztlxhcpb ; /usr/bin/python3
Nov 28 07:53:41 np0005538513.localdomain sudo[35431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:41 np0005538513.localdomain python3[35433]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 28 07:53:41 np0005538513.localdomain sudo[35431]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:41 np0005538513.localdomain sudo[35476]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbkkpgfeyyymmmzmmjxsgmhuobmqwjvm ; /usr/bin/python3
Nov 28 07:53:41 np0005538513.localdomain sudo[35476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:42 np0005538513.localdomain python3[35478]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:53:42 np0005538513.localdomain sudo[35476]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:42 np0005538513.localdomain sudo[35496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udndveqcfwjrwkkezoqhwomrfvskpkiz ; /usr/bin/python3
Nov 28 07:53:42 np0005538513.localdomain sudo[35496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:42 np0005538513.localdomain python3[35498]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 07:53:42 np0005538513.localdomain useradd[35500]: new group: name=tripleo-admin, GID=1003
Nov 28 07:53:42 np0005538513.localdomain useradd[35500]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Nov 28 07:53:42 np0005538513.localdomain sudo[35496]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:43 np0005538513.localdomain sudo[35552]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbnjrujqxthjobtcmaccwjkzyploefzo ; /usr/bin/python3
Nov 28 07:53:43 np0005538513.localdomain sudo[35552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:43 np0005538513.localdomain python3[35554]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:53:43 np0005538513.localdomain sudo[35552]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:43 np0005538513.localdomain sudo[35595]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wutetnypueahbghleeosivfrqzukogzz ; /usr/bin/python3
Nov 28 07:53:43 np0005538513.localdomain sudo[35595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:43 np0005538513.localdomain python3[35597]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764316423.0929687-66619-262688719115229/source _original_basename=tmp9pewikg7 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:43 np0005538513.localdomain sudo[35595]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:44 np0005538513.localdomain sudo[35625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcwcymprgmgysxodgwtovwpkyaqezvly ; /usr/bin/python3
Nov 28 07:53:44 np0005538513.localdomain sudo[35625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:44 np0005538513.localdomain python3[35627]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:44 np0005538513.localdomain sudo[35625]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:44 np0005538513.localdomain sudo[35641]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlyoobchevbtxlspmzovvetnhesxagyj ; /usr/bin/python3
Nov 28 07:53:44 np0005538513.localdomain sudo[35641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:44 np0005538513.localdomain python3[35643]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:44 np0005538513.localdomain sudo[35641]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:44 np0005538513.localdomain sudo[35657]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyjvygzlpbfhoopabunbrlaqtscdfssq ; /usr/bin/python3
Nov 28 07:53:44 np0005538513.localdomain sudo[35657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:44 np0005538513.localdomain python3[35659]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:44 np0005538513.localdomain sudo[35657]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:45 np0005538513.localdomain sudo[35673]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcfziqsuhhdawzswuxmsmetrmpbrngju ; /usr/bin/python3
Nov 28 07:53:45 np0005538513.localdomain sudo[35673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:45 np0005538513.localdomain python3[35675]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:45 np0005538513.localdomain sudo[35673]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:46 np0005538513.localdomain python3[35689]: ansible-ping Invoked with data=pong
Nov 28 07:53:57 np0005538513.localdomain sshd[35690]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:53:57 np0005538513.localdomain sshd[35690]: Accepted publickey for tripleo-admin from 192.168.122.100 port 35712 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:53:57 np0005538513.localdomain systemd-logind[764]: New session 28 of user tripleo-admin.
Nov 28 07:53:57 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 28 07:53:57 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 28 07:53:57 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 28 07:53:57 np0005538513.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Queued start job for default target Main User Target.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Created slice User Application Slice.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Reached target Paths.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Reached target Timers.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Starting D-Bus User Message Bus Socket...
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Starting Create User's Volatile Files and Directories...
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Finished Create User's Volatile Files and Directories.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Listening on D-Bus User Message Bus Socket.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Reached target Sockets.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Reached target Basic System.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Reached target Main User Target.
Nov 28 07:53:57 np0005538513.localdomain systemd[35694]: Startup finished in 124ms.
Nov 28 07:53:57 np0005538513.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 28 07:53:57 np0005538513.localdomain systemd[1]: Started Session 28 of User tripleo-admin.
Nov 28 07:53:57 np0005538513.localdomain sshd[35690]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 07:53:58 np0005538513.localdomain sudo[35753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odyjbqrgwexiardfgjsmxlcpqjiyvoas ; /usr/bin/python3
Nov 28 07:53:58 np0005538513.localdomain sudo[35753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:53:58 np0005538513.localdomain python3[35755]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:53:58 np0005538513.localdomain sudo[35753]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:03 np0005538513.localdomain sudo[35760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:54:03 np0005538513.localdomain sudo[35760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:54:03 np0005538513.localdomain sudo[35760]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:03 np0005538513.localdomain sudo[35775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:54:03 np0005538513.localdomain sudo[35775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:54:03 np0005538513.localdomain sudo[35802]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slseccplysrciaikxxksvkdoukuqxlii ; /usr/bin/python3
Nov 28 07:54:03 np0005538513.localdomain sudo[35802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:03 np0005538513.localdomain python3[35805]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Nov 28 07:54:03 np0005538513.localdomain sudo[35802]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:04 np0005538513.localdomain sudo[35775]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:04 np0005538513.localdomain sudo[35850]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaefvzpptcoqeegflsotmsnvbciwphrc ; /usr/bin/python3
Nov 28 07:54:04 np0005538513.localdomain sudo[35850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:04 np0005538513.localdomain python3[35852]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 28 07:54:04 np0005538513.localdomain sudo[35850]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:04 np0005538513.localdomain sudo[35853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:54:04 np0005538513.localdomain sudo[35853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:54:04 np0005538513.localdomain sudo[35853]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:05 np0005538513.localdomain sudo[35913]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvxeffhgghnjybtsphytzdqvuxwptbuv ; /usr/bin/python3
Nov 28 07:54:05 np0005538513.localdomain sudo[35913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:05 np0005538513.localdomain python3[35915]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.tykmt0pztmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:05 np0005538513.localdomain sudo[35913]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:05 np0005538513.localdomain sudo[35943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqrlnhtnffxwbdhryqyguodjpftvyemb ; /usr/bin/python3
Nov 28 07:54:05 np0005538513.localdomain sudo[35943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:05 np0005538513.localdomain python3[35945]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.tykmt0pztmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:05 np0005538513.localdomain sudo[35943]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:06 np0005538513.localdomain sudo[35959]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eukpbxwpiisywihvwsqjnjiacazjehnh ; /usr/bin/python3
Nov 28 07:54:06 np0005538513.localdomain sudo[35959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:06 np0005538513.localdomain python3[35961]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.tykmt0pztmphosts insertbefore=BOF block=172.17.0.106 np0005538513.localdomain np0005538513
                                                         172.18.0.106 np0005538513.storage.localdomain np0005538513.storage
                                                         172.20.0.106 np0005538513.storagemgmt.localdomain np0005538513.storagemgmt
                                                         172.17.0.106 np0005538513.internalapi.localdomain np0005538513.internalapi
                                                         172.19.0.106 np0005538513.tenant.localdomain np0005538513.tenant
                                                         192.168.122.106 np0005538513.ctlplane.localdomain np0005538513.ctlplane
                                                         172.17.0.107 np0005538514.localdomain np0005538514
                                                         172.18.0.107 np0005538514.storage.localdomain np0005538514.storage
                                                         172.20.0.107 np0005538514.storagemgmt.localdomain np0005538514.storagemgmt
                                                         172.17.0.107 np0005538514.internalapi.localdomain np0005538514.internalapi
                                                         172.19.0.107 np0005538514.tenant.localdomain np0005538514.tenant
                                                         192.168.122.107 np0005538514.ctlplane.localdomain np0005538514.ctlplane
                                                         172.17.0.108 np0005538515.localdomain np0005538515
                                                         172.18.0.108 np0005538515.storage.localdomain np0005538515.storage
                                                         172.20.0.108 np0005538515.storagemgmt.localdomain np0005538515.storagemgmt
                                                         172.17.0.108 np0005538515.internalapi.localdomain np0005538515.internalapi
                                                         172.19.0.108 np0005538515.tenant.localdomain np0005538515.tenant
                                                         192.168.122.108 np0005538515.ctlplane.localdomain np0005538515.ctlplane
                                                         172.17.0.103 np0005538510.localdomain np0005538510
                                                         172.18.0.103 np0005538510.storage.localdomain np0005538510.storage
                                                         172.20.0.103 np0005538510.storagemgmt.localdomain np0005538510.storagemgmt
                                                         172.17.0.103 np0005538510.internalapi.localdomain np0005538510.internalapi
                                                         172.19.0.103 np0005538510.tenant.localdomain np0005538510.tenant
                                                         192.168.122.103 np0005538510.ctlplane.localdomain np0005538510.ctlplane
                                                         172.17.0.104 np0005538511.localdomain np0005538511
                                                         172.18.0.104 np0005538511.storage.localdomain np0005538511.storage
                                                         172.20.0.104 np0005538511.storagemgmt.localdomain np0005538511.storagemgmt
                                                         172.17.0.104 np0005538511.internalapi.localdomain np0005538511.internalapi
                                                         172.19.0.104 np0005538511.tenant.localdomain np0005538511.tenant
                                                         192.168.122.104 np0005538511.ctlplane.localdomain np0005538511.ctlplane
                                                         172.17.0.105 np0005538512.localdomain np0005538512
                                                         172.18.0.105 np0005538512.storage.localdomain np0005538512.storage
                                                         172.20.0.105 np0005538512.storagemgmt.localdomain np0005538512.storagemgmt
                                                         172.17.0.105 np0005538512.internalapi.localdomain np0005538512.internalapi
                                                         172.19.0.105 np0005538512.tenant.localdomain np0005538512.tenant
                                                         192.168.122.105 np0005538512.ctlplane.localdomain np0005538512.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.197  overcloud.storage.localdomain
                                                         172.20.0.177  overcloud.storagemgmt.localdomain
                                                         172.17.0.128  overcloud.internalapi.localdomain
                                                         172.21.0.169  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:06 np0005538513.localdomain sudo[35959]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:07 np0005538513.localdomain sudo[35975]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laydqdowcvrtbtajqyalqyvtdwnhqkfu ; /usr/bin/python3
Nov 28 07:54:07 np0005538513.localdomain sudo[35975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:07 np0005538513.localdomain python3[35977]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.tykmt0pztmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:54:07 np0005538513.localdomain sudo[35975]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:07 np0005538513.localdomain sudo[35992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkoegsmycqkiuzbwmrrkyqxnmveyeied ; /usr/bin/python3
Nov 28 07:54:07 np0005538513.localdomain sudo[35992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:07 np0005538513.localdomain python3[35994]: ansible-file Invoked with path=/tmp/ansible.tykmt0pztmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:07 np0005538513.localdomain sudo[35992]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:08 np0005538513.localdomain sudo[36008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emehmamhyzlegfsyuctdoolymgqagfsk ; /usr/bin/python3
Nov 28 07:54:08 np0005538513.localdomain sudo[36008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:08 np0005538513.localdomain python3[36010]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:54:08 np0005538513.localdomain sudo[36008]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:09 np0005538513.localdomain sudo[36025]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axashnyxklvwzkkftjfzorubxbblhcij ; /usr/bin/python3
Nov 28 07:54:09 np0005538513.localdomain sudo[36025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:09 np0005538513.localdomain python3[36027]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:54:12 np0005538513.localdomain sudo[36025]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:13 np0005538513.localdomain sudo[36044]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evphpgzgncflxmtvdcwndllmocbgailv ; /usr/bin/python3
Nov 28 07:54:13 np0005538513.localdomain sudo[36044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:13 np0005538513.localdomain python3[36046]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:54:13 np0005538513.localdomain sudo[36044]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:14 np0005538513.localdomain sudo[36061]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcmqfgkituaqudfisxirxldxhijinuna ; /usr/bin/python3
Nov 28 07:54:14 np0005538513.localdomain sudo[36061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:14 np0005538513.localdomain python3[36063]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:54:29 np0005538513.localdomain groupadd[36232]: group added to /etc/group: name=puppet, GID=52
Nov 28 07:54:29 np0005538513.localdomain groupadd[36232]: group added to /etc/gshadow: name=puppet
Nov 28 07:54:29 np0005538513.localdomain groupadd[36232]: new group: name=puppet, GID=52
Nov 28 07:54:29 np0005538513.localdomain useradd[36239]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Nov 28 07:55:04 np0005538513.localdomain sudo[36724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:55:04 np0005538513.localdomain sudo[36724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:55:04 np0005538513.localdomain sudo[36724]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:05 np0005538513.localdomain sudo[36739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:55:05 np0005538513.localdomain sudo[36739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:55:05 np0005538513.localdomain sudo[36739]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:06 np0005538513.localdomain sudo[36792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:55:06 np0005538513.localdomain sudo[36792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:55:06 np0005538513.localdomain sudo[36792]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  Converting 2700 SID table entries...
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:55:23 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:55:24 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:55:24 np0005538513.localdomain systemd-rc-local-generator[36968]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:55:24 np0005538513.localdomain systemd-sysv-generator[36973]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:55:24 np0005538513.localdomain systemd[1]: run-r045bd28e4bc94a408f5663ad62b4909b.service: Deactivated successfully.
Nov 28 07:55:25 np0005538513.localdomain sudo[36061]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:27 np0005538513.localdomain sudo[37408]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftnftnwtujfohsiggfjtqqcwgnfckedr ; /usr/bin/python3
Nov 28 07:55:27 np0005538513.localdomain sudo[37408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:27 np0005538513.localdomain python3[37410]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:28 np0005538513.localdomain sudo[37408]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:28 np0005538513.localdomain sudo[37547]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klpqqgemfrewqjfohjxvrfnbawboxull ; /usr/bin/python3
Nov 28 07:55:28 np0005538513.localdomain sudo[37547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:29 np0005538513.localdomain python3[37549]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:55:29 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:55:29 np0005538513.localdomain systemd-sysv-generator[37578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:55:29 np0005538513.localdomain systemd-rc-local-generator[37575]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:55:29 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:55:29 np0005538513.localdomain sudo[37547]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:30 np0005538513.localdomain sudo[37601]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyslrhexcxpngtegwnpsxsonblhodvqp ; /usr/bin/python3
Nov 28 07:55:30 np0005538513.localdomain sudo[37601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:31 np0005538513.localdomain python3[37603]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:31 np0005538513.localdomain sudo[37601]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:31 np0005538513.localdomain sudo[37617]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edfvcuzoysdidfpjjgogpciwtheqfoky ; /usr/bin/python3
Nov 28 07:55:31 np0005538513.localdomain sudo[37617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:31 np0005538513.localdomain python3[37619]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:31 np0005538513.localdomain sudo[37617]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:31 np0005538513.localdomain sudo[37634]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkuwsbdqhzuhzxkmarjzkcjfxamlsfgy ; /usr/bin/python3
Nov 28 07:55:31 np0005538513.localdomain sudo[37634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:32 np0005538513.localdomain python3[37636]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 07:55:32 np0005538513.localdomain sudo[37634]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:32 np0005538513.localdomain sudo[37652]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whwferdwvfnlkwhwakeqavszhopbuuog ; /usr/bin/python3
Nov 28 07:55:32 np0005538513.localdomain sudo[37652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:33 np0005538513.localdomain python3[37654]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:33 np0005538513.localdomain sudo[37652]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:33 np0005538513.localdomain sudo[37670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctldcykaktloiwhmafiwwswzlwyaghfk ; /usr/bin/python3
Nov 28 07:55:33 np0005538513.localdomain sudo[37670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:33 np0005538513.localdomain python3[37672]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:33 np0005538513.localdomain sudo[37670]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:33 np0005538513.localdomain sudo[37688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwcqzutynhezbqympyhdvijdptvawuci ; /usr/bin/python3
Nov 28 07:55:33 np0005538513.localdomain sudo[37688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:34 np0005538513.localdomain python3[37690]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:55:34 np0005538513.localdomain systemd[1]: Reloading Network Manager...
Nov 28 07:55:34 np0005538513.localdomain NetworkManager[5967]: <info>  [1764316534.3109] audit: op="reload" arg="0" pid=37693 uid=0 result="success"
Nov 28 07:55:34 np0005538513.localdomain NetworkManager[5967]: <info>  [1764316534.3117] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Nov 28 07:55:34 np0005538513.localdomain NetworkManager[5967]: <info>  [1764316534.3117] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Nov 28 07:55:34 np0005538513.localdomain systemd[1]: Reloaded Network Manager.
Nov 28 07:55:34 np0005538513.localdomain sudo[37688]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:35 np0005538513.localdomain sudo[37707]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcbebtuxqkwgcxfapslvtgydvdwyrgeb ; /usr/bin/python3
Nov 28 07:55:35 np0005538513.localdomain sudo[37707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:35 np0005538513.localdomain python3[37709]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:35 np0005538513.localdomain sudo[37707]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:36 np0005538513.localdomain sudo[37724]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atfcmxrjsclepojnkzyvyvkrgkauvhys ; /usr/bin/python3
Nov 28 07:55:36 np0005538513.localdomain sudo[37724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:36 np0005538513.localdomain python3[37726]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:36 np0005538513.localdomain sudo[37724]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:36 np0005538513.localdomain sudo[37742]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gokqogfkyabjfbkcxiindpyxjbsoqgks ; /usr/bin/python3
Nov 28 07:55:36 np0005538513.localdomain sudo[37742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:36 np0005538513.localdomain python3[37744]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:36 np0005538513.localdomain sudo[37742]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:36 np0005538513.localdomain sudo[37758]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqcanzqttbwjvorjevdbbtapdhgxtyub ; /usr/bin/python3
Nov 28 07:55:36 np0005538513.localdomain sudo[37758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:37 np0005538513.localdomain python3[37760]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:37 np0005538513.localdomain sudo[37758]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:37 np0005538513.localdomain sudo[37774]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndkojdilvvjvqhpklizrlkrzxvlohjqt ; /usr/bin/python3
Nov 28 07:55:37 np0005538513.localdomain sudo[37774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:37 np0005538513.localdomain python3[37776]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 28 07:55:37 np0005538513.localdomain sudo[37774]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:37 np0005538513.localdomain sudo[37790]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezhfrktulvjfdakjfzvzppkhaspphdfn ; /usr/bin/python3
Nov 28 07:55:38 np0005538513.localdomain sudo[37790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:38 np0005538513.localdomain python3[37792]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:38 np0005538513.localdomain sudo[37790]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:38 np0005538513.localdomain sudo[37806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezbdqqomzsbelzdqidzxbfeztuhibxxw ; /usr/bin/python3
Nov 28 07:55:38 np0005538513.localdomain sudo[37806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:39 np0005538513.localdomain python3[37808]: ansible-blockinfile Invoked with path=/tmp/ansible.ah5juyos block=[192.168.122.106]*,[np0005538513.ctlplane.localdomain]*,[172.17.0.106]*,[np0005538513.internalapi.localdomain]*,[172.18.0.106]*,[np0005538513.storage.localdomain]*,[172.20.0.106]*,[np0005538513.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005538513.tenant.localdomain]*,[np0005538513.localdomain]*,[np0005538513]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCToHi/c1OL/UxMWy2v/t0tcvSlMeoKa6EPBYbcu51p2Gn2UxEPgCRLM9+84Smh2pxAR4Y/5LVm2lbZ9Gf4okHGg5GLIyqzxxqbQHyR+YRljujVEOvksUPuKCptzx9fQj2Ij2t9GPGHc5klgGPIKjx0pza8T37vdz+G9y7zuK5wWI66AeN8y/6dD2hvi1Lp94VRSvTTEo+nUOFSIgsOwqQO+ZSwTgjG1pmtESBe8nkhW0I0BQPX46v9f1PN1LXDg8cN2FSVjQ91RI0uCvTaBYJ3soFBFspgiJ113zapbQCaNwg7lK7ofS0QT5WONP3QIsDAq1gSpWuOdS2DRY4NU3WMd4m5tLbj+ubiWr39rNU/zQiEl8r38aiM0OwOfuQ9S8wxO7phpVCQrbOkYCLLijdy/xTODvP+jYohTMWX8Gh6IVeVtm6SB2Tw3lDBCjpqlclCSs905Xe+mTJ6WYTaz+Q1xgflKEeemzJ0+rt+QZbrmL7u5MUdf/l/yOLAgACNsws=
                                                         [192.168.122.107]*,[np0005538514.ctlplane.localdomain]*,[172.17.0.107]*,[np0005538514.internalapi.localdomain]*,[172.18.0.107]*,[np0005538514.storage.localdomain]*,[172.20.0.107]*,[np0005538514.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005538514.tenant.localdomain]*,[np0005538514.localdomain]*,[np0005538514]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLIqwhlOevSQuHXF0nrkLOzRoSQqnWWb/cXzK4um93clqGujVOE9PUyL6ONBo/qlr4Pp+QMzSsIFwjW1T/6G+Ce2CS/TGphIUxvvB9NhBt+OJl/zDUEmjAU6bwVIx6ApqtimsXWWIap9GEtVWA5P9pcqPMyGzq1mCzwCS252Ylioij0zZxfMrxTt3RSsWrDED61vRes0ZKd8HERTLN+Lzis5t8f74zfwTesOea6CRkIHth4cUP7ua3q+KhhbhIPj+fXWN5w+qVbcTMJSYyUPsZ2ymPhR4x4db1oPk1Jg14dw1BnmAZZl3v8o4l7bUQ2Fj/PE1JbSiApxbK+V0KdZGMrG4iVbnMmzwBXPXHa6lNQGneflVd3MNEepnTnXx4hAVpaJHc8EtIREq8aPe07DW0wL9clpTKaSGU2Ma+BLXmSDPkuPh6JWLxn3iM1yybL574NnGt2MgBj6z2tiSb4NkNmaBkoG8PMHw8YUSabKBBZNiMEO2GKBpHZldSrYvOZHU=
                                                         [192.168.122.108]*,[np0005538515.ctlplane.localdomain]*,[172.17.0.108]*,[np0005538515.internalapi.localdomain]*,[172.18.0.108]*,[np0005538515.storage.localdomain]*,[172.20.0.108]*,[np0005538515.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005538515.tenant.localdomain]*,[np0005538515.localdomain]*,[np0005538515]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbwc/gBZF5hmsFU6BSK1/DT5hduj2+3ukzoCGLU6mgpBv7BiInN7vVOqXilL+QUAWOvfKTekaQe1Vv/2jpygQnlu6MEMopmac/36IfVjgt39zxCULfSWv3Gp8tLP0ATF2LfhHBWFrGX7G3Bg3AiNfIUnQIQadBaKIByl+FfA7nJ7phwBAwJaQxvByGDeMwC2CWIUPgVqKclcw1WmldPnNmwquLlCbAeMV2hHlBfnVk8BI6fsOUcBB6a05zRpJpbrl584F+qkiQX0RpZYJQdZCoLiJStJv39lYhgiAWChUOVJsCbeNQnC9/Xgs5JhmRESgXh7Tm+8UNW9DxSHN7BS5qKYPUULdjobSp2v9pFOx30MLMsNd5r3JE07pgm5PpjuviSGEvJ8DIAPTF3kUXM43wax1q9rGV4ZfoJiLAwS9CmWWDWZDg17cnC5z+3qi+K8HUKz8LxQCHI+yEtTFzUEYyXTQfQbNvkauEHI/PwFA1iC+4/2g/0UhtjkM+FO2Czwk=
                                                         [192.168.122.103]*,[np0005538510.ctlplane.localdomain]*,[172.17.0.103]*,[np0005538510.internalapi.localdomain]*,[172.18.0.103]*,[np0005538510.storage.localdomain]*,[172.20.0.103]*,[np0005538510.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005538510.tenant.localdomain]*,[np0005538510.localdomain]*,[np0005538510]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAxqgPHnyGChl6yd1/HRo8ox+w8llSVhIj8iYUdDG7IquyLr4/CZguZzRkngbXi/Dq544iKS4kFL/zPKi+yuxeFs4b6fgo4vGoV8wwKNSJXx0d0hOQa9651VqB6k/trENRTgLa2fHkXgF+/g0f7HvloQfhr7qjhTBRV4l4UfJiOEpMvMxN6map/D0JuHlAZGZ5mGUoBTEMuPGEPvMWqe0kc/I8WIgsMsvijOGM2xDxsOqAYlV9a8faoyMdacWUNkeQTfPF6h+z8xdvP8qWPtrPKWHMpcGicTI6pFZ2JxOjWnMaBXs2j/CN7HFLbyOCwuAvAu9efAbxJvgtZlO++6kSlq7SHMzwv7PLP69GaQJHR+jANJ/O2BchbxL09mIkpFSzLSS0k7xXJlwqnAMciIlTaud2n5Hqnnb06WgtvD6O0nnuCLH5am7F1YDGJJgUmNbbgF1PuwzOZqQy+tA2igji/n2z87KkGZdIbrHdPU1PPIlzVGPO6aO02RhvtD+/iQM=
                                                         [192.168.122.104]*,[np0005538511.ctlplane.localdomain]*,[172.17.0.104]*,[np0005538511.internalapi.localdomain]*,[172.18.0.104]*,[np0005538511.storage.localdomain]*,[172.20.0.104]*,[np0005538511.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005538511.tenant.localdomain]*,[np0005538511.localdomain]*,[np0005538511]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDqtmgm0KAOOIJ7a8whlZPfasnwJpfcm6zVmQjiKHZZrcojE/a6oALfufKXbfWWiLjJ2VzyK9v7QPNXhIWxgAKT9J40A1lSpSAmmxMaWvy+hzzvePs0Z4Fc4bFX7V4zBGI+dAJ+eAu73z6OKNuMhxBrL46ejpRFbqjwBP3veWRiLOMbyPn+Wc+amop0p1eEzV2QHMAIC5Dwm6/tYNLixNSa/Ea0ciaY3jWii+IGhYy+wqQP+9qkoVf9bZ4Ewa+7UfXI/q4zvvic/Znb8ZpCpezLnH4ilBORLyV9r/wkkkVGY7UVgUdSoLVjzTGQAtHl2ZgA3zJ2F2ES9QcBEvrHygT4vGgtEaxQn8XFhBwhzCpPaLyXti/6d+8M36cJx+7gv1eEfDgLz3MNR+tcnFSew9N6dIN4afV0DvA/9FsWk8PTqddN4iHcZzRo0GiDJWNtB+gYVZOytTYMZm2Cyv59IthEzxaB+wTZoSdCeuEeTM0ohYspOKirIPqMPuCbGbtrJFE=
                                                         [192.168.122.105]*,[np0005538512.ctlplane.localdomain]*,[172.17.0.105]*,[np0005538512.internalapi.localdomain]*,[172.18.0.105]*,[np0005538512.storage.localdomain]*,[172.20.0.105]*,[np0005538512.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005538512.tenant.localdomain]*,[np0005538512.localdomain]*,[np0005538512]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCy9/gxqH+eMqafXwUuPf+1Clpw4qsugdFefisnCDhJ5U7Pc+eWMUQVMS0ErxabBJhneDOyPwXwIbv72cEAtmgfvHDlSuS3mt8LRzKqsv1dXTy4Zqb3JGVzrvxo0iczGRsn2MIDJUv/Zjq9YqVeCnDj2HOwV+qx+EFecEFXS797FxsnMmTw0A5z8yUtBuJEGAKQX96LpZc4k5ltq+Uy0rK85Kk7cGR4A+wrIChLC8wggxvA99NdPEBtne6Chb+3PcbYUcTGhGtV6FGzpgbWmuWT/gcANb+fJE5/4n87loLmBMsmvGhvQuN9kuJ20g6nwPJbPTpIbV6XALx4tbma68bL3RL+lcGlh3jf0pEXPfolrB/MRmJn5ggMLjRv50FrowQalnCEgWE0gtd9IGjmqFz3jP008bGotn9rcacbjC2AvE+5NEjp7TzXGnFcD6jW8+9AWiusCww4ULs/oWbi0GLkmhwU5EifitDYF2+r1CigAdlEjb6sa0wAQSmclWk6guM=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:39 np0005538513.localdomain sudo[37806]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:39 np0005538513.localdomain sudo[37822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xubavhhcvgfzetwalwjejjzrcxctzasq ; /usr/bin/python3
Nov 28 07:55:39 np0005538513.localdomain sudo[37822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:39 np0005538513.localdomain python3[37824]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ah5juyos' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:39 np0005538513.localdomain sudo[37822]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:39 np0005538513.localdomain sudo[37840]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rewzuqypxjpqhzkpagdffsutphnunrnk ; /usr/bin/python3
Nov 28 07:55:39 np0005538513.localdomain sudo[37840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:39 np0005538513.localdomain python3[37842]: ansible-file Invoked with path=/tmp/ansible.ah5juyos state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:39 np0005538513.localdomain sudo[37840]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:40 np0005538513.localdomain sudo[37856]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wedwuurwxgagbqlrrhfjcgpevxxkoiqy ; /usr/bin/python3
Nov 28 07:55:40 np0005538513.localdomain sudo[37856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:40 np0005538513.localdomain python3[37858]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:55:40 np0005538513.localdomain sudo[37856]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:41 np0005538513.localdomain sudo[37872]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygdwxsnssqfbgfsmvijlzojcxxbbxakb ; /usr/bin/python3
Nov 28 07:55:41 np0005538513.localdomain sudo[37872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:41 np0005538513.localdomain python3[37874]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:41 np0005538513.localdomain sudo[37872]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:41 np0005538513.localdomain sudo[37890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajveqahaibhudnngkpejdqlouwymqntb ; /usr/bin/python3
Nov 28 07:55:41 np0005538513.localdomain sudo[37890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:41 np0005538513.localdomain python3[37892]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:41 np0005538513.localdomain sudo[37890]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:41 np0005538513.localdomain sudo[37909]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuupoqgejiublzteuwbddjdsslgjwvei ; /usr/bin/python3
Nov 28 07:55:41 np0005538513.localdomain sudo[37909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:41 np0005538513.localdomain python3[37911]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Nov 28 07:55:41 np0005538513.localdomain sudo[37909]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:42 np0005538513.localdomain sudo[37925]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nugedqkqypbxztuxgecpexfqjogxjvkb ; /usr/bin/python3
Nov 28 07:55:42 np0005538513.localdomain sudo[37925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:42 np0005538513.localdomain sudo[37925]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:42 np0005538513.localdomain sudo[37973]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcwfgtdwmvtshiwgqusjvjibpmgwhcba ; /usr/bin/python3
Nov 28 07:55:42 np0005538513.localdomain sudo[37973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:42 np0005538513.localdomain sudo[37973]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:43 np0005538513.localdomain sudo[38016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnqufkkifmfopthzbxzsaczccqhrfted ; /usr/bin/python3
Nov 28 07:55:43 np0005538513.localdomain sudo[38016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:43 np0005538513.localdomain sudo[38016]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:44 np0005538513.localdomain sudo[38046]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaiwdrxjptsouoinnfkxsrzubiexcukb ; /usr/bin/python3
Nov 28 07:55:44 np0005538513.localdomain sudo[38046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:44 np0005538513.localdomain python3[38048]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:44 np0005538513.localdomain sudo[38046]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:44 np0005538513.localdomain sudo[38063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzwgpfxufkvgjnepjhtwokieivyvwqyi ; /usr/bin/python3
Nov 28 07:55:44 np0005538513.localdomain sudo[38063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:44 np0005538513.localdomain python3[38065]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:55:48 np0005538513.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:55:48 np0005538513.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:55:48 np0005538513.localdomain systemd-rc-local-generator[38134]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:55:48 np0005538513.localdomain systemd-sysv-generator[38139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: tuned.service: Consumed 1.774s CPU time.
Nov 28 07:55:48 np0005538513.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 07:55:49 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:55:49 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:55:49 np0005538513.localdomain systemd[1]: run-r4394cd071c81425da221e64f64dcefd1.service: Deactivated successfully.
Nov 28 07:55:50 np0005538513.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 07:55:50 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:55:50 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:55:50 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:55:50 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:55:50 np0005538513.localdomain systemd[1]: run-rcb4f556933214083b12f71af96247f7e.service: Deactivated successfully.
Nov 28 07:55:51 np0005538513.localdomain sudo[38063]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:51 np0005538513.localdomain sudo[38499]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewdyvqwjkqioqdinmuufoubcqmmjbkgm ; /usr/bin/python3
Nov 28 07:55:51 np0005538513.localdomain sudo[38499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:51 np0005538513.localdomain python3[38501]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:55:51 np0005538513.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 07:55:51 np0005538513.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 28 07:55:51 np0005538513.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 07:55:51 np0005538513.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 07:55:53 np0005538513.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 07:55:53 np0005538513.localdomain sudo[38499]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:53 np0005538513.localdomain sudo[38694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlweebdpuqdfybffoprmbtxkmncyhlop ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 28 07:55:53 np0005538513.localdomain sudo[38694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:53 np0005538513.localdomain python3[38696]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:53 np0005538513.localdomain sudo[38694]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:53 np0005538513.localdomain sudo[38711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydfthhxncfxfvgfqoaagldicwduoqlnm ; /usr/bin/python3
Nov 28 07:55:53 np0005538513.localdomain sudo[38711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:54 np0005538513.localdomain python3[38713]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 28 07:55:54 np0005538513.localdomain sudo[38711]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:54 np0005538513.localdomain sudo[38727]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrinucuiybfoltifasdayqyumjwrvoaw ; /usr/bin/python3
Nov 28 07:55:54 np0005538513.localdomain sudo[38727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:54 np0005538513.localdomain python3[38729]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:54 np0005538513.localdomain sudo[38727]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:54 np0005538513.localdomain sudo[38743]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvrmjwojazgqjggtvrwebjvmnvkhzfts ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 28 07:55:54 np0005538513.localdomain sudo[38743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:55 np0005538513.localdomain python3[38745]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:56 np0005538513.localdomain sudo[38743]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:56 np0005538513.localdomain sudo[38763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxtejtvfpkezrkgjhpwdqulqrzgvoxfq ; /usr/bin/python3
Nov 28 07:55:56 np0005538513.localdomain sudo[38763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:56 np0005538513.localdomain python3[38765]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:56 np0005538513.localdomain sudo[38763]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:57 np0005538513.localdomain sudo[38780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goxkvktrrvphkxxfbmlohbidirbqzgoi ; /usr/bin/python3
Nov 28 07:55:57 np0005538513.localdomain sudo[38780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:57 np0005538513.localdomain python3[38782]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:57 np0005538513.localdomain sudo[38780]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:59 np0005538513.localdomain sudo[38796]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qafwitzxuqrcedldhpkwlnxcbdkqyjbg ; /usr/bin/python3
Nov 28 07:55:59 np0005538513.localdomain sudo[38796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:59 np0005538513.localdomain python3[38798]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:59 np0005538513.localdomain sudo[38796]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:04 np0005538513.localdomain sudo[38812]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhkvdjtgcmrxxdbggdpwgwwzkmxnmvbw ; /usr/bin/python3
Nov 28 07:56:04 np0005538513.localdomain sudo[38812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:05 np0005538513.localdomain python3[38814]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:05 np0005538513.localdomain sudo[38812]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:05 np0005538513.localdomain sudo[38860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yisdkxrytumdfqxxrltkhesfcmnbnavb ; /usr/bin/python3
Nov 28 07:56:05 np0005538513.localdomain sudo[38860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:05 np0005538513.localdomain python3[38862]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:05 np0005538513.localdomain sudo[38860]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:05 np0005538513.localdomain systemd[35694]: Starting Mark boot as successful...
Nov 28 07:56:05 np0005538513.localdomain systemd[35694]: Finished Mark boot as successful.
Nov 28 07:56:05 np0005538513.localdomain sudo[38906]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyhxylyiymdwehxccyflcebcrrrqlnwl ; /usr/bin/python3
Nov 28 07:56:05 np0005538513.localdomain sudo[38906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:06 np0005538513.localdomain python3[38908]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316565.340449-71264-134970873186692/source _original_basename=tmpl19i9un2 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:06 np0005538513.localdomain sudo[38906]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:06 np0005538513.localdomain sudo[38936]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjhbrkuyidjrnaihualtvwgwdfilbotq ; /usr/bin/python3
Nov 28 07:56:06 np0005538513.localdomain sudo[38936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:06 np0005538513.localdomain sudo[38937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:56:06 np0005538513.localdomain sudo[38937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:56:06 np0005538513.localdomain sudo[38937]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:06 np0005538513.localdomain sudo[38954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:56:06 np0005538513.localdomain sudo[38954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:56:06 np0005538513.localdomain python3[38951]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:06 np0005538513.localdomain sudo[38936]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538513.localdomain sudo[39045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsonddcsmkbxgwmynmiunpgqphvbvgnn ; /usr/bin/python3
Nov 28 07:56:07 np0005538513.localdomain sudo[39045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:07 np0005538513.localdomain sudo[38954]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538513.localdomain python3[39047]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:07 np0005538513.localdomain sudo[39045]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538513.localdomain sudo[39088]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upikwismarwdnjzxqidtqwlohlguyara ; /usr/bin/python3
Nov 28 07:56:07 np0005538513.localdomain sudo[39088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:07 np0005538513.localdomain sudo[39091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:56:07 np0005538513.localdomain sudo[39091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:56:07 np0005538513.localdomain sudo[39091]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538513.localdomain python3[39090]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316566.9805455-71361-33364981101521/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=f62dcfb681d1b393d0933e3027f5bdff5685b671 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:07 np0005538513.localdomain sudo[39088]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:08 np0005538513.localdomain sudo[39165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewyiejoondwnsumaihdkrbuvihpwkrtd ; /usr/bin/python3
Nov 28 07:56:08 np0005538513.localdomain sudo[39165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:08 np0005538513.localdomain python3[39167]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:08 np0005538513.localdomain sudo[39165]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:08 np0005538513.localdomain sudo[39208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdgwspedgcmolcvidrkqzjblqwvuxuip ; /usr/bin/python3
Nov 28 07:56:08 np0005538513.localdomain sudo[39208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:08 np0005538513.localdomain python3[39210]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316567.9143033-71496-85961111386279/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=526fa277b7a2f2320a39d589994ce8c8af83f91d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:08 np0005538513.localdomain sudo[39208]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:09 np0005538513.localdomain sudo[39270]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhllbhkqocdkjtbzbwvraelrnlijqyrc ; /usr/bin/python3
Nov 28 07:56:09 np0005538513.localdomain sudo[39270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:09 np0005538513.localdomain python3[39272]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:09 np0005538513.localdomain sudo[39270]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:09 np0005538513.localdomain sudo[39313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdyprshaxssuqhdwoglasxwbwueepttj ; /usr/bin/python3
Nov 28 07:56:09 np0005538513.localdomain sudo[39313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:09 np0005538513.localdomain python3[39315]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316568.853994-71496-238988601288442/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=a223df0bad6272fbaedbfa3b3952717db2fe2201 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:09 np0005538513.localdomain sudo[39313]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:10 np0005538513.localdomain sudo[39375]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viyqjksnaakacwtykziagpthaqxrvxpu ; /usr/bin/python3
Nov 28 07:56:10 np0005538513.localdomain sudo[39375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:10 np0005538513.localdomain python3[39377]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:10 np0005538513.localdomain sudo[39375]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:10 np0005538513.localdomain sudo[39418]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkehbkpjajulttpqaklqsgkfqjveuaht ; /usr/bin/python3
Nov 28 07:56:10 np0005538513.localdomain sudo[39418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:10 np0005538513.localdomain python3[39420]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316569.7963903-71496-114579238824483/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=68b5a56a66cb10764ef3288009ad5e9b7e8faf12 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:10 np0005538513.localdomain sudo[39418]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:10 np0005538513.localdomain sudo[39480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbledlohdyjvabtdjatuajexsbuwpfxa ; /usr/bin/python3
Nov 28 07:56:10 np0005538513.localdomain sudo[39480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:11 np0005538513.localdomain python3[39482]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:11 np0005538513.localdomain sudo[39480]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:11 np0005538513.localdomain sudo[39523]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhkcclqnpgcoqcefibjohgxpfkinyodz ; /usr/bin/python3
Nov 28 07:56:11 np0005538513.localdomain sudo[39523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:11 np0005538513.localdomain python3[39525]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316570.8300068-71496-25857694174394/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:11 np0005538513.localdomain sudo[39523]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:11 np0005538513.localdomain sudo[39585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjeodyyljyiuceouuonyyjuuxtbyihlq ; /usr/bin/python3
Nov 28 07:56:11 np0005538513.localdomain sudo[39585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:12 np0005538513.localdomain python3[39587]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:12 np0005538513.localdomain sudo[39585]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:12 np0005538513.localdomain sudo[39628]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvneuuoxqlzwkgyvatcwhuaxiffsrrki ; /usr/bin/python3
Nov 28 07:56:12 np0005538513.localdomain sudo[39628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:12 np0005538513.localdomain python3[39630]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316571.6708229-71496-225660578965828/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=8507472d542de0e0675ce4c861ee207d860b9ae3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:12 np0005538513.localdomain sudo[39628]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:12 np0005538513.localdomain sudo[39690]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqgogvueqjtpinpzbywbiahcylcoofxg ; /usr/bin/python3
Nov 28 07:56:12 np0005538513.localdomain sudo[39690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:12 np0005538513.localdomain python3[39692]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:12 np0005538513.localdomain sudo[39690]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:13 np0005538513.localdomain sudo[39733]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zowuivwqelpkxbwjondmpoxenccmpfsc ; /usr/bin/python3
Nov 28 07:56:13 np0005538513.localdomain sudo[39733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:13 np0005538513.localdomain python3[39735]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316572.5381417-71496-181357957177740/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:13 np0005538513.localdomain sudo[39733]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:13 np0005538513.localdomain sudo[39795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilinfbsbgqefvcnhrilvoevuwylpucrl ; /usr/bin/python3
Nov 28 07:56:13 np0005538513.localdomain sudo[39795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:13 np0005538513.localdomain python3[39797]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:13 np0005538513.localdomain sudo[39795]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:13 np0005538513.localdomain sudo[39838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roqcytpzzyykqbzwjafhpffoigswnniy ; /usr/bin/python3
Nov 28 07:56:13 np0005538513.localdomain sudo[39838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:14 np0005538513.localdomain python3[39840]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316573.3885307-71496-39635942012134/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=8f5fcf4d1773fc71cd0863786080c50634c31bf2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:14 np0005538513.localdomain sudo[39838]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:14 np0005538513.localdomain sudo[39900]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urwxivzmxotsabwdyrthoxqlxvxezqha ; /usr/bin/python3
Nov 28 07:56:14 np0005538513.localdomain sudo[39900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:14 np0005538513.localdomain python3[39902]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:14 np0005538513.localdomain sudo[39900]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:14 np0005538513.localdomain sudo[39943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmcshauwmwvjnxazdlppzpztsagfsjtq ; /usr/bin/python3
Nov 28 07:56:14 np0005538513.localdomain sudo[39943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:15 np0005538513.localdomain python3[39945]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316574.3161898-71496-196645832956672/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:15 np0005538513.localdomain sudo[39943]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:15 np0005538513.localdomain sudo[40005]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqmvgdgybknittusijslmufvuvbbtrbj ; /usr/bin/python3
Nov 28 07:56:15 np0005538513.localdomain sudo[40005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:15 np0005538513.localdomain python3[40007]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:15 np0005538513.localdomain sudo[40005]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:15 np0005538513.localdomain sudo[40048]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpozcdtzvgutpmcpmgbncdqtesxnbjvx ; /usr/bin/python3
Nov 28 07:56:15 np0005538513.localdomain sudo[40048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:15 np0005538513.localdomain python3[40050]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316575.2061021-71496-248193349042453/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:15 np0005538513.localdomain sudo[40048]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:16 np0005538513.localdomain sudo[40110]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsrtbzrrgnjjotkhawzhgozjxfnqdrie ; /usr/bin/python3
Nov 28 07:56:16 np0005538513.localdomain sudo[40110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:16 np0005538513.localdomain python3[40112]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:16 np0005538513.localdomain sudo[40110]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:16 np0005538513.localdomain sudo[40153]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypjmtipkyngqxfeigatnmyhnakjsremb ; /usr/bin/python3
Nov 28 07:56:16 np0005538513.localdomain sudo[40153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:16 np0005538513.localdomain python3[40155]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316576.077748-71496-217358525951479/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=cb8f167f8f40b87df4e2f7549c43619389cc84d7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:16 np0005538513.localdomain sudo[40153]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:17 np0005538513.localdomain sudo[40183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrwsfvbdcxgfudpjdbyppvkbqggdwnrb ; /usr/bin/python3
Nov 28 07:56:17 np0005538513.localdomain sudo[40183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:17 np0005538513.localdomain python3[40185]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:56:17 np0005538513.localdomain sudo[40183]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:17 np0005538513.localdomain sudo[40231]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krghthxabjpdtprorfjxzsajvtcvdzms ; /usr/bin/python3
Nov 28 07:56:17 np0005538513.localdomain sudo[40231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:18 np0005538513.localdomain python3[40233]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:18 np0005538513.localdomain sudo[40231]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:18 np0005538513.localdomain sudo[40274]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzfjxftjjnyikmciaipvvxleesbmdene ; /usr/bin/python3
Nov 28 07:56:18 np0005538513.localdomain sudo[40274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:18 np0005538513.localdomain python3[40276]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316577.8640547-72162-160011934086397/source _original_basename=tmpm632thdr follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:18 np0005538513.localdomain sudo[40274]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:23 np0005538513.localdomain sudo[40304]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vihdnlonkadebvvlhpadkjqxdnozmtjl ; /usr/bin/python3
Nov 28 07:56:23 np0005538513.localdomain sudo[40304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:23 np0005538513.localdomain python3[40306]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 07:56:23 np0005538513.localdomain sudo[40304]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:23 np0005538513.localdomain sudo[40365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btgzbftrdqjxrilrabkmbdxhhibejhzz ; /usr/bin/python3
Nov 28 07:56:23 np0005538513.localdomain sudo[40365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:23 np0005538513.localdomain python3[40367]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:27 np0005538513.localdomain sudo[40365]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:28 np0005538513.localdomain sudo[40382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vejkaukasrwukurrtwyfpdaibxchhepz ; /usr/bin/python3
Nov 28 07:56:28 np0005538513.localdomain sudo[40382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:28 np0005538513.localdomain python3[40384]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:32 np0005538513.localdomain sudo[40382]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:33 np0005538513.localdomain sudo[40399]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gshwmazmbatbobeqsntvtjsmzpecjckl ; /usr/bin/python3
Nov 28 07:56:33 np0005538513.localdomain sudo[40399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:33 np0005538513.localdomain python3[40401]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:33 np0005538513.localdomain sudo[40399]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:34 np0005538513.localdomain sudo[40422]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bujalbzdffvphoovlxdhiqvozwtsroxq ; /usr/bin/python3
Nov 28 07:56:34 np0005538513.localdomain sudo[40422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:34 np0005538513.localdomain python3[40424]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:34 np0005538513.localdomain sudo[40422]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:34 np0005538513.localdomain sudo[40445]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqxqhowoxselyijvvyfhgmxnplwuqsek ; /usr/bin/python3
Nov 28 07:56:34 np0005538513.localdomain sudo[40445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:34 np0005538513.localdomain python3[40447]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:34 np0005538513.localdomain sudo[40445]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:35 np0005538513.localdomain sudo[40468]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urqkqotlmnxxyhewtdfflaefxnwjyfbt ; /usr/bin/python3
Nov 28 07:56:35 np0005538513.localdomain sudo[40468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:35 np0005538513.localdomain python3[40470]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:35 np0005538513.localdomain sudo[40468]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:35 np0005538513.localdomain sudo[40491]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbnrsckxfhddxasfynrjoyjyuctzcdby ; /usr/bin/python3
Nov 28 07:56:35 np0005538513.localdomain sudo[40491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:35 np0005538513.localdomain python3[40493]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:35 np0005538513.localdomain sudo[40491]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:07 np0005538513.localdomain sudo[40501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:57:07 np0005538513.localdomain sudo[40501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:57:07 np0005538513.localdomain sudo[40501]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:07 np0005538513.localdomain sudo[40516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:57:07 np0005538513.localdomain sudo[40516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:57:08 np0005538513.localdomain sudo[40516]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:11 np0005538513.localdomain sudo[40562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:57:11 np0005538513.localdomain sudo[40562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:57:11 np0005538513.localdomain sudo[40562]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:16 np0005538513.localdomain sudo[40590]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttyppbjiopwooqmhtatgxlxmeusaxogq ; /usr/bin/python3
Nov 28 07:57:16 np0005538513.localdomain sudo[40590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:17 np0005538513.localdomain python3[40592]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:17 np0005538513.localdomain sudo[40590]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:17 np0005538513.localdomain sudo[40638]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxkskwleyginclsidrgwyqrcqdrbdyke ; /usr/bin/python3
Nov 28 07:57:17 np0005538513.localdomain sudo[40638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:17 np0005538513.localdomain python3[40640]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:17 np0005538513.localdomain sudo[40638]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:17 np0005538513.localdomain sudo[40656]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wogwpdmzdxreqsauhfglsttdokxacpbp ; /usr/bin/python3
Nov 28 07:57:17 np0005538513.localdomain sudo[40656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:17 np0005538513.localdomain python3[40658]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmp18fz95lr recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:17 np0005538513.localdomain sudo[40656]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:18 np0005538513.localdomain sudo[40686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whztoiojfjpgvkixgkjtvanjxatlbhhd ; /usr/bin/python3
Nov 28 07:57:18 np0005538513.localdomain sudo[40686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:18 np0005538513.localdomain python3[40688]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:18 np0005538513.localdomain sudo[40686]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:18 np0005538513.localdomain sudo[40734]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcksfntqzcziyznitbnxgsdxufedkbju ; /usr/bin/python3
Nov 28 07:57:18 np0005538513.localdomain sudo[40734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:19 np0005538513.localdomain python3[40736]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:19 np0005538513.localdomain sudo[40734]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:19 np0005538513.localdomain sudo[40752]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdjcjdjpjmgkwuyxrtheqfgyikyhambb ; /usr/bin/python3
Nov 28 07:57:19 np0005538513.localdomain sudo[40752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:19 np0005538513.localdomain python3[40754]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:19 np0005538513.localdomain sudo[40752]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:19 np0005538513.localdomain sudo[40814]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbhemezjvgbnwlafrggmluebzshefodt ; /usr/bin/python3
Nov 28 07:57:19 np0005538513.localdomain sudo[40814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:20 np0005538513.localdomain python3[40816]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:20 np0005538513.localdomain sudo[40814]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:20 np0005538513.localdomain sudo[40832]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfbzpzsqxsdhvfvlxpjcbnkkhhjumjrj ; /usr/bin/python3
Nov 28 07:57:20 np0005538513.localdomain sudo[40832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:20 np0005538513.localdomain python3[40834]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:20 np0005538513.localdomain sudo[40832]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:20 np0005538513.localdomain sudo[40894]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjbbbbsymrpaaaxxadkldztapitkhasu ; /usr/bin/python3
Nov 28 07:57:20 np0005538513.localdomain sudo[40894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:20 np0005538513.localdomain python3[40896]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:20 np0005538513.localdomain sudo[40894]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:21 np0005538513.localdomain sudo[40912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozhjvxjihmrtmnlxkszstzlxopjunkgr ; /usr/bin/python3
Nov 28 07:57:21 np0005538513.localdomain sudo[40912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:21 np0005538513.localdomain python3[40914]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:21 np0005538513.localdomain sudo[40912]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:21 np0005538513.localdomain sudo[40974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgtaacbwbjxgekpzsrjtnhdwxceaytwz ; /usr/bin/python3
Nov 28 07:57:21 np0005538513.localdomain sudo[40974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:21 np0005538513.localdomain python3[40976]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:21 np0005538513.localdomain sudo[40974]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:21 np0005538513.localdomain sudo[40992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwvydoxzungtwfvunomvqeakpixtkavk ; /usr/bin/python3
Nov 28 07:57:21 np0005538513.localdomain sudo[40992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:22 np0005538513.localdomain python3[40994]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:22 np0005538513.localdomain sudo[40992]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:22 np0005538513.localdomain sudo[41054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzfiueverhcoxhfuhtiwjacjhgixdoul ; /usr/bin/python3
Nov 28 07:57:22 np0005538513.localdomain sudo[41054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:22 np0005538513.localdomain python3[41056]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:22 np0005538513.localdomain sudo[41054]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:22 np0005538513.localdomain sudo[41072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svyiocvxbmzjfeuhmedzuoviiyuzxpbv ; /usr/bin/python3
Nov 28 07:57:22 np0005538513.localdomain sudo[41072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:22 np0005538513.localdomain python3[41074]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:22 np0005538513.localdomain sudo[41072]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:23 np0005538513.localdomain sudo[41134]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkssjjrvlglzwanydarazclypwsigknd ; /usr/bin/python3
Nov 28 07:57:23 np0005538513.localdomain sudo[41134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:23 np0005538513.localdomain python3[41136]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:23 np0005538513.localdomain sudo[41134]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:23 np0005538513.localdomain sudo[41152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaksrbzemgfyqoeuwjpcvutailwjdllj ; /usr/bin/python3
Nov 28 07:57:23 np0005538513.localdomain sudo[41152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:23 np0005538513.localdomain python3[41154]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:23 np0005538513.localdomain sudo[41152]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:23 np0005538513.localdomain sudo[41214]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhhmiofxnpjqqhrsqnysenpzcxknmoam ; /usr/bin/python3
Nov 28 07:57:23 np0005538513.localdomain sudo[41214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:24 np0005538513.localdomain python3[41216]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:24 np0005538513.localdomain sudo[41214]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:24 np0005538513.localdomain sudo[41232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdcmubwrrlojehteizxvcdhsfpfcsmyf ; /usr/bin/python3
Nov 28 07:57:24 np0005538513.localdomain sudo[41232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:24 np0005538513.localdomain python3[41234]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:24 np0005538513.localdomain sudo[41232]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:24 np0005538513.localdomain sudo[41294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swejwayuzvleynmvpjanpjvzdvyxrdlr ; /usr/bin/python3
Nov 28 07:57:24 np0005538513.localdomain sudo[41294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:24 np0005538513.localdomain python3[41296]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:24 np0005538513.localdomain sudo[41294]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:24 np0005538513.localdomain sudo[41312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yswjsulndyqxeqyzuwwesraxoysaxhlp ; /usr/bin/python3
Nov 28 07:57:24 np0005538513.localdomain sudo[41312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:24 np0005538513.localdomain python3[41314]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:24 np0005538513.localdomain sudo[41312]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:25 np0005538513.localdomain sudo[41374]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epzssomcdolbownileiofgzrgeborivt ; /usr/bin/python3
Nov 28 07:57:25 np0005538513.localdomain sudo[41374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:25 np0005538513.localdomain python3[41376]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:25 np0005538513.localdomain sudo[41374]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:25 np0005538513.localdomain sudo[41392]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atvzhkiwdoijujxtircoswfrpokiiupg ; /usr/bin/python3
Nov 28 07:57:25 np0005538513.localdomain sudo[41392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:25 np0005538513.localdomain python3[41394]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:25 np0005538513.localdomain sudo[41392]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:25 np0005538513.localdomain sudo[41454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peirznqviudyqwbwaryvkqyizfiqzpkc ; /usr/bin/python3
Nov 28 07:57:25 np0005538513.localdomain sudo[41454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:26 np0005538513.localdomain python3[41456]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:26 np0005538513.localdomain sudo[41454]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:26 np0005538513.localdomain sudo[41472]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lohixmfebespejvljnjkvucfkzllhixq ; /usr/bin/python3
Nov 28 07:57:26 np0005538513.localdomain sudo[41472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:26 np0005538513.localdomain python3[41474]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:26 np0005538513.localdomain sudo[41472]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:26 np0005538513.localdomain sudo[41534]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efpdijrmuvfrvkhdjqeugsmzfrzpyita ; /usr/bin/python3
Nov 28 07:57:26 np0005538513.localdomain sudo[41534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:26 np0005538513.localdomain python3[41536]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:26 np0005538513.localdomain sudo[41534]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:26 np0005538513.localdomain sudo[41552]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nedmabrdqhwjkuafxndhhmkovcqmjjwp ; /usr/bin/python3
Nov 28 07:57:26 np0005538513.localdomain sudo[41552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:27 np0005538513.localdomain python3[41554]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:27 np0005538513.localdomain sudo[41552]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:27 np0005538513.localdomain sudo[41582]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvizkoynemwoherbbsalliglchopnwpq ; /usr/bin/python3
Nov 28 07:57:27 np0005538513.localdomain sudo[41582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:27 np0005538513.localdomain python3[41584]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:57:27 np0005538513.localdomain sudo[41582]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:28 np0005538513.localdomain sudo[41630]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiwisiarjeyhxbnkdnnsadqztszufbbo ; /usr/bin/python3
Nov 28 07:57:28 np0005538513.localdomain sudo[41630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:28 np0005538513.localdomain python3[41632]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:28 np0005538513.localdomain sudo[41630]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:28 np0005538513.localdomain sudo[41648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivnhzpmyjazjznhkcxkzfkphphcxmgve ; /usr/bin/python3
Nov 28 07:57:28 np0005538513.localdomain sudo[41648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:28 np0005538513.localdomain python3[41650]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpwgostixu recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:28 np0005538513.localdomain sudo[41648]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:31 np0005538513.localdomain sudo[41678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emxirhfluxpkylrjmwhajbzyzqgufwuf ; /usr/bin/python3
Nov 28 07:57:31 np0005538513.localdomain sudo[41678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:31 np0005538513.localdomain python3[41680]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:57:33 np0005538513.localdomain sudo[41678]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:35 np0005538513.localdomain sudo[41695]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylgaicyhglkxslgjhxalkzxgwzlwvlkt ; /usr/bin/python3
Nov 28 07:57:35 np0005538513.localdomain sudo[41695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:35 np0005538513.localdomain python3[41697]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:57:35 np0005538513.localdomain sudo[41695]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:35 np0005538513.localdomain sudo[41713]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxehotsgokymvvisftnpdscswhixhbbo ; /usr/bin/python3
Nov 28 07:57:35 np0005538513.localdomain sudo[41713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:36 np0005538513.localdomain python3[41715]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:57:36 np0005538513.localdomain sudo[41713]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:36 np0005538513.localdomain sudo[41731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfmihivtrnmlajsfssqqbtcqngthdjjq ; /usr/bin/python3
Nov 28 07:57:36 np0005538513.localdomain sudo[41731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:36 np0005538513.localdomain python3[41733]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:57:36 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:57:36 np0005538513.localdomain systemd-rc-local-generator[41760]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:57:36 np0005538513.localdomain systemd-sysv-generator[41765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:57:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:57:37 np0005538513.localdomain systemd[1]: Starting Netfilter Tables...
Nov 28 07:57:37 np0005538513.localdomain systemd[1]: Finished Netfilter Tables.
Nov 28 07:57:37 np0005538513.localdomain sudo[41731]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:37 np0005538513.localdomain sudo[41821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlsztjxnyzencbfoudnkaobxpugoqdwo ; /usr/bin/python3
Nov 28 07:57:37 np0005538513.localdomain sudo[41821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:37 np0005538513.localdomain python3[41823]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:37 np0005538513.localdomain sudo[41821]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:38 np0005538513.localdomain sudo[41864]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zymrvsztczgdmtdcejbfjmriemvdihuw ; /usr/bin/python3
Nov 28 07:57:38 np0005538513.localdomain sudo[41864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:38 np0005538513.localdomain python3[41866]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316657.5355775-74983-248966599474069/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:38 np0005538513.localdomain sudo[41864]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:38 np0005538513.localdomain sudo[41894]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuqjcljfmlflkjolmmyymmlgwzyhxige ; /usr/bin/python3
Nov 28 07:57:38 np0005538513.localdomain sudo[41894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:38 np0005538513.localdomain python3[41896]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:38 np0005538513.localdomain sudo[41894]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:39 np0005538513.localdomain sudo[41912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkbwthbcpkiuewwqlidvocpvclmnlihv ; /usr/bin/python3
Nov 28 07:57:39 np0005538513.localdomain sudo[41912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:39 np0005538513.localdomain python3[41914]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:39 np0005538513.localdomain sudo[41912]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:39 np0005538513.localdomain sudo[41961]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utzfphiraajetmgugcouwvbcpjfiuwcc ; /usr/bin/python3
Nov 28 07:57:39 np0005538513.localdomain sudo[41961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:39 np0005538513.localdomain python3[41963]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:39 np0005538513.localdomain sudo[41961]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:40 np0005538513.localdomain sudo[42004]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdyuxyrvihfhyirswhtvcqkozfjrdfpz ; /usr/bin/python3
Nov 28 07:57:40 np0005538513.localdomain sudo[42004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:40 np0005538513.localdomain python3[42006]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316659.4205143-75125-198920321598253/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:40 np0005538513.localdomain sudo[42004]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:40 np0005538513.localdomain sudo[42066]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fszellblujsotavdcztvovvpuwtrpawd ; /usr/bin/python3
Nov 28 07:57:40 np0005538513.localdomain sudo[42066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:40 np0005538513.localdomain python3[42068]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:40 np0005538513.localdomain sudo[42066]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:41 np0005538513.localdomain sudo[42109]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsjskuiububgzpurorszqhppterkcglx ; /usr/bin/python3
Nov 28 07:57:41 np0005538513.localdomain sudo[42109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:41 np0005538513.localdomain python3[42111]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316660.3991268-75240-144059158843353/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:41 np0005538513.localdomain sudo[42109]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:41 np0005538513.localdomain sudo[42171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqiqflgfxbpgcoutwdjwmxetckwfeidf ; /usr/bin/python3
Nov 28 07:57:41 np0005538513.localdomain sudo[42171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:41 np0005538513.localdomain python3[42173]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:41 np0005538513.localdomain sudo[42171]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:41 np0005538513.localdomain sudo[42214]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmvavbvgbmqwjhzqplpfeewvmxtbgguy ; /usr/bin/python3
Nov 28 07:57:41 np0005538513.localdomain sudo[42214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:42 np0005538513.localdomain python3[42216]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316661.4136913-75310-166567593647304/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:42 np0005538513.localdomain sudo[42214]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:42 np0005538513.localdomain sudo[42276]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqoxoctysdymhecgsntnodjnsetjpwyx ; /usr/bin/python3
Nov 28 07:57:42 np0005538513.localdomain sudo[42276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:42 np0005538513.localdomain python3[42278]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:42 np0005538513.localdomain sudo[42276]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:42 np0005538513.localdomain sudo[42319]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcbpqmutvjmxjdjjvkctmqldspspoinv ; /usr/bin/python3
Nov 28 07:57:42 np0005538513.localdomain sudo[42319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:43 np0005538513.localdomain python3[42321]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316662.3431256-75370-81703153813183/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:43 np0005538513.localdomain sudo[42319]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:43 np0005538513.localdomain sudo[42381]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcqoxkzlgmaimtnmnmhraoifgyvjosji ; /usr/bin/python3
Nov 28 07:57:43 np0005538513.localdomain sudo[42381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:44 np0005538513.localdomain python3[42383]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:44 np0005538513.localdomain sudo[42381]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:44 np0005538513.localdomain sudo[42424]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyopwaauqrjpklmpebblwgfkbnrqnxtp ; /usr/bin/python3
Nov 28 07:57:44 np0005538513.localdomain sudo[42424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:44 np0005538513.localdomain python3[42426]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316663.2980568-75431-50936521324994/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:44 np0005538513.localdomain sudo[42424]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:44 np0005538513.localdomain sudo[42454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlchpetjbqbuyealyqtrgntqmtefccay ; /usr/bin/python3
Nov 28 07:57:44 np0005538513.localdomain sudo[42454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:44 np0005538513.localdomain python3[42456]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:45 np0005538513.localdomain sudo[42454]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:45 np0005538513.localdomain sudo[42519]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kftffvizfcmskjspkkfbtxgbawuffjnd ; /usr/bin/python3
Nov 28 07:57:45 np0005538513.localdomain sudo[42519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:45 np0005538513.localdomain python3[42521]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:45 np0005538513.localdomain sudo[42519]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:45 np0005538513.localdomain sudo[42536]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfqpqebwaugmhpytqcpovdtcoxrdfylb ; /usr/bin/python3
Nov 28 07:57:45 np0005538513.localdomain sudo[42536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:45 np0005538513.localdomain python3[42538]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:45 np0005538513.localdomain sudo[42536]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:46 np0005538513.localdomain sudo[42553]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmlttdsimpbnajurrxzlxsmqucirzibd ; /usr/bin/python3
Nov 28 07:57:46 np0005538513.localdomain sudo[42553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:46 np0005538513.localdomain python3[42555]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:46 np0005538513.localdomain sudo[42553]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:46 np0005538513.localdomain sudo[42572]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxsbaayksfulywqvlkaspnwpnlfevzpt ; /usr/bin/python3
Nov 28 07:57:46 np0005538513.localdomain sudo[42572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:46 np0005538513.localdomain python3[42574]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:46 np0005538513.localdomain sudo[42572]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:46 np0005538513.localdomain sudo[42588]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehjcwocqlfetwpydlsjtdfzdjqaxamby ; /usr/bin/python3
Nov 28 07:57:46 np0005538513.localdomain sudo[42588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:46 np0005538513.localdomain python3[42590]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:46 np0005538513.localdomain sudo[42588]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:47 np0005538513.localdomain sudo[42604]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtudpvluzqjncmegbpourhmtccafjskx ; /usr/bin/python3
Nov 28 07:57:47 np0005538513.localdomain sudo[42604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:47 np0005538513.localdomain python3[42606]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:47 np0005538513.localdomain sudo[42604]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:47 np0005538513.localdomain sudo[42620]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opifsbyrdtbwqgwefucnxgvxtkqjpkxs ; /usr/bin/python3
Nov 28 07:57:47 np0005538513.localdomain sudo[42620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:47 np0005538513.localdomain python3[42622]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 07:57:48 np0005538513.localdomain sudo[42620]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:49 np0005538513.localdomain sudo[42640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtkpcseezqbivdudpkifdlmctwuwphyx ; /usr/bin/python3
Nov 28 07:57:49 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Nov 28 07:57:49 np0005538513.localdomain sudo[42640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:49 np0005538513.localdomain python3[42642]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:57:50 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:57:50 np0005538513.localdomain sudo[42640]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:50 np0005538513.localdomain sudo[42661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onocvwdtegtrvwnehpzftkrdjqafjduk ; /usr/bin/python3
Nov 28 07:57:50 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 28 07:57:50 np0005538513.localdomain sudo[42661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:50 np0005538513.localdomain python3[42663]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:57:51 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:57:51 np0005538513.localdomain sudo[42661]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:51 np0005538513.localdomain sudo[42682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntmnftvouiwfatwghtzndgwhdisqewvf ; /usr/bin/python3
Nov 28 07:57:51 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 28 07:57:51 np0005538513.localdomain sudo[42682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:52 np0005538513.localdomain python3[42684]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:57:52 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:57:53 np0005538513.localdomain sudo[42682]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:53 np0005538513.localdomain sudo[42705]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elkhpgmokoilzhudempghfcprkaceecv ; /usr/bin/python3
Nov 28 07:57:53 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 28 07:57:53 np0005538513.localdomain sudo[42705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:53 np0005538513.localdomain python3[42707]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:53 np0005538513.localdomain sudo[42705]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:53 np0005538513.localdomain sudo[42721]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdypmvobzgekpjsxpegwaksowjzvimdf ; /usr/bin/python3
Nov 28 07:57:53 np0005538513.localdomain sudo[42721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:53 np0005538513.localdomain python3[42723]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:53 np0005538513.localdomain sudo[42721]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:53 np0005538513.localdomain sudo[42737]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbcmsrfcpvavtuvrveeiwzbbdlyqclni ; /usr/bin/python3
Nov 28 07:57:53 np0005538513.localdomain sudo[42737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:54 np0005538513.localdomain python3[42739]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:54 np0005538513.localdomain sudo[42737]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:54 np0005538513.localdomain sudo[42753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpwddclubenejqirzrfkyefgfhnglznx ; /usr/bin/python3
Nov 28 07:57:54 np0005538513.localdomain sudo[42753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:54 np0005538513.localdomain python3[42755]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:57:54 np0005538513.localdomain sudo[42753]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:54 np0005538513.localdomain sudo[42769]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvewqhewcvwlvhrbbltqyqjuymiwcnzj ; /usr/bin/python3
Nov 28 07:57:54 np0005538513.localdomain sudo[42769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:55 np0005538513.localdomain python3[42771]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:55 np0005538513.localdomain sudo[42769]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:55 np0005538513.localdomain sudo[42786]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waargdjptoiwqzjcvjcofqjakafvcnxo ; /usr/bin/python3
Nov 28 07:57:55 np0005538513.localdomain sudo[42786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:55 np0005538513.localdomain python3[42788]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:57:59 np0005538513.localdomain sudo[42786]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:59 np0005538513.localdomain sudo[42803]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hntwcnwfwwzwpzkejiccuabehqoakclr ; /usr/bin/python3
Nov 28 07:57:59 np0005538513.localdomain sudo[42803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:59 np0005538513.localdomain python3[42805]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:59 np0005538513.localdomain sudo[42803]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:59 np0005538513.localdomain sudo[42851]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phrsresjibiqmiwteritohnyeqetdtxf ; /usr/bin/python3
Nov 28 07:57:59 np0005538513.localdomain sudo[42851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:59 np0005538513.localdomain python3[42853]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:59 np0005538513.localdomain sudo[42851]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:00 np0005538513.localdomain sudo[42894]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntygaijrggzkqaecobxkvcvodcesmlfa ; /usr/bin/python3
Nov 28 07:58:00 np0005538513.localdomain sudo[42894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:00 np0005538513.localdomain python3[42896]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316679.6330104-76381-160894852188564/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:00 np0005538513.localdomain sudo[42894]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:00 np0005538513.localdomain sudo[42924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upebjuuzkmjwjphnhvswtxzrmmzqygsz ; /usr/bin/python3
Nov 28 07:58:00 np0005538513.localdomain sudo[42924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:00 np0005538513.localdomain python3[42926]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:58:00 np0005538513.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 07:58:00 np0005538513.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 07:58:00 np0005538513.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 07:58:00 np0005538513.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 07:58:00 np0005538513.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 28 07:58:00 np0005538513.localdomain kernel: Bridge firewalling registered
Nov 28 07:58:00 np0005538513.localdomain systemd-modules-load[42929]: Inserted module 'br_netfilter'
Nov 28 07:58:00 np0005538513.localdomain systemd-modules-load[42929]: Module 'msr' is built in
Nov 28 07:58:00 np0005538513.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 28 07:58:00 np0005538513.localdomain sudo[42924]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:01 np0005538513.localdomain sudo[42978]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nraquntndlblrmfbaahqwcjkcvqywllu ; /usr/bin/python3
Nov 28 07:58:01 np0005538513.localdomain sudo[42978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:01 np0005538513.localdomain python3[42980]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:01 np0005538513.localdomain sudo[42978]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:01 np0005538513.localdomain sudo[43021]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omejtbjetirirpjfzgnpnftyifheyobo ; /usr/bin/python3
Nov 28 07:58:01 np0005538513.localdomain sudo[43021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:01 np0005538513.localdomain python3[43023]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316681.100863-76432-139094064857990/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:01 np0005538513.localdomain sudo[43021]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:02 np0005538513.localdomain sudo[43051]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evpdxqbgjwhgfbskvqxyjhlzaiqezjvm ; /usr/bin/python3
Nov 28 07:58:02 np0005538513.localdomain sudo[43051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:02 np0005538513.localdomain python3[43053]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:02 np0005538513.localdomain sudo[43051]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:02 np0005538513.localdomain sudo[43068]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsolnxglqdoahtgjsazdmijweocullok ; /usr/bin/python3
Nov 28 07:58:02 np0005538513.localdomain sudo[43068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:02 np0005538513.localdomain python3[43070]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:02 np0005538513.localdomain sudo[43068]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:02 np0005538513.localdomain sudo[43086]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtthsvpcwtaunvcddomkumydidqptxco ; /usr/bin/python3
Nov 28 07:58:02 np0005538513.localdomain sudo[43086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:02 np0005538513.localdomain python3[43088]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:03 np0005538513.localdomain sudo[43086]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:03 np0005538513.localdomain sudo[43104]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-reaehjfwtvyqgvccazophfvaksazocpl ; /usr/bin/python3
Nov 28 07:58:03 np0005538513.localdomain sudo[43104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:03 np0005538513.localdomain python3[43106]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:03 np0005538513.localdomain sudo[43104]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:03 np0005538513.localdomain sudo[43121]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlwknhypqwpukkzxcldzlpfwhgalajim ; /usr/bin/python3
Nov 28 07:58:03 np0005538513.localdomain sudo[43121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:03 np0005538513.localdomain python3[43123]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:03 np0005538513.localdomain sudo[43121]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:03 np0005538513.localdomain sudo[43138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvvrnpboiepidkdwofzbuetdqspqocuv ; /usr/bin/python3
Nov 28 07:58:03 np0005538513.localdomain sudo[43138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:04 np0005538513.localdomain python3[43140]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:04 np0005538513.localdomain sudo[43138]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:04 np0005538513.localdomain sudo[43155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swjrhupxjqylfhdroebeuieoqnlqrskn ; /usr/bin/python3
Nov 28 07:58:04 np0005538513.localdomain sudo[43155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:04 np0005538513.localdomain python3[43157]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:04 np0005538513.localdomain sudo[43155]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:04 np0005538513.localdomain sudo[43173]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loomveujcaepzdmkchgibvvgwqbmnker ; /usr/bin/python3
Nov 28 07:58:04 np0005538513.localdomain sudo[43173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:04 np0005538513.localdomain python3[43175]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:05 np0005538513.localdomain sudo[43173]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:05 np0005538513.localdomain sudo[43191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mujyjmzyxqxunjmcqvefewacknzamoru ; /usr/bin/python3
Nov 28 07:58:05 np0005538513.localdomain sudo[43191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:06 np0005538513.localdomain python3[43193]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:06 np0005538513.localdomain sudo[43191]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:06 np0005538513.localdomain sudo[43209]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgcnbfysbjbczlwmmyvyqfsvcbhszqxi ; /usr/bin/python3
Nov 28 07:58:06 np0005538513.localdomain sudo[43209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:06 np0005538513.localdomain python3[43211]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:06 np0005538513.localdomain sudo[43209]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:06 np0005538513.localdomain sudo[43227]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awxjiwtehermzvhcbvfvxrfummtodzen ; /usr/bin/python3
Nov 28 07:58:06 np0005538513.localdomain sudo[43227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:06 np0005538513.localdomain python3[43229]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:06 np0005538513.localdomain sudo[43227]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:06 np0005538513.localdomain sudo[43245]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btkywipkdlmartzhhnuxsqjanunxysbu ; /usr/bin/python3
Nov 28 07:58:06 np0005538513.localdomain sudo[43245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:06 np0005538513.localdomain python3[43247]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:06 np0005538513.localdomain sudo[43245]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:07 np0005538513.localdomain sudo[43263]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shiahkwxleevcuxetlcsxddrghcqtwjr ; /usr/bin/python3
Nov 28 07:58:07 np0005538513.localdomain sudo[43263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:07 np0005538513.localdomain python3[43265]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:07 np0005538513.localdomain sudo[43263]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:07 np0005538513.localdomain sudo[43281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbxnypwcshfgapmxzvrhtgaukrnjqitg ; /usr/bin/python3
Nov 28 07:58:07 np0005538513.localdomain sudo[43281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:07 np0005538513.localdomain python3[43283]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:07 np0005538513.localdomain sudo[43281]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:07 np0005538513.localdomain sudo[43298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfycnbgpnjoxinzzhhnenquqllzbydhy ; /usr/bin/python3
Nov 28 07:58:07 np0005538513.localdomain sudo[43298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:07 np0005538513.localdomain python3[43300]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:07 np0005538513.localdomain sudo[43298]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:08 np0005538513.localdomain sudo[43315]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtmbbqfanhgimgynbfmtgingzakmhoww ; /usr/bin/python3
Nov 28 07:58:08 np0005538513.localdomain sudo[43315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:08 np0005538513.localdomain python3[43317]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:08 np0005538513.localdomain sudo[43315]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:08 np0005538513.localdomain sudo[43332]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwzrypwlqofcuouhoprpdtxkzzrihljw ; /usr/bin/python3
Nov 28 07:58:08 np0005538513.localdomain sudo[43332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:08 np0005538513.localdomain python3[43334]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:08 np0005538513.localdomain sudo[43332]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:08 np0005538513.localdomain sudo[43349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uedxyyqfqfocjzytwidgtujhomauzaxo ; /usr/bin/python3
Nov 28 07:58:08 np0005538513.localdomain sudo[43349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:08 np0005538513.localdomain python3[43351]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:08 np0005538513.localdomain sudo[43349]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:09 np0005538513.localdomain sudo[43367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnufhhvzrlgagamyvknbzltvamjjyyzt ; /usr/bin/python3
Nov 28 07:58:09 np0005538513.localdomain sudo[43367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:09 np0005538513.localdomain python3[43369]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:58:09 np0005538513.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 07:58:09 np0005538513.localdomain systemd[1]: Stopped Apply Kernel Variables.
Nov 28 07:58:09 np0005538513.localdomain systemd[1]: Stopping Apply Kernel Variables...
Nov 28 07:58:09 np0005538513.localdomain systemd[1]: Starting Apply Kernel Variables...
Nov 28 07:58:09 np0005538513.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 07:58:09 np0005538513.localdomain systemd[1]: Finished Apply Kernel Variables.
Nov 28 07:58:09 np0005538513.localdomain sudo[43367]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:09 np0005538513.localdomain sudo[43387]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-necasosnzenjekmionzraawndyqfpbpf ; /usr/bin/python3
Nov 28 07:58:09 np0005538513.localdomain sudo[43387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:09 np0005538513.localdomain python3[43389]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:09 np0005538513.localdomain sudo[43387]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:10 np0005538513.localdomain sudo[43403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-miwmtyocgjnavxmilzlgwdwhqgcfqkys ; /usr/bin/python3
Nov 28 07:58:10 np0005538513.localdomain sudo[43403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:10 np0005538513.localdomain python3[43405]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:10 np0005538513.localdomain sudo[43403]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:10 np0005538513.localdomain sudo[43419]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbfxuaiajwclovdrmwcknuoerszoobpl ; /usr/bin/python3
Nov 28 07:58:10 np0005538513.localdomain sudo[43419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:10 np0005538513.localdomain python3[43421]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:10 np0005538513.localdomain sudo[43419]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:10 np0005538513.localdomain sudo[43435]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggzmxxxnqtqqwztepilbxiqdodfsgiwy ; /usr/bin/python3
Nov 28 07:58:10 np0005538513.localdomain sudo[43435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:11 np0005538513.localdomain python3[43437]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:58:11 np0005538513.localdomain sudo[43435]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538513.localdomain sudo[43451]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxevjxmaehmukqpvhvsdpgexdxfgikxc ; /usr/bin/python3
Nov 28 07:58:11 np0005538513.localdomain sudo[43451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:11 np0005538513.localdomain sudo[43453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:58:11 np0005538513.localdomain sudo[43453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:11 np0005538513.localdomain sudo[43453]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538513.localdomain sudo[43469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 07:58:11 np0005538513.localdomain sudo[43469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:11 np0005538513.localdomain python3[43456]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:11 np0005538513.localdomain sudo[43451]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538513.localdomain sudo[43497]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrksmronfbwdzoyaxxxktcepkwktcihs ; /usr/bin/python3
Nov 28 07:58:11 np0005538513.localdomain sudo[43497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:11 np0005538513.localdomain python3[43499]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:11 np0005538513.localdomain sudo[43497]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538513.localdomain sudo[43469]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538513.localdomain sudo[43533]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kztdqzjxpkzrgrkxvyxivxfhvaabrvqp ; /usr/bin/python3
Nov 28 07:58:11 np0005538513.localdomain sudo[43533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:11 np0005538513.localdomain sudo[43536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:58:11 np0005538513.localdomain sudo[43536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:11 np0005538513.localdomain sudo[43536]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538513.localdomain python3[43535]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:12 np0005538513.localdomain sudo[43533]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538513.localdomain sudo[43551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:58:12 np0005538513.localdomain sudo[43551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:12 np0005538513.localdomain sudo[43579]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inyrjpocziowlrmckbvijanxnolcwtsj ; /usr/bin/python3
Nov 28 07:58:12 np0005538513.localdomain sudo[43579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:12 np0005538513.localdomain python3[43581]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:12 np0005538513.localdomain sudo[43579]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538513.localdomain sudo[43610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqmpqvtggmxbvnftqoxbuxpksxoskifx ; /usr/bin/python3
Nov 28 07:58:12 np0005538513.localdomain sudo[43610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:12 np0005538513.localdomain python3[43612]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:12 np0005538513.localdomain sudo[43610]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538513.localdomain sudo[43551]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538513.localdomain sudo[43675]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssojekaqkmtocrfsxocooewmlnvzedos ; /usr/bin/python3
Nov 28 07:58:12 np0005538513.localdomain sudo[43675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:13 np0005538513.localdomain python3[43677]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:13 np0005538513.localdomain sudo[43675]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:13 np0005538513.localdomain sudo[43678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:58:13 np0005538513.localdomain sudo[43678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:13 np0005538513.localdomain sudo[43678]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:13 np0005538513.localdomain sudo[43733]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpytqstdztgfqwwltsyumavoicjjcvpj ; /usr/bin/python3
Nov 28 07:58:13 np0005538513.localdomain sudo[43733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:13 np0005538513.localdomain python3[43735]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316692.7713733-76839-31204975545471/source _original_basename=tmpvv4ci4f6 follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:13 np0005538513.localdomain sudo[43733]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:13 np0005538513.localdomain sudo[43763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlznrczzhneqomhunqlyonyvqxqjtfhq ; /usr/bin/python3
Nov 28 07:58:13 np0005538513.localdomain sudo[43763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:13 np0005538513.localdomain python3[43765]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:14 np0005538513.localdomain sudo[43763]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:15 np0005538513.localdomain sudo[43780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojowdwrcrctrwsjkwavdtyswgqsvpwex ; /usr/bin/python3
Nov 28 07:58:15 np0005538513.localdomain sudo[43780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:15 np0005538513.localdomain python3[43782]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:15 np0005538513.localdomain sudo[43780]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:16 np0005538513.localdomain sudo[43828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-decbwlvoivzzhsmboszamsfkztklzsrw ; /usr/bin/python3
Nov 28 07:58:16 np0005538513.localdomain sudo[43828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:16 np0005538513.localdomain python3[43830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:16 np0005538513.localdomain sudo[43828]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:16 np0005538513.localdomain sudo[43871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zegqnfexpxdtsmosftkitrtpejlayqpo ; /usr/bin/python3
Nov 28 07:58:16 np0005538513.localdomain sudo[43871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:16 np0005538513.localdomain python3[43873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316695.9173484-77028-89722746026274/source _original_basename=tmp0jcbpm90 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:16 np0005538513.localdomain sudo[43871]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:16 np0005538513.localdomain sudo[43901]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufhjybkwxqsmqjrfygdeuwqliwmqokkd ; /usr/bin/python3
Nov 28 07:58:16 np0005538513.localdomain sudo[43901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:17 np0005538513.localdomain python3[43903]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:17 np0005538513.localdomain sudo[43901]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:17 np0005538513.localdomain sudo[43917]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjecipttrnxxluejrljwuwfiaxlgxgwf ; /usr/bin/python3
Nov 28 07:58:17 np0005538513.localdomain sudo[43917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:17 np0005538513.localdomain python3[43919]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:17 np0005538513.localdomain sudo[43917]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:17 np0005538513.localdomain sudo[43933]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgnafiuiqpenvqjukmjbfvbuhoxdrcuu ; /usr/bin/python3
Nov 28 07:58:17 np0005538513.localdomain sudo[43933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:17 np0005538513.localdomain python3[43935]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:17 np0005538513.localdomain sudo[43933]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:17 np0005538513.localdomain sudo[43949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cokknfnczvyeiwexokatnuhoogoqpvxc ; /usr/bin/python3
Nov 28 07:58:17 np0005538513.localdomain sudo[43949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:18 np0005538513.localdomain python3[43951]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:18 np0005538513.localdomain sudo[43949]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:18 np0005538513.localdomain sudo[43965]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvcphamasqjhjwdimrenhzyeoivyyikr ; /usr/bin/python3
Nov 28 07:58:18 np0005538513.localdomain sudo[43965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:18 np0005538513.localdomain python3[43967]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:18 np0005538513.localdomain sudo[43965]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:18 np0005538513.localdomain sudo[43981]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lphedjtowolrohwuyjpczjmhxrhbntxf ; /usr/bin/python3
Nov 28 07:58:18 np0005538513.localdomain sudo[43981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:18 np0005538513.localdomain python3[43983]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:18 np0005538513.localdomain sudo[43981]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:18 np0005538513.localdomain sudo[43997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gopgsxilgzaaxvrwdngqgvmolszhdhve ; /usr/bin/python3
Nov 28 07:58:18 np0005538513.localdomain sudo[43997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:18 np0005538513.localdomain python3[43999]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:19 np0005538513.localdomain sudo[43997]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:19 np0005538513.localdomain sudo[44013]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfhewgjxdtpdojzwicwjmqqxyrvmyook ; /usr/bin/python3
Nov 28 07:58:19 np0005538513.localdomain sudo[44013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:19 np0005538513.localdomain python3[44015]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:19 np0005538513.localdomain sudo[44013]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:19 np0005538513.localdomain sudo[44029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upgavlgsdtepvnnutwudvcksnnkjarri ; /usr/bin/python3
Nov 28 07:58:19 np0005538513.localdomain sudo[44029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:19 np0005538513.localdomain python3[44031]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:19 np0005538513.localdomain sudo[44029]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:19 np0005538513.localdomain sudo[44045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxscdirfeavubqjruevzaqmyjhgfycca ; /usr/bin/python3
Nov 28 07:58:19 np0005538513.localdomain sudo[44045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:20 np0005538513.localdomain python3[44047]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Nov 28 07:58:20 np0005538513.localdomain groupadd[44048]: group added to /etc/group: name=qemu, GID=107
Nov 28 07:58:20 np0005538513.localdomain groupadd[44048]: group added to /etc/gshadow: name=qemu
Nov 28 07:58:20 np0005538513.localdomain groupadd[44048]: new group: name=qemu, GID=107
Nov 28 07:58:20 np0005538513.localdomain sudo[44045]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:20 np0005538513.localdomain sudo[44067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbyjwjdrbcljwutirvnxhsofdtiueuct ; /usr/bin/python3
Nov 28 07:58:20 np0005538513.localdomain sudo[44067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:20 np0005538513.localdomain python3[44069]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 07:58:20 np0005538513.localdomain useradd[44071]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Nov 28 07:58:20 np0005538513.localdomain sudo[44067]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:20 np0005538513.localdomain sudo[44091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-decdkiydwyhzchopnqlyzkqfhjtnavrs ; /usr/bin/python3
Nov 28 07:58:20 np0005538513.localdomain sudo[44091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:21 np0005538513.localdomain python3[44093]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Nov 28 07:58:21 np0005538513.localdomain sudo[44091]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:21 np0005538513.localdomain sudo[44107]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zadtgfneygkzlyzjvoclsgwodrwmdtpp ; /usr/bin/python3
Nov 28 07:58:21 np0005538513.localdomain sudo[44107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:21 np0005538513.localdomain python3[44109]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:21 np0005538513.localdomain sudo[44107]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:21 np0005538513.localdomain sudo[44156]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvsrbgwrkufwbplsrhozxgihzryeifop ; /usr/bin/python3
Nov 28 07:58:21 np0005538513.localdomain sudo[44156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:21 np0005538513.localdomain python3[44158]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:21 np0005538513.localdomain sudo[44156]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:22 np0005538513.localdomain sudo[44199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrsjlemxiguvhtcaqlemcxkdagfgotjz ; /usr/bin/python3
Nov 28 07:58:22 np0005538513.localdomain sudo[44199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:22 np0005538513.localdomain python3[44201]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316701.6468189-77355-258856901106800/source _original_basename=tmpw8ggb0wa follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:22 np0005538513.localdomain sudo[44199]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:22 np0005538513.localdomain sudo[44229]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljlqoouvclpbgqegqvusvvahvundqcgu ; /usr/bin/python3
Nov 28 07:58:22 np0005538513.localdomain sudo[44229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:22 np0005538513.localdomain python3[44231]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 28 07:58:23 np0005538513.localdomain sudo[44229]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:23 np0005538513.localdomain sudo[44251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umungbrwqbxxllrmvnbjvzvynmslhlrf ; /usr/bin/python3
Nov 28 07:58:23 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 28 07:58:23 np0005538513.localdomain sudo[44251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:23 np0005538513.localdomain python3[44253]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:23 np0005538513.localdomain sudo[44251]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:24 np0005538513.localdomain sudo[44267]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsqkzbuismeurvbemmayirmrzpcyftjc ; /usr/bin/python3
Nov 28 07:58:24 np0005538513.localdomain sudo[44267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:24 np0005538513.localdomain python3[44269]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:24 np0005538513.localdomain sudo[44267]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:24 np0005538513.localdomain sudo[44283]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-soxfobiamttvmtcgxcworhohpfjknxto ; /usr/bin/python3
Nov 28 07:58:24 np0005538513.localdomain sudo[44283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:24 np0005538513.localdomain python3[44285]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Nov 28 07:58:25 np0005538513.localdomain sudo[44283]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:26 np0005538513.localdomain sudo[44303]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzjjkmpsymrogpdbkwwltfgkzkrihjjo ; /usr/bin/python3
Nov 28 07:58:26 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 28 07:58:26 np0005538513.localdomain sudo[44303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:26 np0005538513.localdomain python3[44305]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:58:28 np0005538513.localdomain sudo[44303]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:29 np0005538513.localdomain sudo[44320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyeapcpllstetkpsjtuxtucfjqxehdmh ; /usr/bin/python3
Nov 28 07:58:29 np0005538513.localdomain sudo[44320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:29 np0005538513.localdomain python3[44322]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 07:58:29 np0005538513.localdomain sudo[44320]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:30 np0005538513.localdomain sudo[44381]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibhcfnivcrwhjmpmbxrsktwdxldeyaxs ; /usr/bin/python3
Nov 28 07:58:30 np0005538513.localdomain sudo[44381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:30 np0005538513.localdomain python3[44383]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:30 np0005538513.localdomain sudo[44381]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:30 np0005538513.localdomain sudo[44397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuhdibqkaqsepdgitcjgfkmwgjjwsblf ; /usr/bin/python3
Nov 28 07:58:30 np0005538513.localdomain sudo[44397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:30 np0005538513.localdomain python3[44399]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:30 np0005538513.localdomain sudo[44397]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:31 np0005538513.localdomain sudo[44457]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgkxxnoucsohnsgguulcrnlmibcknbvc ; /usr/bin/python3
Nov 28 07:58:31 np0005538513.localdomain sudo[44457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:31 np0005538513.localdomain python3[44459]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:31 np0005538513.localdomain sudo[44457]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:31 np0005538513.localdomain sudo[44500]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snugiivbxwnuowhxqipdvhgffnfdldir ; /usr/bin/python3
Nov 28 07:58:31 np0005538513.localdomain sudo[44500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:32 np0005538513.localdomain python3[44502]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316710.785518-77853-248879723636934/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=5f4ee99fc7d9e996ae5b1d2f917f41c82ac4db9e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:32 np0005538513.localdomain sudo[44500]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:32 np0005538513.localdomain sudo[44562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzbnqukdcveyakuqnvsvyxdhaendvlox ; /usr/bin/python3
Nov 28 07:58:32 np0005538513.localdomain sudo[44562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:32 np0005538513.localdomain python3[44564]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:32 np0005538513.localdomain sudo[44562]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:33 np0005538513.localdomain sudo[44607]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjarrbttrnjhykepfwvscrqpfaoydkxm ; /usr/bin/python3
Nov 28 07:58:33 np0005538513.localdomain sudo[44607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:33 np0005538513.localdomain python3[44609]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316712.450254-77953-234226567019734/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:33 np0005538513.localdomain sudo[44607]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:33 np0005538513.localdomain sudo[44637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmqnjiynushxhznjqykqkhhtgknserph ; /usr/bin/python3
Nov 28 07:58:33 np0005538513.localdomain sudo[44637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:33 np0005538513.localdomain python3[44639]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:33 np0005538513.localdomain sudo[44637]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:33 np0005538513.localdomain sudo[44653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gixqwvkziiztxssttuvomzmdkxbvgrqi ; /usr/bin/python3
Nov 28 07:58:33 np0005538513.localdomain sudo[44653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:33 np0005538513.localdomain python3[44655]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:33 np0005538513.localdomain sudo[44653]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:34 np0005538513.localdomain sudo[44669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbtasdouwlzdjsfifmpfckbfuylkfbcv ; /usr/bin/python3
Nov 28 07:58:34 np0005538513.localdomain sudo[44669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:34 np0005538513.localdomain python3[44671]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:34 np0005538513.localdomain sudo[44669]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:34 np0005538513.localdomain sudo[44685]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxoghdjcfvxkirkhbabzmzololzhasyq ; /usr/bin/python3
Nov 28 07:58:34 np0005538513.localdomain sudo[44685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:34 np0005538513.localdomain python3[44687]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:34 np0005538513.localdomain sudo[44685]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:58:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3254 writes, 16K keys, 3254 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3254 writes, 143 syncs, 22.76 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3254 writes, 16K keys, 3254 commit groups, 1.0 writes per commit group, ingest: 14.65 MB, 0.02 MB/s
                                                          Interval WAL: 3254 writes, 143 syncs, 22.76 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:58:35 np0005538513.localdomain sudo[44733]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntfsxyujuungaehhpmanskekovfzzrmc ; /usr/bin/python3
Nov 28 07:58:35 np0005538513.localdomain sudo[44733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:35 np0005538513.localdomain python3[44735]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:35 np0005538513.localdomain sudo[44733]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:35 np0005538513.localdomain sudo[44776]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecyzooutmfieqrsrxiomxbqjkketkcqe ; /usr/bin/python3
Nov 28 07:58:35 np0005538513.localdomain sudo[44776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:35 np0005538513.localdomain python3[44778]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316714.8907115-78057-78672103747418/source _original_basename=tmp8he6cfek follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:35 np0005538513.localdomain sudo[44776]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:35 np0005538513.localdomain sudo[44806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzjyuekaqhlmkqzppldugselkkchexww ; /usr/bin/python3
Nov 28 07:58:35 np0005538513.localdomain sudo[44806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:35 np0005538513.localdomain python3[44808]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:35 np0005538513.localdomain sudo[44806]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:36 np0005538513.localdomain sudo[44822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzpyntgssogboqnsctzleahvvlamegjj ; /usr/bin/python3
Nov 28 07:58:36 np0005538513.localdomain sudo[44822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:36 np0005538513.localdomain python3[44824]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:36 np0005538513.localdomain sudo[44822]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:36 np0005538513.localdomain sudo[44838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-examismleiotlrfpdweyyrtmfxzevvtv ; /usr/bin/python3
Nov 28 07:58:36 np0005538513.localdomain sudo[44838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:37 np0005538513.localdomain python3[44840]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:58:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:58:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Cumulative writes: 3383 writes, 16K keys, 3383 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3383 writes, 195 syncs, 17.35 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3383 writes, 16K keys, 3383 commit groups, 1.0 writes per commit group, ingest: 15.26 MB, 0.03 MB/s
                                                          Interval WAL: 3383 writes, 195 syncs, 17.35 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:58:39 np0005538513.localdomain sudo[44838]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:40 np0005538513.localdomain sudo[44887]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqpfaqotddnqxjihzllaepjcuqqktdez ; /usr/bin/python3
Nov 28 07:58:40 np0005538513.localdomain sudo[44887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:40 np0005538513.localdomain python3[44889]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:40 np0005538513.localdomain sudo[44887]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:40 np0005538513.localdomain sudo[44932]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjhrytbmcnkmjqtrgqxukrlddrgiufek ; /usr/bin/python3
Nov 28 07:58:40 np0005538513.localdomain sudo[44932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:41 np0005538513.localdomain python3[44934]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316720.2912393-78342-246714240581004/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:41 np0005538513.localdomain sudo[44932]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:41 np0005538513.localdomain sudo[44963]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kanaiycrcwrsvchbztxbwrudvykauxai ; /usr/bin/python3
Nov 28 07:58:41 np0005538513.localdomain sudo[44963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:41 np0005538513.localdomain python3[44965]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:58:41 np0005538513.localdomain sshd[1129]: Received signal 15; terminating.
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: sshd.service: Consumed 4.460s CPU time, read 2.1M from disk, written 40.0K to disk.
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 07:58:41 np0005538513.localdomain sshd[44969]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:58:41 np0005538513.localdomain sshd[44969]: Server listening on 0.0.0.0 port 22.
Nov 28 07:58:41 np0005538513.localdomain sshd[44969]: Server listening on :: port 22.
Nov 28 07:58:41 np0005538513.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 07:58:41 np0005538513.localdomain sudo[44963]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:41 np0005538513.localdomain sudo[44983]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iepqwbsmvghndgfholfxapsfjznbqnbi ; /usr/bin/python3
Nov 28 07:58:41 np0005538513.localdomain sudo[44983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:42 np0005538513.localdomain python3[44985]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:42 np0005538513.localdomain sudo[44983]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:43 np0005538513.localdomain sudo[45001]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jevzyfzvynoviqjkodlbieczzcoyobrb ; /usr/bin/python3
Nov 28 07:58:43 np0005538513.localdomain sudo[45001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:43 np0005538513.localdomain python3[45003]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:43 np0005538513.localdomain sudo[45001]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:43 np0005538513.localdomain sudo[45019]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjvagznfigmpvaoqbamrsmjfrlemcnmy ; /usr/bin/python3
Nov 28 07:58:43 np0005538513.localdomain sudo[45019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:44 np0005538513.localdomain python3[45021]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:58:46 np0005538513.localdomain sudo[45019]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:47 np0005538513.localdomain sudo[45068]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cekrbjgmmrxcxtswumkoqwowbtlgqfca ; /usr/bin/python3
Nov 28 07:58:47 np0005538513.localdomain sudo[45068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:47 np0005538513.localdomain python3[45070]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:47 np0005538513.localdomain sudo[45068]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:47 np0005538513.localdomain sudo[45086]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlgsyqucfdhavfzbohvontxoavxkurpg ; /usr/bin/python3
Nov 28 07:58:47 np0005538513.localdomain sudo[45086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:47 np0005538513.localdomain python3[45088]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:47 np0005538513.localdomain sudo[45086]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:48 np0005538513.localdomain sudo[45116]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roxpelcmnpyjylllvkclmyttkiqqrfzn ; /usr/bin/python3
Nov 28 07:58:48 np0005538513.localdomain sudo[45116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:48 np0005538513.localdomain python3[45118]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:58:48 np0005538513.localdomain sudo[45116]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:49 np0005538513.localdomain sudo[45166]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcwsyamhupalozprspjbcczxchsyuqxa ; /usr/bin/python3
Nov 28 07:58:49 np0005538513.localdomain sudo[45166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:49 np0005538513.localdomain python3[45168]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:49 np0005538513.localdomain sudo[45166]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:49 np0005538513.localdomain sudo[45184]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vetabqqvkbzwzbvjjozqelmdvllfmqut ; /usr/bin/python3
Nov 28 07:58:49 np0005538513.localdomain sudo[45184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:49 np0005538513.localdomain python3[45186]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:49 np0005538513.localdomain sudo[45184]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:49 np0005538513.localdomain sudo[45214]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfpdmsxvfnuqvufmspsbyyypoigzoiuh ; /usr/bin/python3
Nov 28 07:58:49 np0005538513.localdomain sudo[45214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:50 np0005538513.localdomain python3[45216]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:58:50 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:58:50 np0005538513.localdomain systemd-sysv-generator[45246]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:58:50 np0005538513.localdomain systemd-rc-local-generator[45243]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:58:50 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:58:50 np0005538513.localdomain systemd[1]: Starting chronyd online sources service...
Nov 28 07:58:50 np0005538513.localdomain chronyc[45255]: 200 OK
Nov 28 07:58:50 np0005538513.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Nov 28 07:58:50 np0005538513.localdomain systemd[1]: Finished chronyd online sources service.
Nov 28 07:58:50 np0005538513.localdomain sudo[45214]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:50 np0005538513.localdomain sudo[45269]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkxpvppjonavjfjuqijfiioqrwnrfirc ; /usr/bin/python3
Nov 28 07:58:50 np0005538513.localdomain sudo[45269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:50 np0005538513.localdomain python3[45271]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:50 np0005538513.localdomain chronyd[26085]: System clock was stepped by -0.000194 seconds
Nov 28 07:58:50 np0005538513.localdomain sudo[45269]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:51 np0005538513.localdomain sudo[45286]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypchtinsyqtgjwknoqazevxaonguhjqw ; /usr/bin/python3
Nov 28 07:58:51 np0005538513.localdomain sudo[45286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:51 np0005538513.localdomain python3[45288]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:51 np0005538513.localdomain sudo[45286]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:51 np0005538513.localdomain sudo[45303]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idtjzgdrsdnjkogwvuxqqvnsqwivrwgj ; /usr/bin/python3
Nov 28 07:58:51 np0005538513.localdomain sudo[45303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:51 np0005538513.localdomain python3[45305]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:51 np0005538513.localdomain chronyd[26085]: System clock was stepped by 0.000000 seconds
Nov 28 07:58:51 np0005538513.localdomain sudo[45303]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:51 np0005538513.localdomain sudo[45320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhttzlpigifxipfsiazunvvchjmefzdw ; /usr/bin/python3
Nov 28 07:58:51 np0005538513.localdomain sudo[45320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:51 np0005538513.localdomain python3[45322]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:51 np0005538513.localdomain sudo[45320]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:52 np0005538513.localdomain sudo[45337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emotnalfnrhwqeysvugilzrntcknmqjw ; /usr/bin/python3
Nov 28 07:58:52 np0005538513.localdomain sudo[45337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:52 np0005538513.localdomain python3[45339]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 28 07:58:52 np0005538513.localdomain systemd[1]: Starting Time & Date Service...
Nov 28 07:58:52 np0005538513.localdomain systemd[1]: Started Time & Date Service.
Nov 28 07:58:52 np0005538513.localdomain sudo[45337]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:53 np0005538513.localdomain sudo[45357]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olzunygiiyusdupdjfmffilcaxmbholj ; /usr/bin/python3
Nov 28 07:58:53 np0005538513.localdomain sudo[45357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:53 np0005538513.localdomain python3[45359]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:53 np0005538513.localdomain sudo[45357]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:53 np0005538513.localdomain sudo[45374]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grvsqwanpxtwhlzimcgnjmlxqlgyimay ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 28 07:58:53 np0005538513.localdomain sudo[45374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:54 np0005538513.localdomain python3[45376]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:54 np0005538513.localdomain sudo[45374]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:54 np0005538513.localdomain sudo[45391]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfqqnivepsckdkhcajwisrbvvjuxppsw ; /usr/bin/python3
Nov 28 07:58:54 np0005538513.localdomain sudo[45391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:54 np0005538513.localdomain python3[45393]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 28 07:58:54 np0005538513.localdomain sudo[45391]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:55 np0005538513.localdomain sudo[45407]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alqiiwucmcwubjstsfkzvwildngsdpei ; /usr/bin/python3
Nov 28 07:58:55 np0005538513.localdomain sudo[45407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:55 np0005538513.localdomain python3[45409]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:58:55 np0005538513.localdomain sudo[45407]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:55 np0005538513.localdomain sudo[45423]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myqakzubizntbsdbauinyrdsodkapycy ; /usr/bin/python3
Nov 28 07:58:55 np0005538513.localdomain sudo[45423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:55 np0005538513.localdomain python3[45425]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:55 np0005538513.localdomain sudo[45423]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:55 np0005538513.localdomain sudo[45439]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlgrkxlqxejdepoyezzcmvmqbzvfucnw ; /usr/bin/python3
Nov 28 07:58:55 np0005538513.localdomain sudo[45439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:56 np0005538513.localdomain python3[45441]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:56 np0005538513.localdomain sudo[45439]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:56 np0005538513.localdomain sudo[45487]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzfjtsuccizvlvwkbsyfprnhpcpefclr ; /usr/bin/python3
Nov 28 07:58:56 np0005538513.localdomain sudo[45487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:56 np0005538513.localdomain python3[45489]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:56 np0005538513.localdomain sudo[45487]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:56 np0005538513.localdomain sudo[45530]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvcjsjlpvwnjsuglzympemoegkzaffbx ; /usr/bin/python3
Nov 28 07:58:56 np0005538513.localdomain sudo[45530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:56 np0005538513.localdomain python3[45532]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316736.2150152-79292-101910019104249/source _original_basename=tmpdsa2me3l follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:56 np0005538513.localdomain sudo[45530]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:57 np0005538513.localdomain sudo[45592]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgxvuzrhcxicsbewejguyjoulisuedac ; /usr/bin/python3
Nov 28 07:58:57 np0005538513.localdomain sudo[45592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:57 np0005538513.localdomain python3[45594]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:57 np0005538513.localdomain sudo[45592]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:57 np0005538513.localdomain sudo[45635]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxceeiftkbiclroftmauiwqpssygpsvk ; /usr/bin/python3
Nov 28 07:58:57 np0005538513.localdomain sudo[45635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:57 np0005538513.localdomain python3[45637]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316737.080491-79394-149285758843359/source _original_basename=tmp34357v48 follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:57 np0005538513.localdomain sudo[45635]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:57 np0005538513.localdomain sudo[45665]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbsroivggdiuxjuugmghjmbebtobnjnx ; /usr/bin/python3
Nov 28 07:58:57 np0005538513.localdomain sudo[45665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:58 np0005538513.localdomain python3[45667]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 07:58:58 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:58:58 np0005538513.localdomain systemd-sysv-generator[45695]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:58:58 np0005538513.localdomain systemd-rc-local-generator[45692]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:58:58 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:58:58 np0005538513.localdomain sudo[45665]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:58 np0005538513.localdomain sudo[45719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tribulavtvvfmktmhkbhnjaofopdvowk ; /usr/bin/python3
Nov 28 07:58:58 np0005538513.localdomain sudo[45719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:58 np0005538513.localdomain python3[45721]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:58 np0005538513.localdomain sudo[45719]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:59 np0005538513.localdomain sudo[45735]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqvhrtnwvzswtgthfidghqcuibthwexw ; /usr/bin/python3
Nov 28 07:58:59 np0005538513.localdomain sudo[45735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:59 np0005538513.localdomain python3[45737]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:59 np0005538513.localdomain systemd[35694]: Created slice User Background Tasks Slice.
Nov 28 07:58:59 np0005538513.localdomain systemd[35694]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 07:58:59 np0005538513.localdomain sudo[45735]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:59 np0005538513.localdomain systemd[35694]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 07:58:59 np0005538513.localdomain sudo[45753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fefopeolzghwmfgvhezdlpxrxxvtteym ; /usr/bin/python3
Nov 28 07:58:59 np0005538513.localdomain sudo[45753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:59 np0005538513.localdomain python3[45755]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:59 np0005538513.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Nov 28 07:58:59 np0005538513.localdomain sudo[45753]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:59 np0005538513.localdomain sudo[45770]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbwkesrlgrizrvemaqtftwarpkgykwcv ; /usr/bin/python3
Nov 28 07:58:59 np0005538513.localdomain sudo[45770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:59 np0005538513.localdomain python3[45772]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:59 np0005538513.localdomain sudo[45770]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:00 np0005538513.localdomain sudo[45786]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nahtzespluvvyzugkpbxkvnfcnvdqfed ; /usr/bin/python3
Nov 28 07:59:00 np0005538513.localdomain sudo[45786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:00 np0005538513.localdomain python3[45788]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:00 np0005538513.localdomain sudo[45786]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:00 np0005538513.localdomain sudo[45834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cneqtmiseoyifiiriflrqoqvwnpolvyt ; /usr/bin/python3
Nov 28 07:59:00 np0005538513.localdomain sudo[45834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:00 np0005538513.localdomain python3[45836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:59:00 np0005538513.localdomain sudo[45834]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:00 np0005538513.localdomain sudo[45877]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wflboirkufynuckxwzsqjulebssnkjxm ; /usr/bin/python3
Nov 28 07:59:00 np0005538513.localdomain sudo[45877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:01 np0005538513.localdomain python3[45879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316740.4072773-79651-244303052039741/source _original_basename=tmpwzas0rzl follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:01 np0005538513.localdomain sudo[45877]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:13 np0005538513.localdomain sudo[45894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:59:13 np0005538513.localdomain sudo[45894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:59:13 np0005538513.localdomain sudo[45894]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:13 np0005538513.localdomain sudo[45909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:59:13 np0005538513.localdomain sudo[45909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:59:14 np0005538513.localdomain sudo[45909]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:15 np0005538513.localdomain sudo[45956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:59:15 np0005538513.localdomain sudo[45956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:59:15 np0005538513.localdomain sudo[45956]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:22 np0005538513.localdomain sudo[45984]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjgmotfdqqlmhwiollvrkclarfwhvcmn ; /usr/bin/python3
Nov 28 07:59:22 np0005538513.localdomain sudo[45984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:22 np0005538513.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 07:59:22 np0005538513.localdomain python3[45986]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:22 np0005538513.localdomain sudo[45984]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:23 np0005538513.localdomain sudo[46002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niaptwcucsobcsxineaqzzhvtmscrtuq ; /usr/bin/python3
Nov 28 07:59:23 np0005538513.localdomain sudo[46002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:23 np0005538513.localdomain python3[46004]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Nov 28 07:59:23 np0005538513.localdomain sudo[46002]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:23 np0005538513.localdomain sudo[46018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpspxgtkhclxibxsocmiimsllzzripoc ; /usr/bin/python3
Nov 28 07:59:23 np0005538513.localdomain sudo[46018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:23 np0005538513.localdomain python3[46020]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:23 np0005538513.localdomain sudo[46018]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:23 np0005538513.localdomain sudo[46034]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwhgyxdmndrsrpkqmiohkgvrdwjvewtc ; /usr/bin/python3
Nov 28 07:59:23 np0005538513.localdomain sudo[46034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:23 np0005538513.localdomain python3[46036]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:23 np0005538513.localdomain sudo[46034]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:24 np0005538513.localdomain sudo[46050]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlaqgaeftmxsqyidgdwinjekoiadoaig ; /usr/bin/python3
Nov 28 07:59:24 np0005538513.localdomain sudo[46050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:24 np0005538513.localdomain python3[46052]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:24 np0005538513.localdomain sudo[46050]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:24 np0005538513.localdomain sudo[46066]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kobzfmgosvkpraokdzxmcampfwtaykgv ; /usr/bin/python3
Nov 28 07:59:24 np0005538513.localdomain sudo[46066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:24 np0005538513.localdomain python3[46068]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:59:25 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:59:25 np0005538513.localdomain sudo[46066]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:25 np0005538513.localdomain sudo[46087]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anzkfpeuyknnurkqxzodgusrjmnoedju ; /usr/bin/python3
Nov 28 07:59:25 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 28 07:59:25 np0005538513.localdomain sudo[46087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:26 np0005538513.localdomain python3[46089]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:59:26 np0005538513.localdomain sudo[46087]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:26 np0005538513.localdomain sudo[46103]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaaqexzmnciawhtielidyutmrzkkgbtk ; /usr/bin/python3
Nov 28 07:59:26 np0005538513.localdomain sudo[46103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:26 np0005538513.localdomain sudo[46103]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:26 np0005538513.localdomain sudo[46151]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojakotyzvxohdgwkaqcpzfaxjmswxtji ; /usr/bin/python3
Nov 28 07:59:26 np0005538513.localdomain sudo[46151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:27 np0005538513.localdomain sudo[46151]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:27 np0005538513.localdomain sudo[46194]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptmwtwupnioxqnpiekvkgpropjaknimq ; /usr/bin/python3
Nov 28 07:59:27 np0005538513.localdomain sudo[46194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:27 np0005538513.localdomain sudo[46194]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:27 np0005538513.localdomain sudo[46224]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnraqwzoxmarhadnwmyghanfqxcfgvfn ; /usr/bin/python3
Nov 28 07:59:27 np0005538513.localdomain sudo[46224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:27 np0005538513.localdomain python3[46226]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Nov 28 07:59:27 np0005538513.localdomain sudo[46224]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:27 np0005538513.localdomain rsyslogd[759]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Nov 28 07:59:28 np0005538513.localdomain sudo[46240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsnszjbzuyhhqcussxcfmnufwkscoxwh ; /usr/bin/python3
Nov 28 07:59:28 np0005538513.localdomain sudo[46240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:28 np0005538513.localdomain python3[46242]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:28 np0005538513.localdomain sudo[46240]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:28 np0005538513.localdomain sudo[46256]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyvwienuhmrdogskdltpctyuenckhxza ; /usr/bin/python3
Nov 28 07:59:28 np0005538513.localdomain sudo[46256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:29 np0005538513.localdomain python3[46258]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:29 np0005538513.localdomain sudo[46256]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:29 np0005538513.localdomain sudo[46272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhupydguldxnvsounehtqwehjhtuzkvh ; /usr/bin/python3
Nov 28 07:59:29 np0005538513.localdomain sudo[46272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:29 np0005538513.localdomain python3[46274]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Nov 28 07:59:29 np0005538513.localdomain sudo[46272]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:34 np0005538513.localdomain sudo[46320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bywpldgahsigadnxnjdbkhytscklikkc ; /usr/bin/python3
Nov 28 07:59:34 np0005538513.localdomain sudo[46320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:35 np0005538513.localdomain python3[46322]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:59:35 np0005538513.localdomain sudo[46320]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:35 np0005538513.localdomain sudo[46363]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgpkuikoajkhhexhbsnxjdykqvvwrbvz ; /usr/bin/python3
Nov 28 07:59:35 np0005538513.localdomain sudo[46363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:35 np0005538513.localdomain python3[46365]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316774.8521078-81135-25730311528161/source _original_basename=tmp789d6unz follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:35 np0005538513.localdomain sudo[46363]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:35 np0005538513.localdomain sudo[46393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shaoxnxlfuuxjqfovakriltcndllsqpc ; /usr/bin/python3
Nov 28 07:59:35 np0005538513.localdomain sudo[46393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:35 np0005538513.localdomain python3[46395]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:59:35 np0005538513.localdomain sudo[46393]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:36 np0005538513.localdomain sudo[46443]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuxtjacnjhrgyqjietqlwclrxdurayyy ; /usr/bin/python3
Nov 28 07:59:36 np0005538513.localdomain sudo[46443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:37 np0005538513.localdomain sudo[46443]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:37 np0005538513.localdomain sudo[46486]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gomrhjfjpwlsuelxotjwjjredjasgliy ; /usr/bin/python3
Nov 28 07:59:37 np0005538513.localdomain sudo[46486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:37 np0005538513.localdomain sudo[46486]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:37 np0005538513.localdomain sudo[46516]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iljzvyzjaxrpjvoihnpjjoytiohubkqd ; /usr/bin/python3
Nov 28 07:59:37 np0005538513.localdomain sudo[46516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:38 np0005538513.localdomain python3[46518]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:38 np0005538513.localdomain sudo[46516]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:38 np0005538513.localdomain sudo[46564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmnosbvufplarxbwhkojkmmmqfcjrxso ; /usr/bin/python3
Nov 28 07:59:38 np0005538513.localdomain sudo[46564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:38 np0005538513.localdomain sudo[46564]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:38 np0005538513.localdomain sudo[46607]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzskabnazdtkiwivvjvkwwhbwxwmxkvp ; /usr/bin/python3
Nov 28 07:59:38 np0005538513.localdomain sudo[46607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:39 np0005538513.localdomain sudo[46607]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:39 np0005538513.localdomain sudo[46637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjfkbkpozogvadboueyluqpqkzupkzif ; /usr/bin/python3
Nov 28 07:59:39 np0005538513.localdomain sudo[46637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:39 np0005538513.localdomain python3[46639]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 07:59:39 np0005538513.localdomain sudo[46637]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:41 np0005538513.localdomain sudo[46653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lycoskqzfwmxjjlvnfepbdkscgrwpoyk ; /usr/bin/python3
Nov 28 07:59:41 np0005538513.localdomain sudo[46653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:41 np0005538513.localdomain python3[46655]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:59:41 np0005538513.localdomain sudo[46653]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:42 np0005538513.localdomain sudo[46670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgblxovkutovaidgcvtdooxugjctaegg ; /usr/bin/python3
Nov 28 07:59:42 np0005538513.localdomain sudo[46670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:42 np0005538513.localdomain python3[46672]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:59:46 np0005538513.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:46 np0005538513.localdomain dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 07:59:46 np0005538513.localdomain dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 07:59:46 np0005538513.localdomain dbus-broker-launch[18433]: Noticed file-system modification, trigger reload.
Nov 28 07:59:46 np0005538513.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:46 np0005538513.localdomain systemd[1]: Reexecuting.
Nov 28 07:59:46 np0005538513.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 07:59:46 np0005538513.localdomain systemd[1]: Detected virtualization kvm.
Nov 28 07:59:46 np0005538513.localdomain systemd[1]: Detected architecture x86-64.
Nov 28 07:59:46 np0005538513.localdomain systemd-sysv-generator[46730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:59:46 np0005538513.localdomain systemd-rc-local-generator[46727]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:59:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:59:55 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:59:55 np0005538513.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:55 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 28 07:59:55 np0005538513.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:59:56 np0005538513.localdomain systemd-rc-local-generator[46817]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:59:56 np0005538513.localdomain systemd-sysv-generator[46823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:59:56 np0005538513.localdomain systemd-journald[618]: Journal stopped
Nov 28 07:59:56 np0005538513.localdomain systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Stopping Journal Service...
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Stopped Journal Service.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: systemd-journald.service: Consumed 1.883s CPU time.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Starting Journal Service...
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: systemd-udevd.service: Consumed 3.212s CPU time.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 07:59:56 np0005538513.localdomain systemd-journald[47227]: Journal started
Nov 28 07:59:56 np0005538513.localdomain systemd-journald[47227]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 12.2M, max 314.7M, 302.5M free.
Nov 28 07:59:56 np0005538513.localdomain systemd[1]: Started Journal Service.
Nov 28 07:59:56 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 28 07:59:56 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 07:59:56 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:59:57 np0005538513.localdomain systemd-udevd[47236]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 07:59:57 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 07:59:57 np0005538513.localdomain systemd-rc-local-generator[47847]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:59:57 np0005538513.localdomain systemd-sysv-generator[47850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.373s CPU time.
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: run-rcc4a7a8e9d8c412bb6980a53ecd801fb.service: Deactivated successfully.
Nov 28 07:59:57 np0005538513.localdomain systemd[1]: run-r5cab8d9fc4ab41ccbc6ba563f3390d5f.service: Deactivated successfully.
Nov 28 07:59:58 np0005538513.localdomain sudo[46670]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:59 np0005538513.localdomain sudo[48164]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shgzkvrxlyqicxhsbkdusfkgmnexlyoe ; /usr/bin/python3
Nov 28 07:59:59 np0005538513.localdomain sudo[48164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:59 np0005538513.localdomain python3[48166]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Nov 28 07:59:59 np0005538513.localdomain sudo[48164]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:59 np0005538513.localdomain sudo[48183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjjxskfljcgkuqakejcsirufeqslqrer ; /usr/bin/python3
Nov 28 07:59:59 np0005538513.localdomain sudo[48183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:59 np0005538513.localdomain python3[48185]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:59:59 np0005538513.localdomain sudo[48183]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:00 np0005538513.localdomain sudo[48201]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyrdwkpedyntnaozsylmdhobokcodqld ; /usr/bin/python3
Nov 28 08:00:00 np0005538513.localdomain sudo[48201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:00 np0005538513.localdomain python3[48203]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:00 np0005538513.localdomain python3[48203]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Nov 28 08:00:00 np0005538513.localdomain python3[48203]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Nov 28 08:00:09 np0005538513.localdomain podman[48217]: 2025-11-28 08:00:00.849810715 +0000 UTC m=+0.043416845 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:00:09 np0005538513.localdomain python3[48203]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Nov 28 08:00:09 np0005538513.localdomain sudo[48201]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:10 np0005538513.localdomain sudo[48316]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xljninycunfmoueelgkosoqpfuycbnhl ; /usr/bin/python3
Nov 28 08:00:10 np0005538513.localdomain sudo[48316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:10 np0005538513.localdomain python3[48318]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:10 np0005538513.localdomain python3[48318]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Nov 28 08:00:10 np0005538513.localdomain python3[48318]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Nov 28 08:00:16 np0005538513.localdomain sudo[48381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:00:16 np0005538513.localdomain sudo[48381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:16 np0005538513.localdomain sudo[48381]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:16 np0005538513.localdomain sudo[48396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:00:16 np0005538513.localdomain sudo[48396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:19 np0005538513.localdomain podman[48330]: 2025-11-28 08:00:10.354211851 +0000 UTC m=+0.046690909 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:00:19 np0005538513.localdomain python3[48318]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Nov 28 08:00:19 np0005538513.localdomain sudo[48316]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:19 np0005538513.localdomain sudo[48526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grmcgcskjlyezdyueeqxsvbffmybcpgo ; /usr/bin/python3
Nov 28 08:00:19 np0005538513.localdomain sudo[48526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:19 np0005538513.localdomain systemd[1]: tmp-crun.f9wZgD.mount: Deactivated successfully.
Nov 28 08:00:19 np0005538513.localdomain podman[48535]: 2025-11-28 08:00:19.98773249 +0000 UTC m=+0.108311899 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, ceph=True, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Nov 28 08:00:20 np0005538513.localdomain python3[48534]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:20 np0005538513.localdomain python3[48534]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Nov 28 08:00:20 np0005538513.localdomain python3[48534]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Nov 28 08:00:20 np0005538513.localdomain podman[48535]: 2025-11-28 08:00:20.125381285 +0000 UTC m=+0.245960704 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553)
Nov 28 08:00:20 np0005538513.localdomain sudo[48396]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:20 np0005538513.localdomain sudo[48624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:00:20 np0005538513.localdomain sudo[48624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:20 np0005538513.localdomain sudo[48624]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:20 np0005538513.localdomain sudo[48639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:00:20 np0005538513.localdomain sudo[48639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:21 np0005538513.localdomain sudo[48639]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:21 np0005538513.localdomain sudo[48710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:00:21 np0005538513.localdomain sudo[48710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:21 np0005538513.localdomain sudo[48710]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:37 np0005538513.localdomain podman[48565]: 2025-11-28 08:00:20.157460029 +0000 UTC m=+0.051076296 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:00:37 np0005538513.localdomain python3[48534]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Nov 28 08:00:38 np0005538513.localdomain sudo[48526]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:38 np0005538513.localdomain sudo[49288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmzduannqcfethqtfrjqzocsvtyblssl ; /usr/bin/python3
Nov 28 08:00:38 np0005538513.localdomain sudo[49288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:38 np0005538513.localdomain python3[49290]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:38 np0005538513.localdomain python3[49290]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Nov 28 08:00:38 np0005538513.localdomain python3[49290]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Nov 28 08:00:51 np0005538513.localdomain podman[49303]: 2025-11-28 08:00:38.477373753 +0000 UTC m=+0.042759614 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:00:51 np0005538513.localdomain python3[49290]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Nov 28 08:00:51 np0005538513.localdomain sudo[49288]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:51 np0005538513.localdomain sudo[49428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzjtikcfeqidheoayydafcxsjixvohvp ; /usr/bin/python3
Nov 28 08:00:51 np0005538513.localdomain sudo[49428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:51 np0005538513.localdomain python3[49430]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:51 np0005538513.localdomain python3[49430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Nov 28 08:00:51 np0005538513.localdomain python3[49430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Nov 28 08:01:01 np0005538513.localdomain CROND[49668]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 08:01:01 np0005538513.localdomain run-parts[49671]: (/etc/cron.hourly) starting 0anacron
Nov 28 08:01:01 np0005538513.localdomain run-parts[49677]: (/etc/cron.hourly) finished 0anacron
Nov 28 08:01:01 np0005538513.localdomain CROND[49667]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 08:01:01 np0005538513.localdomain podman[49444]: 2025-11-28 08:00:51.757801314 +0000 UTC m=+0.042974310 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:01:01 np0005538513.localdomain python3[49430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Nov 28 08:01:01 np0005538513.localdomain sudo[49428]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:01 np0005538513.localdomain sudo[49716]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzqtqrqnpuavphjkklhgzdzfpcfujawo ; /usr/bin/python3
Nov 28 08:01:01 np0005538513.localdomain sudo[49716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:02 np0005538513.localdomain python3[49718]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:02 np0005538513.localdomain python3[49718]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Nov 28 08:01:02 np0005538513.localdomain python3[49718]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Nov 28 08:01:06 np0005538513.localdomain podman[49732]: 2025-11-28 08:01:02.199534862 +0000 UTC m=+0.035767993 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:01:06 np0005538513.localdomain python3[49718]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Nov 28 08:01:06 np0005538513.localdomain sudo[49716]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:06 np0005538513.localdomain sudo[49808]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztoauaeeqkclsxrfvwufljcgxmjvtysl ; /usr/bin/python3
Nov 28 08:01:06 np0005538513.localdomain sudo[49808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:07 np0005538513.localdomain python3[49810]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:07 np0005538513.localdomain python3[49810]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Nov 28 08:01:07 np0005538513.localdomain python3[49810]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Nov 28 08:01:09 np0005538513.localdomain podman[49822]: 2025-11-28 08:01:07.259391475 +0000 UTC m=+0.045091124 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:01:09 np0005538513.localdomain python3[49810]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Nov 28 08:01:09 np0005538513.localdomain sudo[49808]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:09 np0005538513.localdomain sudo[49895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jduvwuytxmrxjzeyiynusdzmndophtab ; /usr/bin/python3
Nov 28 08:01:09 np0005538513.localdomain sudo[49895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:10 np0005538513.localdomain python3[49897]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:10 np0005538513.localdomain python3[49897]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Nov 28 08:01:10 np0005538513.localdomain python3[49897]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Nov 28 08:01:12 np0005538513.localdomain podman[49910]: 2025-11-28 08:01:10.179442684 +0000 UTC m=+0.044354152 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:01:12 np0005538513.localdomain python3[49897]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Nov 28 08:01:12 np0005538513.localdomain sudo[49895]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:12 np0005538513.localdomain sudo[49987]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqyrnqigmeifzippovmwbxflctxvehmh ; /usr/bin/python3
Nov 28 08:01:12 np0005538513.localdomain sudo[49987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:12 np0005538513.localdomain python3[49989]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:12 np0005538513.localdomain python3[49989]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Nov 28 08:01:12 np0005538513.localdomain python3[49989]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Nov 28 08:01:15 np0005538513.localdomain podman[50002]: 2025-11-28 08:01:12.696980587 +0000 UTC m=+0.030486017 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 28 08:01:15 np0005538513.localdomain python3[49989]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Nov 28 08:01:15 np0005538513.localdomain sudo[49987]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:15 np0005538513.localdomain sudo[50077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxcoccsfdxsoxblkugifmwuvkznlwndh ; /usr/bin/python3
Nov 28 08:01:15 np0005538513.localdomain sudo[50077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:15 np0005538513.localdomain python3[50079]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:15 np0005538513.localdomain python3[50079]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Nov 28 08:01:15 np0005538513.localdomain python3[50079]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Nov 28 08:01:19 np0005538513.localdomain podman[50092]: 2025-11-28 08:01:15.840400036 +0000 UTC m=+0.045283170 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:01:19 np0005538513.localdomain python3[50079]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Nov 28 08:01:19 np0005538513.localdomain sudo[50077]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:19 np0005538513.localdomain sudo[50179]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npjhydzwrismjxpdnjmasxzgcfmlokbv ; /usr/bin/python3
Nov 28 08:01:19 np0005538513.localdomain sudo[50179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:20 np0005538513.localdomain python3[50181]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:20 np0005538513.localdomain python3[50181]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Nov 28 08:01:20 np0005538513.localdomain python3[50181]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Nov 28 08:01:21 np0005538513.localdomain sudo[50206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:01:21 np0005538513.localdomain sudo[50206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:01:21 np0005538513.localdomain sudo[50206]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:21 np0005538513.localdomain sudo[50221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:01:21 np0005538513.localdomain sudo[50221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:01:22 np0005538513.localdomain sudo[50221]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:23 np0005538513.localdomain podman[50193]: 2025-11-28 08:01:20.237374965 +0000 UTC m=+0.044319541 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:01:23 np0005538513.localdomain python3[50181]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Nov 28 08:01:23 np0005538513.localdomain sudo[50179]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:23 np0005538513.localdomain sudo[50319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:01:23 np0005538513.localdomain sudo[50319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:01:23 np0005538513.localdomain sudo[50319]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:23 np0005538513.localdomain sudo[50347]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhgztufeeidbaiofsryefmeaocuxtiyq ; /usr/bin/python3
Nov 28 08:01:23 np0005538513.localdomain sudo[50347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:23 np0005538513.localdomain python3[50349]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:01:23 np0005538513.localdomain sudo[50347]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:24 np0005538513.localdomain sudo[50397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srpkzxvyzvuohjwdpuwcjwhecfgrwxpe ; /usr/bin/python3
Nov 28 08:01:24 np0005538513.localdomain sudo[50397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:24 np0005538513.localdomain sudo[50397]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:24 np0005538513.localdomain sudo[50415]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eypfbujljbpjjfradzkocowjqbiwdcac ; /usr/bin/python3
Nov 28 08:01:24 np0005538513.localdomain sudo[50415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:24 np0005538513.localdomain sudo[50415]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:25 np0005538513.localdomain sudo[50519]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spffwpafblhspxypsswqvnafdidotwck ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316884.9235888-83883-161618226582021/async_wrapper.py 642356236050 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316884.9235888-83883-161618226582021/AnsiballZ_command.py _
Nov 28 08:01:25 np0005538513.localdomain sudo[50519]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:01:25 np0005538513.localdomain ansible-async_wrapper.py[50521]: Invoked with 642356236050 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316884.9235888-83883-161618226582021/AnsiballZ_command.py _
Nov 28 08:01:25 np0005538513.localdomain ansible-async_wrapper.py[50524]: Starting module and watcher
Nov 28 08:01:25 np0005538513.localdomain ansible-async_wrapper.py[50524]: Start watching 50525 (3600)
Nov 28 08:01:25 np0005538513.localdomain ansible-async_wrapper.py[50525]: Start module (50525)
Nov 28 08:01:25 np0005538513.localdomain ansible-async_wrapper.py[50521]: Return async_wrapper task started.
Nov 28 08:01:25 np0005538513.localdomain sudo[50519]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:26 np0005538513.localdomain sudo[50543]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwqmmxihxwdszhmroyvztictotjlaukj ; /usr/bin/python3
Nov 28 08:01:26 np0005538513.localdomain sudo[50543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:26 np0005538513.localdomain python3[50545]: ansible-ansible.legacy.async_status Invoked with jid=642356236050.50521 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:01:26 np0005538513.localdomain sudo[50543]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:29 np0005538513.localdomain puppet-user[50529]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:29 np0005538513.localdomain puppet-user[50529]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:29 np0005538513.localdomain puppet-user[50529]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:29 np0005538513.localdomain puppet-user[50529]:    (file & line not available)
Nov 28 08:01:29 np0005538513.localdomain puppet-user[50529]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:29 np0005538513.localdomain puppet-user[50529]:    (file & line not available)
Nov 28 08:01:29 np0005538513.localdomain puppet-user[50529]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.12 seconds
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Notice: Applied catalog in 0.05 seconds
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Application:
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:    Initial environment: production
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:    Converged environment: production
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:          Run mode: user
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Changes:
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:             Total: 3
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Events:
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:           Success: 3
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:             Total: 3
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Resources:
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:           Changed: 3
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:       Out of sync: 3
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:             Total: 10
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Time:
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:          Schedule: 0.00
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:              File: 0.00
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:              Exec: 0.02
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:            Augeas: 0.02
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:    Transaction evaluation: 0.05
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:    Catalog application: 0.05
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:    Config retrieval: 0.15
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:          Last run: 1764316890
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:        Filebucket: 0.00
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:             Total: 0.05
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]: Version:
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:            Config: 1764316889
Nov 28 08:01:30 np0005538513.localdomain puppet-user[50529]:            Puppet: 7.10.0
Nov 28 08:01:30 np0005538513.localdomain ansible-async_wrapper.py[50525]: Module complete (50525)
Nov 28 08:01:30 np0005538513.localdomain ansible-async_wrapper.py[50524]: Done in kid B.
Nov 28 08:01:36 np0005538513.localdomain sudo[50670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfeaslyhjkfdelniaivnpslmchpjxyuh ; /usr/bin/python3
Nov 28 08:01:36 np0005538513.localdomain sudo[50670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:36 np0005538513.localdomain python3[50672]: ansible-ansible.legacy.async_status Invoked with jid=642356236050.50521 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:01:36 np0005538513.localdomain sudo[50670]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:37 np0005538513.localdomain sudo[50686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbrpkaoboiqdblukrnkotghosgtwgbyr ; /usr/bin/python3
Nov 28 08:01:37 np0005538513.localdomain sudo[50686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:37 np0005538513.localdomain python3[50688]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:01:37 np0005538513.localdomain sudo[50686]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:37 np0005538513.localdomain sudo[50702]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evjphyqjnamlbehoxzqxphsmnmipaiel ; /usr/bin/python3
Nov 28 08:01:37 np0005538513.localdomain sudo[50702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:37 np0005538513.localdomain python3[50704]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:01:38 np0005538513.localdomain sudo[50702]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:38 np0005538513.localdomain sudo[50750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxwvtpigkbxjkfkzfpooxjskzpcsweyo ; /usr/bin/python3
Nov 28 08:01:38 np0005538513.localdomain sudo[50750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:38 np0005538513.localdomain python3[50752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:01:38 np0005538513.localdomain sudo[50750]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:38 np0005538513.localdomain sudo[50793]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozlbwymioutgrrmhglnzrjzcjcbgdivl ; /usr/bin/python3
Nov 28 08:01:38 np0005538513.localdomain sudo[50793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:38 np0005538513.localdomain python3[50795]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316898.162096-84332-52933245925340/source _original_basename=tmp3tjighwz follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:01:38 np0005538513.localdomain sudo[50793]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:39 np0005538513.localdomain sudo[50823]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsjtxialthgwcsufnhgmpkwoakwhmtds ; /usr/bin/python3
Nov 28 08:01:39 np0005538513.localdomain sudo[50823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:39 np0005538513.localdomain python3[50825]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:01:39 np0005538513.localdomain sudo[50823]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:39 np0005538513.localdomain sudo[50839]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjoutgcpdclqcxhjgxiqzkdlgnjxlrdq ; /usr/bin/python3
Nov 28 08:01:39 np0005538513.localdomain sudo[50839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:39 np0005538513.localdomain sudo[50839]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:40 np0005538513.localdomain sudo[50927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzuhapsfbsrknodgrqnekcabnychciog ; /usr/bin/python3
Nov 28 08:01:40 np0005538513.localdomain sudo[50927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:40 np0005538513.localdomain python3[50929]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:01:40 np0005538513.localdomain sudo[50927]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:40 np0005538513.localdomain sudo[50946]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuuhmeulsmhzfmzergaynrbxpfrogxtv ; /usr/bin/python3
Nov 28 08:01:40 np0005538513.localdomain sudo[50946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:40 np0005538513.localdomain python3[50948]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 08:01:40 np0005538513.localdomain sudo[50946]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:41 np0005538513.localdomain sudo[50962]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azuciliprmxwqrslqjtanmirfetmerum ; /usr/bin/python3
Nov 28 08:01:41 np0005538513.localdomain sudo[50962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:41 np0005538513.localdomain python3[50964]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005538513 step=1 update_config_hash_only=False
Nov 28 08:01:41 np0005538513.localdomain sudo[50962]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:42 np0005538513.localdomain sudo[50978]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cggnbgbyzuulthrjaqeojghqsiycgbit ; /usr/bin/python3
Nov 28 08:01:42 np0005538513.localdomain sudo[50978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:42 np0005538513.localdomain python3[50980]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:01:42 np0005538513.localdomain sudo[50978]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:42 np0005538513.localdomain sudo[50994]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbxegidqlpgjgyoldujfavixdqxhrscf ; /usr/bin/python3
Nov 28 08:01:42 np0005538513.localdomain sudo[50994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:43 np0005538513.localdomain python3[50996]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:01:43 np0005538513.localdomain sudo[50994]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:43 np0005538513.localdomain sudo[51010]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpwhlzreeuvgxpaujyxqmrsgpbehonea ; /usr/bin/python3
Nov 28 08:01:43 np0005538513.localdomain sudo[51010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:43 np0005538513.localdomain python3[51012]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 08:01:44 np0005538513.localdomain sudo[51010]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:44 np0005538513.localdomain sudo[51052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcnfnnrrgkhchvehqbifhbbmsvjrsfsj ; /usr/bin/python3
Nov 28 08:01:44 np0005538513.localdomain sudo[51052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:44 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:01:45 np0005538513.localdomain podman[51209]: 2025-11-28 08:01:45.219163059 +0000 UTC m=+0.090019322 container create a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, container_name=container-puppet-collectd)
Nov 28 08:01:45 np0005538513.localdomain systemd[1]: Started libpod-conmon-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope.
Nov 28 08:01:45 np0005538513.localdomain podman[51227]: 2025-11-28 08:01:45.251684099 +0000 UTC m=+0.101702399 container create 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-nova_libvirt, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com)
Nov 28 08:01:45 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:45 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:45 np0005538513.localdomain podman[51209]: 2025-11-28 08:01:45.169411579 +0000 UTC m=+0.040267852 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:01:45 np0005538513.localdomain podman[51209]: 2025-11-28 08:01:45.27596436 +0000 UTC m=+0.146820613 container init a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git)
Nov 28 08:01:45 np0005538513.localdomain podman[51209]: 2025-11-28 08:01:45.288352988 +0000 UTC m=+0.159209261 container start a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_id=tripleo_puppet_step1, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=container-puppet-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 08:01:45 np0005538513.localdomain podman[51209]: 2025-11-28 08:01:45.290355151 +0000 UTC m=+0.161211414 container attach a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=)
Nov 28 08:01:45 np0005538513.localdomain systemd[1]: Started libpod-conmon-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope.
Nov 28 08:01:45 np0005538513.localdomain podman[51227]: 2025-11-28 08:01:45.204983884 +0000 UTC m=+0.055002194 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:01:45 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:45 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8fe5cc4372bb33f59044e24a5715f4f503612636c3ff0d7cebe06eec67fcb4f/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:45 np0005538513.localdomain podman[51290]: 2025-11-28 08:01:45.288143731 +0000 UTC m=+0.047169189 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:01:45 np0005538513.localdomain podman[51253]: 2025-11-28 08:01:45.319006049 +0000 UTC m=+0.138972318 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:01:45 np0005538513.localdomain podman[51251]: 2025-11-28 08:01:45.321232289 +0000 UTC m=+0.143359685 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:01:46 np0005538513.localdomain podman[51227]: 2025-11-28 08:01:46.377650888 +0000 UTC m=+1.227669208 container init 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, version=17.1.12, container_name=container-puppet-nova_libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git)
Nov 28 08:01:46 np0005538513.localdomain systemd[1]: tmp-crun.B2eBHk.mount: Deactivated successfully.
Nov 28 08:01:46 np0005538513.localdomain podman[51290]: 2025-11-28 08:01:46.388097066 +0000 UTC m=+1.147122534 container create 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:01:46 np0005538513.localdomain podman[51227]: 2025-11-28 08:01:46.46612441 +0000 UTC m=+1.316142700 container start 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude 
tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, container_name=container-puppet-nova_libvirt, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:01:46 np0005538513.localdomain podman[51227]: 2025-11-28 08:01:46.468863776 +0000 UTC m=+1.318882126 container attach 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude 
tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, version=17.1.12)
Nov 28 08:01:46 np0005538513.localdomain podman[51253]: 2025-11-28 08:01:46.481982718 +0000 UTC m=+1.301948947 container create d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 08:01:46 np0005538513.localdomain podman[51251]: 2025-11-28 08:01:46.5005429 +0000 UTC m=+1.322670256 container create 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=container-puppet-crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron)
Nov 28 08:01:46 np0005538513.localdomain systemd[1]: Started libpod-conmon-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope.
Nov 28 08:01:46 np0005538513.localdomain systemd[1]: Started libpod-conmon-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope.
Nov 28 08:01:46 np0005538513.localdomain systemd[1]: Started libpod-conmon-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope.
Nov 28 08:01:46 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:46 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29b4c40b440520fe09ba18ff760c1e8bcaccec35e4300eedc14527806169da1c/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:46 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:46 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:46 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:46 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:46 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa2c8f8913bff67439dab4aeb6db6e2056909e99c44d0de88f7837d07d6c8847/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:46 np0005538513.localdomain podman[51253]: 2025-11-28 08:01:46.558438254 +0000 UTC m=+1.378404513 container init d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_puppet_step1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4)
Nov 28 08:01:46 np0005538513.localdomain podman[51251]: 2025-11-28 08:01:46.567349954 +0000 UTC m=+1.389477340 container init 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 28 08:01:46 np0005538513.localdomain podman[51251]: 2025-11-28 08:01:46.579390731 +0000 UTC m=+1.401518117 container start 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, version=17.1.12, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Nov 28 08:01:46 np0005538513.localdomain podman[51251]: 2025-11-28 08:01:46.579837855 +0000 UTC m=+1.401965221 container attach 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 28 08:01:46 np0005538513.localdomain podman[51290]: 2025-11-28 08:01:46.613286963 +0000 UTC m=+1.372312411 container init 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, container_name=container-puppet-iscsid, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:01:46 np0005538513.localdomain podman[51253]: 2025-11-28 08:01:46.620879342 +0000 UTC m=+1.440845611 container start d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, container_name=container-puppet-metrics_qdr, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Nov 28 08:01:46 np0005538513.localdomain podman[51253]: 2025-11-28 08:01:46.621240383 +0000 UTC m=+1.441206652 container attach d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com)
Nov 28 08:01:46 np0005538513.localdomain podman[51290]: 2025-11-28 08:01:46.672761467 +0000 UTC m=+1.431786945 container start 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=container-puppet-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:01:46 np0005538513.localdomain podman[51290]: 2025-11-28 08:01:46.673230972 +0000 UTC m=+1.432256430 container attach 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=container-puppet-iscsid, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:01:47 np0005538513.localdomain podman[51143]: 2025-11-28 08:01:45.078655165 +0000 UTC m=+0.038459676 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 28 08:01:47 np0005538513.localdomain podman[51465]: 2025-11-28 08:01:47.753898742 +0000 UTC m=+0.095297658 container create e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, name=rhosp17/openstack-ceilometer-central, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central)
Nov 28 08:01:47 np0005538513.localdomain podman[51465]: 2025-11-28 08:01:47.705913858 +0000 UTC m=+0.047312814 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 28 08:01:47 np0005538513.localdomain systemd[1]: Started libpod-conmon-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope.
Nov 28 08:01:47 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:47 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0de61d8517bf2f9ac2b1f4b59bb4938a9bcb5e653aa4685b7d899b570eb2d566/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:47 np0005538513.localdomain podman[51465]: 2025-11-28 08:01:47.823131732 +0000 UTC m=+0.164530618 container init e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, tcib_managed=true, architecture=x86_64, container_name=container-puppet-ceilometer, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-central, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1)
Nov 28 08:01:47 np0005538513.localdomain podman[51465]: 2025-11-28 08:01:47.830848434 +0000 UTC m=+0.172247330 container start e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.expose-services=, container_name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-central-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-central, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central)
Nov 28 08:01:47 np0005538513.localdomain podman[51465]: 2025-11-28 08:01:47.831090521 +0000 UTC m=+0.172489427 container attach e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ceilometer-central, distribution-scope=public, release=1761123044, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-central-container, version=17.1.12, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain ovs-vsctl[51595]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.07 seconds
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: Accepting previously invalid value for target type 'Integer'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]:    (file & line not available)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Nov 28 08:01:48 np0005538513.localdomain crontab[51805]: (root) LIST (root)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Nov 28 08:01:48 np0005538513.localdomain crontab[51806]: (root) REPLACE (root)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Notice: Applied catalog in 0.04 seconds
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Application:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    Initial environment: production
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    Converged environment: production
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:          Run mode: user
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Changes:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:             Total: 2
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Events:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:           Success: 2
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:             Total: 2
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Resources:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:           Changed: 2
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:       Out of sync: 2
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:           Skipped: 7
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:             Total: 9
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Time:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:              File: 0.01
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:              Cron: 0.01
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    Transaction evaluation: 0.04
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    Catalog application: 0.04
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:    Config retrieval: 0.10
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:          Last run: 1764316908
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:             Total: 0.04
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]: Version:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:            Config: 1764316908
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51410]:            Puppet: 7.10.0
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.13 seconds
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.11 seconds
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}780ab1440a8faf33b15233a48577851d7bee558b8306ab6e193d265286e7d4ed'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Notice: Applied catalog in 0.03 seconds
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Application:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    Initial environment: production
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    Converged environment: production
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:          Run mode: user
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Changes:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:             Total: 7
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Events:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:           Success: 7
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:             Total: 7
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Resources:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:           Skipped: 13
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:           Changed: 5
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:       Out of sync: 5
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:             Total: 20
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Time:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:              File: 0.01
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    Transaction evaluation: 0.02
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    Catalog application: 0.03
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:    Config retrieval: 0.16
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:          Last run: 1764316908
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:             Total: 0.03
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]: Version:
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:            Config: 1764316908
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51402]:            Puppet: 7.10.0
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.38 seconds
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: in a future release. Use nova::cinder::os_region_name instead
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: in a future release. Use nova::cinder::catalog_info instead
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Nov 28 08:01:48 np0005538513.localdomain systemd[1]: libpod-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope: Deactivated successfully.
Nov 28 08:01:48 np0005538513.localdomain systemd[1]: libpod-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope: Consumed 2.084s CPU time.
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39'
Nov 28 08:01:48 np0005538513.localdomain podman[51872]: 2025-11-28 08:01:48.863676254 +0000 UTC m=+0.047150009 container died 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, container_name=container-puppet-crond, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Nov 28 08:01:48 np0005538513.localdomain systemd[1]: tmp-crun.GS3uEZ.mount: Deactivated successfully.
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Nov 28 08:01:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Nov 28 08:01:48 np0005538513.localdomain systemd[1]: libpod-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope: Deactivated successfully.
Nov 28 08:01:48 np0005538513.localdomain systemd[1]: libpod-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope: Consumed 2.197s CPU time.
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Nov 28 08:01:48 np0005538513.localdomain podman[51872]: 2025-11-28 08:01:48.975347724 +0000 UTC m=+0.158821479 container cleanup 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, vendor=Red 
Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_puppet_step1, version=17.1.12)
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51364]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Nov 28 08:01:48 np0005538513.localdomain systemd[1]: libpod-conmon-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope: Deactivated successfully.
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Nov 28 08:01:48 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Nov 28 08:01:48 np0005538513.localdomain puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]: Notice: Applied catalog in 0.33 seconds
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]: Application:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:    Initial environment: production
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:    Converged environment: production
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:          Run mode: user
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]: Changes:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:             Total: 43
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]: Events:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:           Success: 43
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:             Total: 43
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]: Resources:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:           Skipped: 14
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:           Changed: 38
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:       Out of sync: 38
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:             Total: 82
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]: Time:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:              File: 0.17
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:    Transaction evaluation: 0.32
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:    Catalog application: 0.33
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:    Config retrieval: 0.46
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:          Last run: 1764316909
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:       Concat file: 0.00
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:    Concat fragment: 0.00
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:             Total: 0.33
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]: Version:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:            Config: 1764316908
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51359]:            Puppet: 7.10.0
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Notice: Applied catalog in 0.50 seconds
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Application:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:    Initial environment: production
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:    Converged environment: production
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:          Run mode: user
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Changes:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:             Total: 4
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Events:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:           Success: 4
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:             Total: 4
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Resources:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:           Changed: 4
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:       Out of sync: 4
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:           Skipped: 8
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:             Total: 13
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Time:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:              File: 0.00
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:              Exec: 0.06
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:    Config retrieval: 0.14
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:            Augeas: 0.42
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:    Transaction evaluation: 0.50
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:    Catalog application: 0.50
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:          Last run: 1764316909
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:             Total: 0.50
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]: Version:
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:            Config: 1764316908
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51422]:            Puppet: 7.10.0
Nov 28 08:01:49 np0005538513.localdomain podman[51904]: 2025-11-28 08:01:49.062870406 +0000 UTC m=+0.077133047 container died d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:01:49 np0005538513.localdomain podman[51904]: 2025-11-28 08:01:49.092835796 +0000 UTC m=+0.107098397 container cleanup d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_puppet_step1)
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: libpod-conmon-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-aa2c8f8913bff67439dab4aeb6db6e2056909e99c44d0de88f7837d07d6c8847-merged.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-29b4c40b440520fe09ba18ff760c1e8bcaccec35e4300eedc14527806169da1c-merged.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51364]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: libpod-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: libpod-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope: Consumed 2.542s CPU time.
Nov 28 08:01:49 np0005538513.localdomain podman[51290]: 2025-11-28 08:01:49.364272533 +0000 UTC m=+4.123297971 container died 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_id=tripleo_puppet_step1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: libpod-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: libpod-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope: Consumed 2.719s CPU time.
Nov 28 08:01:49 np0005538513.localdomain podman[51209]: 2025-11-28 08:01:49.375316199 +0000 UTC m=+4.246172462 container died a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO 
Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: tmp-crun.V8VMN7.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain podman[52090]: 2025-11-28 08:01:49.482185869 +0000 UTC m=+0.108684928 container cleanup 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=container-puppet-iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: libpod-conmon-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:01:49 np0005538513.localdomain podman[52110]: 2025-11-28 08:01:49.507228944 +0000 UTC m=+0.107266293 container create d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, container_name=container-puppet-ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: Started libpod-conmon-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope.
Nov 28 08:01:49 np0005538513.localdomain podman[52110]: 2025-11-28 08:01:49.44298505 +0000 UTC m=+0.043022399 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:01:49 np0005538513.localdomain podman[52118]: 2025-11-28 08:01:49.548152467 +0000 UTC m=+0.136602123 container create 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:49 np0005538513.localdomain podman[52110]: 2025-11-28 08:01:49.571493328 +0000 UTC m=+0.171530717 container init d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc.)
Nov 28 08:01:49 np0005538513.localdomain podman[52110]: 2025-11-28 08:01:49.579987744 +0000 UTC m=+0.180025133 container start d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:01:49 np0005538513.localdomain podman[52110]: 2025-11-28 08:01:49.580457849 +0000 UTC m=+0.180495288 container attach d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z)
Nov 28 08:01:49 np0005538513.localdomain podman[52118]: 2025-11-28 08:01:49.482167598 +0000 UTC m=+0.070617264 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: Started libpod-conmon-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope.
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc098275d7f03b1c1fd504750e41e859fc80e308322902c4e50d7df8300f206a/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:49 np0005538513.localdomain podman[52100]: 2025-11-28 08:01:49.630165707 +0000 UTC m=+0.241924144 container cleanup a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=container-puppet-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, name=rhosp17/openstack-collectd, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:01:49 np0005538513.localdomain systemd[1]: libpod-conmon-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:01:49 np0005538513.localdomain podman[52118]: 2025-11-28 08:01:49.683279972 +0000 UTC m=+0.271729628 container init 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1)
Nov 28 08:01:49 np0005538513.localdomain podman[52118]: 2025-11-28 08:01:49.691888801 +0000 UTC m=+0.280338457 container start 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, 
com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, container_name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:01:49 np0005538513.localdomain podman[52118]: 2025-11-28 08:01:49.692079397 +0000 UTC m=+0.280529053 container attach 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=container-puppet-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, version=17.1.12, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com)
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51364]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 1.39 seconds
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51508]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51508]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51508]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51508]:    (file & line not available)
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51508]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:49 np0005538513.localdomain puppet-user[51508]:    (file & line not available)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}e8f4c9c311633f219a6b4c8a97d1389467ae0d86e6640d015eb10a4c73ac6b8b'
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Warning: Empty environment setting 'TLS_PASSWORD'
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}ae9c4ab6bedd07e63d6f2c3a5743334d26ea3ed4d1f695ab855f72927fdb71bc'
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb-merged.mount: Deactivated successfully.
Nov 28 08:01:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f-merged.mount: Deactivated successfully.
Nov 28 08:01:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.36 seconds
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Notice: Applied catalog in 0.41 seconds
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Application:
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:    Initial environment: production
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:    Converged environment: production
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:          Run mode: user
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Changes:
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:             Total: 31
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Events:
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:           Success: 31
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:             Total: 31
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Resources:
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:           Skipped: 22
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:           Changed: 31
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:       Out of sync: 31
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:             Total: 151
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Time:
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:           Package: 0.03
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:    Ceilometer config: 0.32
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:    Transaction evaluation: 0.40
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:    Catalog application: 0.41
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:    Config retrieval: 0.43
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:          Last run: 1764316910
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:         Resources: 0.00
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:             Total: 0.41
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]: Version:
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:            Config: 1764316909
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51508]:            Puppet: 7.10.0
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Nov 28 08:01:50 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain systemd[1]: libpod-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope: Deactivated successfully.
Nov 28 08:01:51 np0005538513.localdomain systemd[1]: libpod-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope: Consumed 2.995s CPU time.
Nov 28 08:01:51 np0005538513.localdomain podman[51465]: 2025-11-28 08:01:51.138628684 +0000 UTC m=+3.480027610 container died e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, distribution-scope=public, 
container_name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:11:59Z, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central)
Nov 28 08:01:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0de61d8517bf2f9ac2b1f4b59bb4938a9bcb5e653aa4685b7d899b570eb2d566-merged.mount: Deactivated successfully.
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain podman[52382]: 2025-11-28 08:01:51.284165365 +0000 UTC m=+0.132587646 container cleanup e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, 
url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ceilometer-central-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-ceilometer, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central)
Nov 28 08:01:51 np0005538513.localdomain systemd[1]: libpod-conmon-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope: Deactivated successfully.
Nov 28 08:01:51 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]:    (file & line not available)
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]:    (file & line not available)
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    (file & line not available)
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    (file & line not available)
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.25 seconds
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.22 seconds
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52536]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52538]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52540]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.106
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52543]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005538513.localdomain
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}af00b55795dabd7a8ca15fb762e773701eb5c91ea4ae135b9bcdde564d7077dd'
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}bc8c213fdf58f8f987d47662b8c132319595d70e171ac4f45ccffbcd69fa92c7'
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005538513.novalocal' to 'np0005538513.localdomain'
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Notice: Applied catalog in 0.12 seconds
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Application:
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    Initial environment: production
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    Converged environment: production
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:          Run mode: user
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Changes:
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:             Total: 3
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Events:
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:           Success: 3
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:             Total: 3
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Resources:
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:           Skipped: 11
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:           Changed: 3
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:       Out of sync: 3
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:             Total: 25
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Time:
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:       Concat file: 0.00
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    Concat fragment: 0.00
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:              File: 0.02
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    Transaction evaluation: 0.12
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    Catalog application: 0.12
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:    Config retrieval: 0.27
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:          Last run: 1764316911
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:             Total: 0.12
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]: Version:
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:            Config: 1764316911
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52233]:            Puppet: 7.10.0
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52545]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52547]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52554]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain ovs-vsctl[52557]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Nov 28 08:01:51 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain ovs-vsctl[52562]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain ovs-vsctl[52564]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain ovs-vsctl[52572]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:ab:c7:63
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain ovs-vsctl[52582]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain ovs-vsctl[52584]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain ovs-vsctl[52591]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: libpod-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope: Deactivated successfully.
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: libpod-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope: Consumed 2.342s CPU time.
Nov 28 08:01:52 np0005538513.localdomain podman[52118]: 2025-11-28 08:01:52.182250913 +0000 UTC m=+2.770700569 container died 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, url=https://www.redhat.com, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Notice: Applied catalog in 0.50 seconds
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Application:
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:    Initial environment: production
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:    Converged environment: production
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:          Run mode: user
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Changes:
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:             Total: 14
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Events:
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:           Success: 14
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:             Total: 14
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Resources:
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:           Skipped: 12
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:           Changed: 14
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:       Out of sync: 14
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:             Total: 29
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Time:
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:              Exec: 0.02
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:    Config retrieval: 0.28
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:         Vs config: 0.44
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:    Transaction evaluation: 0.49
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:    Catalog application: 0.50
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:          Last run: 1764316912
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:             Total: 0.50
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]: Version:
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:            Config: 1764316911
Nov 28 08:01:52 np0005538513.localdomain puppet-user[52177]:            Puppet: 7.10.0
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: tmp-crun.SMWbOA.mount: Deactivated successfully.
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cc098275d7f03b1c1fd504750e41e859fc80e308322902c4e50d7df8300f206a-merged.mount: Deactivated successfully.
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain podman[52602]: 2025-11-28 08:01:52.355940847 +0000 UTC m=+0.160332856 container cleanup 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, container_name=container-puppet-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc.)
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: libpod-conmon-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope: Deactivated successfully.
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: libpod-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope: Deactivated successfully.
Nov 28 08:01:52 np0005538513.localdomain systemd[1]: libpod-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope: Consumed 2.859s CPU time.
Nov 28 08:01:52 np0005538513.localdomain podman[52110]: 2025-11-28 08:01:52.60973222 +0000 UTC m=+3.209769579 container died d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=container-puppet-ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=)
Nov 28 08:01:52 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Nov 28 08:01:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b-merged.mount: Deactivated successfully.
Nov 28 08:01:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:53 np0005538513.localdomain podman[52683]: 2025-11-28 08:01:53.75864049 +0000 UTC m=+1.138717571 container cleanup d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-ovn-controller, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=container-puppet-ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 28 08:01:53 np0005538513.localdomain systemd[1]: libpod-conmon-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope: Deactivated successfully.
Nov 28 08:01:53 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:01:53 np0005538513.localdomain podman[52231]: 2025-11-28 08:01:49.833324774 +0000 UTC m=+0.084557881 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 28 08:01:53 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Nov 28 08:01:53 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:53 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Nov 28 08:01:53 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain podman[52745]: 2025-11-28 08:01:54.046622035 +0000 UTC m=+0.090010222 container create 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, config_id=tripleo_puppet_step1, build-date=2025-11-19T00:23:27Z, io.openshift.expose-services=, container_name=container-puppet-neutron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-neutron-server-container)
Nov 28 08:01:54 np0005538513.localdomain systemd[1]: Started libpod-conmon-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope.
Nov 28 08:01:54 np0005538513.localdomain podman[52745]: 2025-11-28 08:01:53.993613043 +0000 UTC m=+0.037001230 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95fcfabaf1695644f149e65ad3965d41d9d311fa2ace2d9f1efd734cfe54eb68/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:54 np0005538513.localdomain podman[52745]: 2025-11-28 08:01:54.122186333 +0000 UTC m=+0.165574520 container init 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-server-container, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z)
Nov 28 08:01:54 np0005538513.localdomain podman[52745]: 2025-11-28 08:01:54.1345159 +0000 UTC m=+0.177904087 container start 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-server, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:23:27Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc.)
Nov 28 08:01:54 np0005538513.localdomain podman[52745]: 2025-11-28 08:01:54.135099388 +0000 UTC m=+0.178487575 container attach 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, tcib_managed=true, build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
name=rhosp17/openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-server-container, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, container_name=container-puppet-neutron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server)
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3ccd56cc76ec60fa08fd698d282c9c89b1e8c485a00f47d57569ed8f6f8a16e4'
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Notice: Applied catalog in 4.88 seconds
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Application:
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Initial environment: production
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Converged environment: production
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:          Run mode: user
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Changes:
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:             Total: 183
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Events:
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:           Success: 183
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:             Total: 183
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Resources:
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:           Changed: 183
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:       Out of sync: 183
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:           Skipped: 57
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:             Total: 487
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Time:
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:       Concat file: 0.00
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Concat fragment: 0.00
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:            Anchor: 0.00
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:         File line: 0.00
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Virtlogd config: 0.00
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Virtqemud config: 0.01
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:              Exec: 0.01
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Virtsecretd config: 0.02
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Virtstoraged config: 0.02
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:           Package: 0.02
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Virtproxyd config: 0.03
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:              File: 0.03
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Virtnodedevd config: 0.06
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:            Augeas: 1.39
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Config retrieval: 1.65
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:          Last run: 1764316914
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:       Nova config: 3.08
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Transaction evaluation: 4.87
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:    Catalog application: 4.88
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:         Resources: 0.00
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:             Total: 4.88
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]: Version:
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:            Config: 1764316908
Nov 28 08:01:54 np0005538513.localdomain puppet-user[51364]:            Puppet: 7.10.0
Nov 28 08:01:55 np0005538513.localdomain systemd[1]: libpod-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope: Deactivated successfully.
Nov 28 08:01:55 np0005538513.localdomain systemd[1]: libpod-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope: Consumed 8.714s CPU time.
Nov 28 08:01:55 np0005538513.localdomain podman[51227]: 2025-11-28 08:01:55.839845727 +0000 UTC m=+10.689864037 container died 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_puppet_step1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# 
Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:35:22Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:01:55 np0005538513.localdomain puppet-user[52776]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Nov 28 08:01:55 np0005538513.localdomain systemd[1]: tmp-crun.5f2XKT.mount: Deactivated successfully.
Nov 28 08:01:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b8fe5cc4372bb33f59044e24a5715f4f503612636c3ff0d7cebe06eec67fcb4f-merged.mount: Deactivated successfully.
Nov 28 08:01:56 np0005538513.localdomain podman[52822]: 2025-11-28 08:01:56.028505628 +0000 UTC m=+0.178562905 container cleanup 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, config_id=tripleo_puppet_step1, name=rhosp17/openstack-nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., container_name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': 
"include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1)
Nov 28 08:01:56 np0005538513.localdomain systemd[1]: libpod-conmon-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope: Deactivated successfully.
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]:    (file & line not available)
Nov 28 08:01:56 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]:    (file & line not available)
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.62 seconds
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 28 08:01:56 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Notice: Applied catalog in 0.49 seconds
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Application:
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:    Initial environment: production
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:    Converged environment: production
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:          Run mode: user
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Changes:
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:             Total: 33
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Events:
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:           Success: 33
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:             Total: 33
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Resources:
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:           Skipped: 21
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:           Changed: 33
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:       Out of sync: 33
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:             Total: 155
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Time:
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:         Resources: 0.00
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:    Ovn metadata agent config: 0.02
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:    Neutron config: 0.41
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:    Transaction evaluation: 0.48
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:    Catalog application: 0.49
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:    Config retrieval: 0.69
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:          Last run: 1764316917
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:             Total: 0.49
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]: Version:
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:            Config: 1764316916
Nov 28 08:01:57 np0005538513.localdomain puppet-user[52776]:            Puppet: 7.10.0
Nov 28 08:01:57 np0005538513.localdomain systemd[1]: libpod-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope: Deactivated successfully.
Nov 28 08:01:57 np0005538513.localdomain systemd[1]: libpod-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope: Consumed 3.508s CPU time.
Nov 28 08:01:57 np0005538513.localdomain podman[52961]: 2025-11-28 08:01:57.778382214 +0000 UTC m=+0.054940423 container died 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:23:27Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=container-puppet-neutron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-server)
Nov 28 08:01:57 np0005538513.localdomain systemd[1]: tmp-crun.OZZPSB.mount: Deactivated successfully.
Nov 28 08:01:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-95fcfabaf1695644f149e65ad3965d41d9d311fa2ace2d9f1efd734cfe54eb68-merged.mount: Deactivated successfully.
Nov 28 08:01:57 np0005538513.localdomain podman[52961]: 2025-11-28 08:01:57.862562283 +0000 UTC m=+0.139120392 container cleanup 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, tcib_managed=true, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:23:27Z, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, distribution-scope=public, name=rhosp17/openstack-neutron-server, container_name=container-puppet-neutron, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-server-container, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 28 08:01:57 np0005538513.localdomain systemd[1]: libpod-conmon-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope: Deactivated successfully.
Nov 28 08:01:57 np0005538513.localdomain python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 28 08:01:58 np0005538513.localdomain sudo[51052]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:58 np0005538513.localdomain sudo[53011]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uldutstyqcelunojezalfhvxcwizpcdh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:01:58 np0005538513.localdomain sudo[53011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:58 np0005538513.localdomain python3[53013]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:01:58 np0005538513.localdomain sudo[53011]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:59 np0005538513.localdomain sudo[53027]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrmbcskibegozpdhfrmicdexgnntarwb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:01:59 np0005538513.localdomain sudo[53027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:59 np0005538513.localdomain sudo[53027]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:59 np0005538513.localdomain sudo[53043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzivetkzxpxbxblnnuxccighgbrqbfhv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:01:59 np0005538513.localdomain sudo[53043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:59 np0005538513.localdomain python3[53045]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:01:59 np0005538513.localdomain sudo[53043]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:00 np0005538513.localdomain sudo[53093]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpvbhszovjkbrszisikezawmlbmkrmtm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:00 np0005538513.localdomain sudo[53093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:00 np0005538513.localdomain python3[53095]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:00 np0005538513.localdomain sudo[53093]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:00 np0005538513.localdomain sudo[53136]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdwhpgxrcmjfrxgqropaifpdrakppswd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:00 np0005538513.localdomain sudo[53136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:00 np0005538513.localdomain python3[53138]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316919.989835-84859-62296000541824/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:00 np0005538513.localdomain sudo[53136]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:01 np0005538513.localdomain sudo[53198]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moasotzyjfxkrhyzdetbvnrpcuncvzey ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:01 np0005538513.localdomain sudo[53198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:01 np0005538513.localdomain python3[53200]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:01 np0005538513.localdomain sudo[53198]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:01 np0005538513.localdomain sudo[53241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhqmhiahjzqrruaagcunoqcnisodiykr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:01 np0005538513.localdomain sudo[53241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:01 np0005538513.localdomain python3[53243]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316920.861091-84859-277553588498609/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:01 np0005538513.localdomain sudo[53241]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:01 np0005538513.localdomain sudo[53303]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oamnnyimsjdovhvxcbxaanjuynhmbrew ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:01 np0005538513.localdomain sudo[53303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:02 np0005538513.localdomain python3[53305]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:02 np0005538513.localdomain sudo[53303]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:02 np0005538513.localdomain sudo[53346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovgowspcgxetdowfixflzhehrvphjshz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:02 np0005538513.localdomain sudo[53346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:02 np0005538513.localdomain python3[53348]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316921.7847958-85003-138959531371064/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:02 np0005538513.localdomain sudo[53346]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:02 np0005538513.localdomain sudo[53408]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcdxdaiwssdcjkqxsbneoegfxpqvxixz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:02 np0005538513.localdomain sudo[53408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:03 np0005538513.localdomain python3[53410]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:03 np0005538513.localdomain sudo[53408]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:03 np0005538513.localdomain sudo[53451]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvybqwsnbsgzlxiuwbralyxozhkozyiv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:03 np0005538513.localdomain sudo[53451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:03 np0005538513.localdomain python3[53453]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316922.690198-85033-39475210643575/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:03 np0005538513.localdomain sudo[53451]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:03 np0005538513.localdomain sudo[53481]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubkszuarinrwxkqrdjqmrwhbsjpsanua ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:03 np0005538513.localdomain sudo[53481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:03 np0005538513.localdomain python3[53483]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:02:03 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:02:04 np0005538513.localdomain systemd-rc-local-generator[53507]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:04 np0005538513.localdomain systemd-sysv-generator[53511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:04 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:04 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:02:04 np0005538513.localdomain systemd-rc-local-generator[53542]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:04 np0005538513.localdomain systemd-sysv-generator[53546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:04 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:04 np0005538513.localdomain systemd[1]: Starting TripleO Container Shutdown...
Nov 28 08:02:04 np0005538513.localdomain systemd[1]: Finished TripleO Container Shutdown.
Nov 28 08:02:04 np0005538513.localdomain sudo[53481]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:04 np0005538513.localdomain sudo[53604]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipcunwrqphuapfxncvtynmpbpzcikxdp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:04 np0005538513.localdomain sudo[53604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:05 np0005538513.localdomain python3[53606]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:05 np0005538513.localdomain sudo[53604]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:05 np0005538513.localdomain sudo[53647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbrwhncpktwecuxpjjchrbxiunhimqpo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:05 np0005538513.localdomain sudo[53647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:05 np0005538513.localdomain python3[53649]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316924.7225726-85081-142511355041184/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:05 np0005538513.localdomain sudo[53647]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:05 np0005538513.localdomain sudo[53709]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxnpbbpgzcxvuuijeirisgnqcgucfwfb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:05 np0005538513.localdomain sudo[53709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:06 np0005538513.localdomain python3[53711]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:06 np0005538513.localdomain sudo[53709]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:06 np0005538513.localdomain sudo[53752]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbouwagrpeaecwcyzghoivyjnvetlyuk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:06 np0005538513.localdomain sudo[53752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:06 np0005538513.localdomain python3[53754]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316925.683529-85215-240822367027995/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:06 np0005538513.localdomain sudo[53752]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:06 np0005538513.localdomain sudo[53782]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgvbktbnvlshjfjnwstjsyutximezowg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:06 np0005538513.localdomain sudo[53782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:06 np0005538513.localdomain python3[53784]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:02:06 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:02:07 np0005538513.localdomain systemd-rc-local-generator[53807]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:07 np0005538513.localdomain systemd-sysv-generator[53814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:07 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:07 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:02:07 np0005538513.localdomain systemd-sysv-generator[53851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:07 np0005538513.localdomain systemd-rc-local-generator[53846]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:07 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:07 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:02:07 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:02:07 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:02:07 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:02:07 np0005538513.localdomain sudo[53782]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:07 np0005538513.localdomain sudo[53874]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dezqzlnmagrduqnwwiqrgwzxdhqirafj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:07 np0005538513.localdomain sudo[53874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: d871e9c8e59a273b3131348d6d370386
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: da9a0dc7b40588672419e3ce10063e21
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: c9c242145d21d40ef98889981c05ca84
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 138ccb6252fd89d73a6c37a3f993f3eb
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 684be86bd5476b8c779d4769a9adf982
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 684be86bd5476b8c779d4769a9adf982
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: dfc67f7a8d1f67548a53836c6db3b704
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 0f0904943dda1bf1d123bdf96d71020f
Nov 28 08:02:08 np0005538513.localdomain sudo[53874]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:08 np0005538513.localdomain sudo[53890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewnljiteovtqsoeglwhhxtskowrutica ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:08 np0005538513.localdomain sudo[53890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:08 np0005538513.localdomain sudo[53890]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:09 np0005538513.localdomain sudo[53932]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnmcperkwcyqffbwoyozzdssqgivfvyt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:09 np0005538513.localdomain sudo[53932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:09 np0005538513.localdomain python3[53934]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:02:09 np0005538513.localdomain podman[53970]: 2025-11-28 08:02:09.780880589 +0000 UTC m=+0.089464568 container create eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, container_name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.)
Nov 28 08:02:09 np0005538513.localdomain systemd[1]: Started libpod-conmon-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b.scope.
Nov 28 08:02:09 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:02:09 np0005538513.localdomain podman[53970]: 2025-11-28 08:02:09.730903187 +0000 UTC m=+0.039487196 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:02:09 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb58edccbf984a5f7da43e28613aa15e67186fff8a0f55da5237efa2fd62c40a/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:02:09 np0005538513.localdomain podman[53970]: 2025-11-28 08:02:09.844057553 +0000 UTC m=+0.152641562 container init eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public)
Nov 28 08:02:09 np0005538513.localdomain podman[53970]: 2025-11-28 08:02:09.858081922 +0000 UTC m=+0.166665901 container start eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr_init_logs, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Nov 28 08:02:09 np0005538513.localdomain podman[53970]: 2025-11-28 08:02:09.858348081 +0000 UTC m=+0.166932130 container attach eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-type=git, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, container_name=metrics_qdr_init_logs, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public)
Nov 28 08:02:09 np0005538513.localdomain systemd[1]: libpod-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b.scope: Deactivated successfully.
Nov 28 08:02:09 np0005538513.localdomain podman[53970]: 2025-11-28 08:02:09.863551187 +0000 UTC m=+0.172135196 container died eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:02:09 np0005538513.localdomain podman[53990]: 2025-11-28 08:02:09.955110731 +0000 UTC m=+0.078509686 container cleanup eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:02:09 np0005538513.localdomain systemd[1]: libpod-conmon-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b.scope: Deactivated successfully.
Nov 28 08:02:09 np0005538513.localdomain python3[53934]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Nov 28 08:02:10 np0005538513.localdomain podman[54064]: 2025-11-28 08:02:10.441520817 +0000 UTC m=+0.086058389 container create 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:02:10 np0005538513.localdomain systemd[1]: Started libpod-conmon-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope.
Nov 28 08:02:10 np0005538513.localdomain podman[54064]: 2025-11-28 08:02:10.401209425 +0000 UTC m=+0.045747057 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:02:10 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:02:10 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:02:10 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:02:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:02:10 np0005538513.localdomain podman[54064]: 2025-11-28 08:02:10.600251262 +0000 UTC m=+0.244788824 container init 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:02:10 np0005538513.localdomain sudo[54085]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:02:10 np0005538513.localdomain sudo[54085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Nov 28 08:02:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:02:10 np0005538513.localdomain podman[54064]: 2025-11-28 08:02:10.643607852 +0000 UTC m=+0.288145414 container start 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:02:10 np0005538513.localdomain python3[53934]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d871e9c8e59a273b3131348d6d370386 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:02:10 np0005538513.localdomain sudo[54085]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:10 np0005538513.localdomain podman[54087]: 2025-11-28 08:02:10.747078067 +0000 UTC m=+0.095941385 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Nov 28 08:02:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bb58edccbf984a5f7da43e28613aa15e67186fff8a0f55da5237efa2fd62c40a-merged.mount: Deactivated successfully.
Nov 28 08:02:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b-userdata-shm.mount: Deactivated successfully.
Nov 28 08:02:10 np0005538513.localdomain sudo[53932]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:10 np0005538513.localdomain podman[54087]: 2025-11-28 08:02:10.964670689 +0000 UTC m=+0.313534007 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-type=git, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:02:11 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:02:11 np0005538513.localdomain sudo[54157]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohjboimkvtkxtgmwgcqhjmkohybdlalq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:11 np0005538513.localdomain sudo[54157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:11 np0005538513.localdomain python3[54159]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:11 np0005538513.localdomain sudo[54157]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:11 np0005538513.localdomain sudo[54173]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpqekmmzziobesoqtcqvtkdphrpeuhbl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:11 np0005538513.localdomain sudo[54173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:11 np0005538513.localdomain python3[54175]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:02:11 np0005538513.localdomain sudo[54173]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:12 np0005538513.localdomain sudo[54234]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjbqjogdjnqrirwzsbojyrrsqnagzdjs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:12 np0005538513.localdomain sudo[54234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:12 np0005538513.localdomain python3[54236]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316931.6473696-85346-43577087931402/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:12 np0005538513.localdomain sudo[54234]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:12 np0005538513.localdomain sudo[54250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rofwvsrdrwvcbzzkxsfqmyrvfjfcvbeh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:12 np0005538513.localdomain sudo[54250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:12 np0005538513.localdomain python3[54252]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:02:12 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:02:12 np0005538513.localdomain systemd-rc-local-generator[54275]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:12 np0005538513.localdomain systemd-sysv-generator[54280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:12 np0005538513.localdomain sudo[54250]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:13 np0005538513.localdomain sudo[54302]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfdrgdgnprxxkalbvpjfopokmewmyssz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:13 np0005538513.localdomain sudo[54302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:13 np0005538513.localdomain python3[54304]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:02:13 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:02:13 np0005538513.localdomain systemd-sysv-generator[54334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:13 np0005538513.localdomain systemd-rc-local-generator[54329]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:13 np0005538513.localdomain systemd[1]: Starting metrics_qdr container...
Nov 28 08:02:13 np0005538513.localdomain systemd[1]: Started metrics_qdr container.
Nov 28 08:02:13 np0005538513.localdomain sudo[54302]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:13 np0005538513.localdomain sudo[54382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raygytftyxxixxolfsqndftkgonkcjdu ; /usr/bin/python3
Nov 28 08:02:13 np0005538513.localdomain sudo[54382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:14 np0005538513.localdomain python3[54384]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:14 np0005538513.localdomain sudo[54382]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:14 np0005538513.localdomain sudo[54430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icfxovxlejmzzzxouopavxpzueccbqzp ; /usr/bin/python3
Nov 28 08:02:14 np0005538513.localdomain sudo[54430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:14 np0005538513.localdomain sudo[54430]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:14 np0005538513.localdomain sudo[54473]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qypwsqrgouymnhuholtjzrozgthewwnc ; /usr/bin/python3
Nov 28 08:02:14 np0005538513.localdomain sudo[54473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:15 np0005538513.localdomain sudo[54473]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:15 np0005538513.localdomain sudo[54503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zenhidjnjzaqorvsopwsaedbwseyhofh ; /usr/bin/python3
Nov 28 08:02:15 np0005538513.localdomain sudo[54503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:15 np0005538513.localdomain python3[54505]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005538513 step=1 update_config_hash_only=False
Nov 28 08:02:15 np0005538513.localdomain sudo[54503]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:15 np0005538513.localdomain sudo[54519]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzvlyclmwcomsnpqjzmneciqtfnrfuxm ; /usr/bin/python3
Nov 28 08:02:15 np0005538513.localdomain sudo[54519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:16 np0005538513.localdomain python3[54521]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:16 np0005538513.localdomain sudo[54519]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:16 np0005538513.localdomain sudo[54535]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efhcpwclfvscsamdqgtzurwoaagesimz ; /usr/bin/python3
Nov 28 08:02:16 np0005538513.localdomain sudo[54535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:16 np0005538513.localdomain python3[54537]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:02:16 np0005538513.localdomain sudo[54535]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:23 np0005538513.localdomain sudo[54538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:02:23 np0005538513.localdomain sudo[54538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:02:23 np0005538513.localdomain sudo[54538]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:23 np0005538513.localdomain sudo[54553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:02:23 np0005538513.localdomain sudo[54553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:02:24 np0005538513.localdomain sudo[54553]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:35 np0005538513.localdomain sudo[54600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:02:35 np0005538513.localdomain sudo[54600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:02:35 np0005538513.localdomain sudo[54600]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:02:41 np0005538513.localdomain podman[54615]: 2025-11-28 08:02:41.848226492 +0000 UTC m=+0.083427375 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step1, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12)
Nov 28 08:02:42 np0005538513.localdomain podman[54615]: 2025-11-28 08:02:42.055465872 +0000 UTC m=+0.290666755 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 
qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Nov 28 08:02:42 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:03:01 np0005538513.localdomain anacron[6702]: Job `cron.monthly' started
Nov 28 08:03:01 np0005538513.localdomain anacron[6702]: Job `cron.monthly' terminated
Nov 28 08:03:01 np0005538513.localdomain anacron[6702]: Normal exit (3 jobs run)
Nov 28 08:03:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:03:12 np0005538513.localdomain systemd[1]: tmp-crun.nVrcET.mount: Deactivated successfully.
Nov 28 08:03:12 np0005538513.localdomain podman[54646]: 2025-11-28 08:03:12.837826933 +0000 UTC m=+0.072841765 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 08:03:13 np0005538513.localdomain podman[54646]: 2025-11-28 08:03:13.055379954 +0000 UTC m=+0.290394776 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, vcs-type=git)
Nov 28 08:03:13 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:03:35 np0005538513.localdomain sudo[54675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:03:35 np0005538513.localdomain sudo[54675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:03:35 np0005538513.localdomain sudo[54675]: pam_unix(sudo:session): session closed for user root
Nov 28 08:03:35 np0005538513.localdomain sudo[54690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:03:35 np0005538513.localdomain sudo[54690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:03:36 np0005538513.localdomain sudo[54690]: pam_unix(sudo:session): session closed for user root
Nov 28 08:03:36 np0005538513.localdomain sudo[54738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:03:36 np0005538513.localdomain sudo[54738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:03:36 np0005538513.localdomain sudo[54738]: pam_unix(sudo:session): session closed for user root
Nov 28 08:03:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:03:43 np0005538513.localdomain podman[54753]: 2025-11-28 08:03:43.841960346 +0000 UTC m=+0.082467243 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 28 08:03:44 np0005538513.localdomain podman[54753]: 2025-11-28 08:03:44.06210597 +0000 UTC m=+0.302612847 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:03:44 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:04:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:04:14 np0005538513.localdomain systemd[1]: tmp-crun.evT8Ov.mount: Deactivated successfully.
Nov 28 08:04:14 np0005538513.localdomain podman[54782]: 2025-11-28 08:04:14.848891904 +0000 UTC m=+0.082142223 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, container_name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:04:15 np0005538513.localdomain podman[54782]: 2025-11-28 08:04:15.04538454 +0000 UTC m=+0.278634829 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:04:15 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:04:37 np0005538513.localdomain sudo[54812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:04:37 np0005538513.localdomain sudo[54812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:04:37 np0005538513.localdomain sudo[54812]: pam_unix(sudo:session): session closed for user root
Nov 28 08:04:37 np0005538513.localdomain sudo[54827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:04:37 np0005538513.localdomain sudo[54827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:04:37 np0005538513.localdomain sudo[54827]: pam_unix(sudo:session): session closed for user root
Nov 28 08:04:38 np0005538513.localdomain sudo[54874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:04:38 np0005538513.localdomain sudo[54874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:04:38 np0005538513.localdomain sudo[54874]: pam_unix(sudo:session): session closed for user root
Nov 28 08:04:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:04:45 np0005538513.localdomain podman[54889]: 2025-11-28 08:04:45.851305214 +0000 UTC m=+0.084887870 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd)
Nov 28 08:04:46 np0005538513.localdomain podman[54889]: 2025-11-28 08:04:46.042502871 +0000 UTC m=+0.276085547 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:04:46 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:05:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:05:16 np0005538513.localdomain systemd[1]: tmp-crun.dn5J1f.mount: Deactivated successfully.
Nov 28 08:05:16 np0005538513.localdomain podman[54919]: 2025-11-28 08:05:16.842830456 +0000 UTC m=+0.076723339 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO 
Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z)
Nov 28 08:05:17 np0005538513.localdomain podman[54919]: 2025-11-28 08:05:17.048513315 +0000 UTC m=+0.282406218 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:05:17 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:05:38 np0005538513.localdomain sudo[54949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:05:38 np0005538513.localdomain sudo[54949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:05:38 np0005538513.localdomain sudo[54949]: pam_unix(sudo:session): session closed for user root
Nov 28 08:05:38 np0005538513.localdomain sudo[54964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:05:38 np0005538513.localdomain sudo[54964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:05:39 np0005538513.localdomain sudo[54964]: pam_unix(sudo:session): session closed for user root
Nov 28 08:05:39 np0005538513.localdomain sudo[55010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:05:39 np0005538513.localdomain sudo[55010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:05:39 np0005538513.localdomain sudo[55010]: pam_unix(sudo:session): session closed for user root
Nov 28 08:05:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:05:47 np0005538513.localdomain systemd[1]: tmp-crun.Nmnydd.mount: Deactivated successfully.
Nov 28 08:05:47 np0005538513.localdomain podman[55025]: 2025-11-28 08:05:47.842004643 +0000 UTC m=+0.077467454 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044)
Nov 28 08:05:48 np0005538513.localdomain podman[55025]: 2025-11-28 08:05:48.042471665 +0000 UTC m=+0.277934406 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Nov 28 08:05:48 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:06:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:06:18 np0005538513.localdomain podman[55055]: 2025-11-28 08:06:18.845423396 +0000 UTC m=+0.084257839 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 08:06:19 np0005538513.localdomain podman[55055]: 2025-11-28 08:06:19.036666675 +0000 UTC m=+0.275501128 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red 
Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:46Z)
Nov 28 08:06:19 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:06:25 np0005538513.localdomain sshd[55084]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:06:25 np0005538513.localdomain sshd[55084]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 08:06:25 np0005538513.localdomain sshd[55084]: Connection closed by 193.32.162.146 port 54292
Nov 28 08:06:39 np0005538513.localdomain sudo[55085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:06:39 np0005538513.localdomain sudo[55085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:06:39 np0005538513.localdomain sudo[55085]: pam_unix(sudo:session): session closed for user root
Nov 28 08:06:39 np0005538513.localdomain sudo[55100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:06:39 np0005538513.localdomain sudo[55100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:06:40 np0005538513.localdomain sudo[55100]: pam_unix(sudo:session): session closed for user root
Nov 28 08:06:41 np0005538513.localdomain sudo[55146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:06:41 np0005538513.localdomain sudo[55146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:06:41 np0005538513.localdomain sudo[55146]: pam_unix(sudo:session): session closed for user root
Nov 28 08:06:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:06:49 np0005538513.localdomain podman[55161]: 2025-11-28 08:06:49.840548034 +0000 UTC m=+0.079480522 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:46Z)
Nov 28 08:06:50 np0005538513.localdomain podman[55161]: 2025-11-28 08:06:50.030393986 +0000 UTC m=+0.269326464 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, version=17.1.12, build-date=2025-11-18T22:49:46Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, 
config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true)
Nov 28 08:06:50 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:06:55 np0005538513.localdomain sshd[55191]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:06:55 np0005538513.localdomain sshd[55191]: Received disconnect from 80.94.93.233 port 43132:11:  [preauth]
Nov 28 08:06:55 np0005538513.localdomain sshd[55191]: Disconnected from authenticating user root 80.94.93.233 port 43132 [preauth]
Nov 28 08:07:08 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [4,5,3] r=1 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:08 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,4,0] r=0 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:07:10 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 23 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,4,0] r=0 lpr=22 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:07:12 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [3,4,5] r=2 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 26 pg[5.0( empty local-lis/les=0/0 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,3,4] r=0 lpr=26 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:07:15 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 27 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,3,4] r=0 lpr=26 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:07:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:07:20 np0005538513.localdomain podman[55194]: 2025-11-28 08:07:20.828957372 +0000 UTC m=+0.063953263 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public)
Nov 28 08:07:21 np0005538513.localdomain podman[55194]: 2025-11-28 08:07:21.021335295 +0000 UTC m=+0.256331156 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Nov 28 08:07:21 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:07:28 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 32 pg[6.0( empty local-lis/les=0/0 n=0 ec=32/32 lis/c=0/0 les/c/f=0/0/0 sis=32) [0,4,2] r=2 lpr=32 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:30 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 33 pg[7.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [1,5,3] r=1 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:34 np0005538513.localdomain sudo[55225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:07:34 np0005538513.localdomain sudo[55225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:07:34 np0005538513.localdomain sudo[55225]: pam_unix(sudo:session): session closed for user root
Nov 28 08:07:36 np0005538513.localdomain sudo[55240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:07:36 np0005538513.localdomain sudo[55240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:07:37 np0005538513.localdomain sudo[55240]: pam_unix(sudo:session): session closed for user root
Nov 28 08:07:37 np0005538513.localdomain sudo[55255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:07:37 np0005538513.localdomain sudo[55255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:07:37 np0005538513.localdomain sudo[55255]: pam_unix(sudo:session): session closed for user root
Nov 28 08:07:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:07:51 np0005538513.localdomain podman[55270]: 2025-11-28 08:07:51.847428757 +0000 UTC m=+0.087010878 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd)
Nov 28 08:07:52 np0005538513.localdomain podman[55270]: 2025-11-28 08:07:52.029898488 +0000 UTC m=+0.269480659 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, 
build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:07:52 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:07:59 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 38 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=13.060708046s) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active pruub 1172.546264648s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:07:59 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 38 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=13.057423592s) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1172.546264648s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.16( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.14( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.15( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.18( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.12( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.13( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.11( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.10( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.9( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.6( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.3( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.5( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.2( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.4( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.7( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.8( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.19( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:00 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.17( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:01 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=15.073258400s) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active pruub 1176.569458008s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,5], acting [3,4,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:01 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=15.070759773s) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1176.569458008s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:01 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 40 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40 pruub=13.069881439s) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active pruub 1174.570434570s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,0], acting [5,4,0] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:01 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 40 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40 pruub=13.069881439s) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1174.570434570s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.19( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.18( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.2( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.4( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.3( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.7( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.6( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.5( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.8( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.9( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.16( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.17( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.15( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.14( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.13( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.12( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.10( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.11( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.19( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.16( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.14( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.17( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.12( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.13( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.15( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.10( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.11( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.e( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.f( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.c( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.b( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.8( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.2( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.7( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.4( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.3( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.a( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.d( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.5( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.6( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.18( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1a( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1d( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1b( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1f( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1e( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.9( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1c( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:02 np0005538513.localdomain sudo[55313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mszaytumnolgoohgslijhskayhinrxky ; /usr/bin/python3
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.0( empty local-lis/les=40/41 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain sudo[55313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:02 np0005538513.localdomain python3[55315]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:02 np0005538513.localdomain sudo[55313]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:02 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:03 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 42 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42 pruub=15.915790558s) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active pruub 1184.430175781s@ mbc={}] start_peering_interval up [2,3,4] -> [2,3,4], acting [2,3,4] -> [2,3,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:03 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=13.975545883s) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 active pruub 1182.489990234s@ mbc={}] start_peering_interval up [0,4,2] -> [0,4,2], acting [0,4,2] -> [0,4,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:03 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=13.971534729s) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1182.489990234s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:03 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 42 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42 pruub=15.915790558s) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.430175781s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain sudo[55329]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grdupuchahmctwmxzlpysieyyzycxoxz ; /usr/bin/python3
Nov 28 08:08:04 np0005538513.localdomain sudo[55329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.12( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.10( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.17( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.16( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.14( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.11( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.15( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.13( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.e( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.3( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.2( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.7( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.8( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.6( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.8( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.5( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.9( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.4( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.19( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.18( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.0( empty local-lis/les=42/43 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:04 np0005538513.localdomain python3[55331]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:04 np0005538513.localdomain sudo[55329]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Nov 28 08:08:04 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Nov 28 08:08:05 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 44 pg[7.0( v 36'39 (0'0,36'39] local-lis/les=33/34 n=22 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=44 pruub=12.864492416s) [1,5,3] r=1 lpr=44 pi=[33,44)/1 luod=0'0 lua=36'37 crt=36'39 lcod 36'38 mlcod 0'0 active pruub 1178.443603516s@ mbc={}] start_peering_interval up [1,5,3] -> [1,5,3], acting [1,5,3] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:05 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 44 pg[7.0( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=44 pruub=12.862508774s) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 lcod 36'38 mlcod 0'0 unknown NOTIFY pruub 1178.443603516s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:05 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Nov 28 08:08:05 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Nov 28 08:08:06 np0005538513.localdomain sudo[55345]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waxidygnlvxcwkoyqrdsmendftfbocfp ; /usr/bin/python3
Nov 28 08:08:06 np0005538513.localdomain sudo[55345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.d( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.2( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.7( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.3( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.4( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.5( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.6( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.c( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.f( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.e( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.9( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.8( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.b( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.a( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:06 np0005538513.localdomain python3[55347]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:06 np0005538513.localdomain sudo[55345]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:07 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Nov 28 08:08:07 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Nov 28 08:08:08 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Nov 28 08:08:08 np0005538513.localdomain sudo[55393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyfllunumkigitqovglfmmobzlstzuwv ; /usr/bin/python3
Nov 28 08:08:08 np0005538513.localdomain sudo[55393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:08 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Nov 28 08:08:08 np0005538513.localdomain python3[55395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:08 np0005538513.localdomain sudo[55393]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:08 np0005538513.localdomain sudo[55436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdvyawjemtogdoppiouazpsqidpgnwhd ; /usr/bin/python3
Nov 28 08:08:08 np0005538513.localdomain sudo[55436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:08 np0005538513.localdomain python3[55438]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317288.1489336-92600-76918684453434/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=98ffd20e3b9db1cae39a950d9da1f69e92796658 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:08 np0005538513.localdomain sudo[55436]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:10 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.18 deep-scrub starts
Nov 28 08:08:10 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.18 deep-scrub ok
Nov 28 08:08:11 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Nov 28 08:08:11 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839219093s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589965820s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,5], acting [2,3,4] -> [0,1,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836143494s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587158203s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836041451s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587158203s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837333679s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588378906s@ mbc={}] start_peering_interval up [2,3,4] -> [1,3,2], acting [2,3,4] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839117050s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589965820s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.845196724s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596191406s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.843434334s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.594848633s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837383270s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [3,5,4], acting [2,3,4] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.844755173s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596191406s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.843380928s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.594848633s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837240219s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588867188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837261200s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588378906s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836681366s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588378906s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836624146s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588378906s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837795258s) [3,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589721680s@ mbc={}] start_peering_interval up [2,3,4] -> [3,2,4], acting [2,3,4] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837754250s) [3,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589721680s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835426331s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587402344s@ mbc={}] start_peering_interval up [0,4,2] -> [5,0,1], acting [0,4,2] -> [5,0,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835368156s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587402344s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838191032s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590209961s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,1], acting [2,3,4] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834941864s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587158203s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834882736s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587158203s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838109016s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.590209961s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834904671s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587280273s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,4], acting [0,4,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837042809s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589477539s@ mbc={}] start_peering_interval up [2,3,4] -> [5,1,3], acting [2,3,4] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834480286s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587036133s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,5], acting [0,4,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836954117s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589477539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836494446s) [1,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589233398s@ mbc={}] start_peering_interval up [2,3,4] -> [1,2,0], acting [2,3,4] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834243774s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587036133s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833994865s) [5,4,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586791992s@ mbc={}] start_peering_interval up [0,4,2] -> [5,4,0], acting [0,4,2] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836399078s) [1,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589233398s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833915710s) [5,4,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586791992s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833595276s) [3,2,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586547852s@ mbc={}] start_peering_interval up [0,4,2] -> [3,2,1], acting [0,4,2] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834869385s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587280273s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.10( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837560654s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590576172s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,0], acting [2,3,4] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.10( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837445259s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.590576172s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.842554092s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595947266s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833251953s) [3,2,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586547852s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835626602s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.842500687s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595947266s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835380554s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588867188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841554642s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595092773s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841343880s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.594970703s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832862854s) [2,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586547852s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,4], acting [2,3,4] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841270447s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.594970703s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832862854s) [2,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.586547852s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834959984s) [2,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,0], acting [2,3,4] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841462135s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595092773s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841094971s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595092773s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841048241s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595092773s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834777832s) [3,4,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [3,4,2], acting [2,3,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834959984s) [2,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.588867188s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841209412s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595336914s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834661484s) [3,4,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588867188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841119766s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595336914s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833509445s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587768555s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833410263s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587768555s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840950966s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595336914s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831896782s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586425781s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831829071s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586425781s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840855598s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595336914s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833280563s) [4,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588012695s@ mbc={}] start_peering_interval up [2,3,4] -> [4,0,2], acting [2,3,4] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833249092s) [4,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588012695s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840673447s) [2,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595458984s@ mbc={}] start_peering_interval up [0,4,2] -> [2,1,3], acting [0,4,2] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840673447s) [2,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.595458984s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834815025s) [1,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589965820s@ mbc={}] start_peering_interval up [2,3,4] -> [1,0,2], acting [2,3,4] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840824127s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595947266s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,2], acting [0,4,2] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840670586s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595825195s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831583977s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586669922s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840625763s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595825195s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831522942s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586669922s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840631485s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595947266s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834583282s) [5,0,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589965820s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,4], acting [2,3,4] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839269638s) [1,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.594848633s@ mbc={}] start_peering_interval up [0,4,2] -> [1,2,3], acting [0,4,2] -> [1,2,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839148521s) [1,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.594848633s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834393501s) [5,0,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589965820s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834494591s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590209961s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840360641s) [5,1,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596191406s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,0], acting [0,4,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834750175s) [1,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589965820s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831048012s) [0,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587036133s@ mbc={}] start_peering_interval up [2,3,4] -> [0,4,5], acting [2,3,4] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834384918s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.590209961s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840190887s) [5,1,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596191406s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830997467s) [0,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587036133s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838831902s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595092773s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838751793s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595092773s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839011192s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595336914s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837341309s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.593750000s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,2], acting [0,4,2] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838896751s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595336914s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833964348s) [2,4,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590576172s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,3], acting [2,3,4] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837283134s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.593750000s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1b( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830921173s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587524414s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,2], acting [2,3,4] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833964348s) [2,4,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.590576172s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830874443s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587524414s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839150429s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596069336s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831374168s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588256836s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839101791s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596069336s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838333130s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595214844s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838268280s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595214844s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831327438s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588256836s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831075668s) [1,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588256836s@ mbc={}] start_peering_interval up [2,3,4] -> [1,5,0], acting [2,3,4] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836670876s) [4,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.593872070s@ mbc={}] start_peering_interval up [0,4,2] -> [4,0,2], acting [0,4,2] -> [4,0,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.981576920s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734985352s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,5], acting [5,4,0] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831021309s) [1,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588256836s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836584091s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.593872070s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836582184s) [4,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.593872070s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.981451035s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734985352s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836521149s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.593872070s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838540077s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596069336s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838509560s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596069336s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830760002s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588256836s@ mbc={}] start_peering_interval up [2,3,4] -> [0,2,4], acting [2,3,4] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831855774s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589477539s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,0], acting [2,3,4] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838161469s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595825195s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,3], acting [0,4,2] -> [5,1,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831745148s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589477539s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,5], acting [2,3,4] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.980996132s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734741211s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838104248s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595825195s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831678391s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589477539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831061363s) [2,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,1], acting [2,3,4] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.980996132s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.734741211s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831061363s) [2,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.588867188s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837743759s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595581055s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830633163s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588745117s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837689400s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595581055s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978971481s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733276367s@ mbc={}] start_peering_interval up [5,4,0] -> [5,3,4], acting [5,4,0] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830577850s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588745117s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978971481s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733276367s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975111961s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.729736328s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830696106s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588256836s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831474304s) [4,5,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589599609s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,3], acting [2,3,4] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831237793s) [4,5,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589599609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828446388s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586547852s@ mbc={}] start_peering_interval up [0,4,2] -> [5,3,4], acting [0,4,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828015327s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586547852s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831803322s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589477539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,0,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975111961s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.729736328s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1e( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979107857s) [5,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [5,1,3], acting [5,4,0] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979107857s) [5,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.734375000s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979057312s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978583336s) [0,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,2], acting [5,4,0] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978943825s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978516579s) [0,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.b( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979392052s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.736083984s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979353905s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.736083984s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975153923s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732910156s@ mbc={}] start_peering_interval up [5,4,0] -> [2,1,0], acting [5,4,0] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975037575s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732910156s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974172592s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732666016s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.12( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974113464s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732666016s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973925591s) [0,1,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,2], acting [5,4,0] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973986626s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973930359s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973855972s) [3,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733520508s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,2], acting [3,4,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026166916s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.785888672s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973798752s) [3,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733520508s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026166916s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.785888672s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030043602s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.789794922s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030009270s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.789794922s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972895622s) [1,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,5], acting [5,4,0] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972863197s) [1,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973284721s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973226547s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973079681s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030028343s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790405273s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029592514s) [5,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790039062s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029592514s) [5,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.790039062s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029875755s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.790405273s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972949028s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972229958s) [1,2,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732910156s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,0], acting [5,4,0] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972191811s) [1,2,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732910156s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973726273s) [0,1,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972481728s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.19( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972426414s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025187492s) [4,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.786132812s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,0], acting [4,5,3] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025151253s) [4,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.786132812s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029186249s) [2,4,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790527344s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,3], acting [4,5,3] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972023964s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.10( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029132843s) [2,4,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.790527344s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972023964s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733398438s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028998375s) [5,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790527344s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028998375s) [5,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.790527344s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972898483s) [3,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734497070s@ mbc={}] start_peering_interval up [3,4,5] -> [3,1,5], acting [3,4,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971303940s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732910156s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972848892s) [3,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734497070s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029056549s) [4,3,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790893555s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,2], acting [4,5,3] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972228050s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029020309s) [4,3,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.790893555s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972180367s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972114563s) [1,5,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,3], acting [5,4,0] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971117973s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972062111s) [1,5,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971092224s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733154297s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971239090s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732910156s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029185295s) [2,0,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791259766s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029110909s) [2,0,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791259766s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971055984s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971055984s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733398438s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1f( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028552055s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791015625s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028477669s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791015625s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971155167s) [2,4,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [2,4,0], acting [5,4,0] -> [2,4,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.3( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970267296s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [5,4,3], acting [3,4,5] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028509140s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791381836s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971103668s) [2,4,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970267296s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733154297s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028468132s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791381836s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.862487793s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.625488281s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.862393379s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.625488281s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971565247s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734619141s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,0], acting [5,4,0] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971468925s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734619141s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028242111s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791503906s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971169472s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734497070s@ mbc={}] start_peering_interval up [3,4,5] -> [0,1,5], acting [3,4,5] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.2( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028242111s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.791503906s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971089363s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734497070s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.861575127s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.625000000s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.861518860s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.625000000s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969545364s) [1,0,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733276367s@ mbc={}] start_peering_interval up [3,4,5] -> [1,0,2], acting [3,4,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970258713s) [1,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733886719s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,3], acting [5,4,0] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027631760s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791381836s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027595520s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791381836s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969427109s) [1,0,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733276367s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970044136s) [1,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733886719s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968944550s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027610779s) [5,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791625977s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968889236s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027610779s) [5,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.791625977s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970264435s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970205307s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.860547066s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.624877930s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.860388756s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.624877930s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968656540s) [5,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733276367s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,1], acting [3,4,5] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968656540s) [5,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733276367s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026748657s) [2,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791381836s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026714325s) [2,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791381836s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968041420s) [4,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,2], acting [3,4,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967896461s) [4,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969691277s) [3,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734619141s@ mbc={}] start_peering_interval up [5,4,0] -> [3,4,5], acting [5,4,0] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968936920s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733886719s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027315140s) [3,4,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792358398s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969641685s) [3,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734619141s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967989922s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [1,5,0], acting [3,4,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969229698s) [2,0,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [2,0,4], acting [5,4,0] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027246475s) [3,4,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792358398s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967956543s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733154297s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969168663s) [2,0,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026410103s) [4,3,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791625977s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.859489441s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.624877930s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026229858s) [4,3,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791625977s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.859434128s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.624877930s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966850281s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732421875s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,4], acting [3,4,5] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966850281s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.732421875s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968319893s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733886719s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966467857s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732177734s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966419220s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732177734s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025950432s) [3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791748047s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025921822s) [3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791748047s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.855010033s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.621093750s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968351364s) [3,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,4], acting [5,4,0] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965216637s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731445312s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025517464s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791625977s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965181351s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731445312s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025468826s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791625977s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026338577s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967695236s) [3,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,1], acting [5,4,0] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.854744911s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.621093750s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967777252s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734252930s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967906952s) [3,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967460632s) [3,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026021004s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792724609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967555046s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734252930s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.854517937s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.621337891s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965051651s) [0,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731933594s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,1], acting [3,4,5] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.854477882s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.621337891s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965014458s) [0,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731933594s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025288582s) [1,0,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792236328s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967137337s) [4,0,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734130859s@ mbc={}] start_peering_interval up [5,4,0] -> [4,0,5], acting [5,4,0] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025241852s) [1,0,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792236328s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853694916s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.620727539s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966079712s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025568962s) [3,2,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792602539s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967079163s) [4,0,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734130859s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966045380s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733154297s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025537491s) [3,2,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792602539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853642464s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.620727539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966808319s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966762543s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024511337s) [4,2,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791870117s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024435043s) [4,2,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791870117s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853452682s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.620971680s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964606285s) [4,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732177734s@ mbc={}] start_peering_interval up [3,4,5] -> [4,5,0], acting [3,4,5] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853414536s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.620971680s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024182320s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791748047s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964551926s) [4,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732177734s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024150848s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791748047s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964995384s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,0], acting [3,4,5] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964865685s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732666016s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964897156s) [2,3,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732666016s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,4], acting [3,4,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964949608s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024235725s) [3,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792236328s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964830399s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732666016s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024185181s) [3,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792236328s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964869499s) [2,3,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732666016s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963286400s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731445312s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963726044s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731933594s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963196754s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731445312s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024097443s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792358398s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024379730s) [5,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,3], acting [4,5,3] -> [5,4,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963662148s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731933594s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.962235451s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.730468750s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024379730s) [5,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.792724609s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.962183952s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.730468750s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024049759s) [2,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792602539s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,0], acting [4,5,3] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.961515427s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.730224609s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023998260s) [2,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792602539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966394424s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.735107422s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023966789s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792358398s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.961445808s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.730224609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023722649s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792602539s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.959779739s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.728759766s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966338158s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.735107422s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.959726334s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.728759766s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023674965s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792602539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960644722s) [2,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.729858398s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,1], acting [3,4,5] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023437500s) [3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023385048s) [3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792724609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960531235s) [2,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.729858398s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965315819s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734985352s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,4], acting [5,4,0] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.022989273s) [0,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965266228s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734985352s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.022940636s) [0,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792724609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538513.localdomain sudo[55498]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osjbrfhzsnberiypditzbfnykbvnldbn ; /usr/bin/python3
Nov 28 08:08:13 np0005538513.localdomain sudo[55498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:13 np0005538513.localdomain python3[55500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:13 np0005538513.localdomain sudo[55498]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:14 np0005538513.localdomain sudo[55541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gomazuaojndyclrinkqnaazwpwdmjing ; /usr/bin/python3
Nov 28 08:08:14 np0005538513.localdomain sudo[55541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.b( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.e( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,2,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,4,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [0,4,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,2,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.f( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.14( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,2,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,3,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,2,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.4( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.c( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.1d( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,4,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,2,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.16( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,2,1] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.c( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.6( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,4,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain python3[55543]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317293.6414745-92600-22399435218611/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=880d8421ed22fd6e089f5c7c842f51482074b0c0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,4,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.19( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,1,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.3( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.7( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.1( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.d( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,2,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.4( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.1c( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,4,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,0,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,0,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.d( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.14( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.13( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.1f( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.10( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.19( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.b( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.12( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.17( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,4,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain sudo[55541]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.6( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.1a( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.15( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.12( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.9( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.1b( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1f( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,0,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.10( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.e( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,4,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.d( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.3( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1d( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.19( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.c( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.8( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.795694351s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.620849609s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.800373077s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.625488281s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799404144s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.624633789s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.800303459s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.625488281s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799332619s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.624633789s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799490929s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.624755859s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799461365s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.624755859s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:15 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.795141220s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.620849609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:19 np0005538513.localdomain sudo[55604]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-outwxbjgxsuokblqteguxmojrgokmbsu ; /usr/bin/python3
Nov 28 08:08:19 np0005538513.localdomain sudo[55604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:19 np0005538513.localdomain python3[55606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:19 np0005538513.localdomain sudo[55604]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:19 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.11 deep-scrub starts
Nov 28 08:08:19 np0005538513.localdomain sudo[55647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sptrxnbopqktuilaewljaqlllfmrtvih ; /usr/bin/python3
Nov 28 08:08:19 np0005538513.localdomain sudo[55647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:19 np0005538513.localdomain python3[55649]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317299.1397874-92600-218611359240169/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=3f1634d98b90f8c800fba4d3a33fb1546a043fff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:19 np0005538513.localdomain sudo[55647]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:22 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Nov 28 08:08:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:08:22 np0005538513.localdomain systemd[1]: tmp-crun.bzP97W.mount: Deactivated successfully.
Nov 28 08:08:22 np0005538513.localdomain podman[55664]: 2025-11-28 08:08:22.852700776 +0000 UTC m=+0.090409105 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:08:23 np0005538513.localdomain podman[55664]: 2025-11-28 08:08:23.046369528 +0000 UTC m=+0.284077857 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, version=17.1.12)
Nov 28 08:08:23 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=14.753958702s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.783691406s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758782387s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.788940430s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758728027s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.788940430s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.763359070s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.793457031s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758196831s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.788574219s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.763068199s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.793457031s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758165359s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.788574219s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:23 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=14.753871918s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.783691406s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:25 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537527084s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1198.625610352s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:25 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537282944s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1198.625366211s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:25 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537391663s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1198.625610352s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:25 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537203789s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1198.625366211s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:25 np0005538513.localdomain sudo[55740]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlxvjkjjjyalycvjapkxtakxaweigpap ; /usr/bin/python3
Nov 28 08:08:25 np0005538513.localdomain sudo[55740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:26 np0005538513.localdomain python3[55742]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:26 np0005538513.localdomain sudo[55740]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:26 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Nov 28 08:08:26 np0005538513.localdomain sudo[55785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdpqusjvlkkeziwqjqtdigdwaspswcda ; /usr/bin/python3
Nov 28 08:08:26 np0005538513.localdomain sudo[55785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:26 np0005538513.localdomain python3[55787]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317305.7807572-93166-33917102108375/source _original_basename=tmp8_keiueq follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:26 np0005538513.localdomain sudo[55785]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:27 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 52 pg[7.4( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52) [0,1,2] r=2 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:27 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 52 pg[7.c( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52) [0,1,2] r=2 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:27 np0005538513.localdomain sudo[55847]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcepecodinzxoouvtsmqnqrftuqdzqao ; /usr/bin/python3
Nov 28 08:08:27 np0005538513.localdomain sudo[55847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:27 np0005538513.localdomain python3[55849]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:27 np0005538513.localdomain sudo[55847]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:28 np0005538513.localdomain sudo[55890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiwtqphwuprthpncxbowcsmdkhihajrq ; /usr/bin/python3
Nov 28 08:08:28 np0005538513.localdomain sudo[55890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:28 np0005538513.localdomain python3[55892]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317307.4584112-93255-250670734165250/source _original_basename=tmpzgnqv8_t follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:28 np0005538513.localdomain sudo[55890]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:28 np0005538513.localdomain sudo[55920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mamtvlbpfgydltabixaigslckqpwabcq ; /usr/bin/python3
Nov 28 08:08:28 np0005538513.localdomain sudo[55920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:29 np0005538513.localdomain python3[55922]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Nov 28 08:08:29 np0005538513.localdomain crontab[55923]: (root) LIST (root)
Nov 28 08:08:29 np0005538513.localdomain crontab[55924]: (root) REPLACE (root)
Nov 28 08:08:29 np0005538513.localdomain sudo[55920]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:29 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Nov 28 08:08:29 np0005538513.localdomain sudo[55938]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxttbyexghdtmjflspqsyxxerupahuri ; /usr/bin/python3
Nov 28 08:08:29 np0005538513.localdomain sudo[55938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:29 np0005538513.localdomain python3[55940]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:08:29 np0005538513.localdomain sudo[55938]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:29 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Nov 28 08:08:29 np0005538513.localdomain sudo[55988]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oykrovglsakunhjgrvtkennunrskrqmx ; /usr/bin/python3
Nov 28 08:08:29 np0005538513.localdomain sudo[55988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:30 np0005538513.localdomain sudo[55988]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:30 np0005538513.localdomain sudo[56006]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqjdbwbaeflzdnpfwzdrygynepblrdfe ; /usr/bin/python3
Nov 28 08:08:30 np0005538513.localdomain sudo[56006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:30 np0005538513.localdomain sudo[56006]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:30 np0005538513.localdomain sudo[56110]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouduigkofbqaglnwjgsfffoimfhwiiku ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317310.48299-93338-213523086718522/async_wrapper.py 478729855719 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317310.48299-93338-213523086718522/AnsiballZ_command.py _
Nov 28 08:08:30 np0005538513.localdomain sudo[56110]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:08:31 np0005538513.localdomain ansible-async_wrapper.py[56112]: Invoked with 478729855719 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317310.48299-93338-213523086718522/AnsiballZ_command.py _
Nov 28 08:08:31 np0005538513.localdomain ansible-async_wrapper.py[56115]: Starting module and watcher
Nov 28 08:08:31 np0005538513.localdomain ansible-async_wrapper.py[56115]: Start watching 56116 (3600)
Nov 28 08:08:31 np0005538513.localdomain ansible-async_wrapper.py[56116]: Start module (56116)
Nov 28 08:08:31 np0005538513.localdomain ansible-async_wrapper.py[56112]: Return async_wrapper task started.
Nov 28 08:08:31 np0005538513.localdomain sudo[56110]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:31 np0005538513.localdomain sudo[56132]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcvlfriwztfkavrgrduaruvkfavaksdj ; /usr/bin/python3
Nov 28 08:08:31 np0005538513.localdomain sudo[56132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:31 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.d scrub starts
Nov 28 08:08:31 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.d scrub ok
Nov 28 08:08:31 np0005538513.localdomain python3[56136]: ansible-ansible.legacy.async_status Invoked with jid=478729855719.56112 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:08:31 np0005538513.localdomain sudo[56132]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:32 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.e scrub starts
Nov 28 08:08:32 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.e scrub ok
Nov 28 08:08:32 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.465290070s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1211.793701172s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:32 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.465290070s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 unknown pruub 1211.793701172s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:32 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.464475632s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1211.793334961s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:32 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.464475632s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 unknown pruub 1211.793334961s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:33 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Nov 28 08:08:33 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 55 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 active+degraded mbc={255={(2+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 55 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 active+degraded mbc={255={(2+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.1a deep-scrub starts
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    (file & line not available)
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    (file & line not available)
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.10 seconds
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Notice: Applied catalog in 0.03 seconds
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Application:
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    Initial environment: production
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    Converged environment: production
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:          Run mode: user
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Changes:
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Events:
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Resources:
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:             Total: 10
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Time:
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:          Schedule: 0.00
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:              File: 0.00
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:              Exec: 0.01
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:            Augeas: 0.01
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    Transaction evaluation: 0.03
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    Catalog application: 0.03
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:    Config retrieval: 0.13
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:          Last run: 1764317314
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:        Filebucket: 0.00
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:             Total: 0.04
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]: Version:
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:            Config: 1764317314
Nov 28 08:08:34 np0005538513.localdomain puppet-user[56134]:            Puppet: 7.10.0
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.1a deep-scrub ok
Nov 28 08:08:34 np0005538513.localdomain ansible-async_wrapper.py[56116]: Module complete (56116)
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4244 writes, 20K keys, 4244 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4244 writes, 311 syncs, 13.65 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 990 writes, 3824 keys, 990 commit groups, 1.0 writes per commit group, ingest: 1.67 MB, 0.00 MB/s
                                                          Interval WAL: 990 writes, 168 syncs, 5.89 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=48/49 n=2 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480182648s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1208.854370117s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=48/49 n=2 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480103493s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1208.854370117s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480300903s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1208.854492188s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:34 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480031013s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1208.854492188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:35 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.e deep-scrub starts
Nov 28 08:08:35 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.e deep-scrub ok
Nov 28 08:08:36 np0005538513.localdomain ansible-async_wrapper.py[56115]: Done in kid B.
Nov 28 08:08:36 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Nov 28 08:08:37 np0005538513.localdomain sudo[56248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:08:37 np0005538513.localdomain sudo[56248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:37 np0005538513.localdomain sudo[56248]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:37 np0005538513.localdomain sudo[56263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:08:37 np0005538513.localdomain sudo[56263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:38 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Nov 28 08:08:38 np0005538513.localdomain sudo[56263]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:38 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.b scrub starts
Nov 28 08:08:38 np0005538513.localdomain sudo[56299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:08:38 np0005538513.localdomain sudo[56299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:38 np0005538513.localdomain sudo[56299]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:38 np0005538513.localdomain sudo[56314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:08:38 np0005538513.localdomain sudo[56314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:38 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Nov 28 08:08:39 np0005538513.localdomain sudo[56314]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:39 np0005538513.localdomain sudo[56360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:08:39 np0005538513.localdomain sudo[56360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:39 np0005538513.localdomain sudo[56360]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:08:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Cumulative writes: 4737 writes, 21K keys, 4737 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4737 writes, 417 syncs, 11.36 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1354 writes, 5212 keys, 1354 commit groups, 1.0 writes per commit group, ingest: 2.02 MB, 0.00 MB/s
                                                          Interval WAL: 1354 writes, 222 syncs, 6.10 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.3 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 08:08:40 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Nov 28 08:08:40 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Nov 28 08:08:41 np0005538513.localdomain sudo[56388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oodzluotcvautfwvckbuzycalrtpeydt ; /usr/bin/python3
Nov 28 08:08:41 np0005538513.localdomain sudo[56388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:41 np0005538513.localdomain python3[56390]: ansible-ansible.legacy.async_status Invoked with jid=478729855719.56112 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:08:41 np0005538513.localdomain sudo[56388]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:42 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Nov 28 08:08:42 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Nov 28 08:08:42 np0005538513.localdomain sudo[56404]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpcpzxqvjjguqgybwgmsfnamtjjbksbx ; /usr/bin/python3
Nov 28 08:08:42 np0005538513.localdomain sudo[56404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:42 np0005538513.localdomain python3[56406]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:08:42 np0005538513.localdomain sudo[56404]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:42 np0005538513.localdomain sudo[56420]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrgobpyrplnpqirecmjhivcsiezybtgo ; /usr/bin/python3
Nov 28 08:08:42 np0005538513.localdomain sudo[56420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:42 np0005538513.localdomain python3[56422]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:08:42 np0005538513.localdomain sudo[56420]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:43 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1e deep-scrub starts
Nov 28 08:08:43 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1e deep-scrub ok
Nov 28 08:08:43 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.559302330s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1222.071411133s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:43 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.558609962s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1222.071411133s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:43 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.554260254s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1222.067749023s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:43 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.554097176s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1222.067749023s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:43 np0005538513.localdomain sudo[56470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uydesdfwynxdtoylyrzepwzqxttdppru ; /usr/bin/python3
Nov 28 08:08:43 np0005538513.localdomain sudo[56470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:43 np0005538513.localdomain python3[56472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:43 np0005538513.localdomain sudo[56470]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:43 np0005538513.localdomain sudo[56488]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruxwjmskdkdjjpjsmsmijfayokduezwf ; /usr/bin/python3
Nov 28 08:08:43 np0005538513.localdomain sudo[56488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:43 np0005538513.localdomain python3[56490]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpcfhnpj94 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:08:43 np0005538513.localdomain sudo[56488]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:43 np0005538513.localdomain sudo[56518]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyaszddorjwizcsumlxorbxwmdxnizvv ; /usr/bin/python3
Nov 28 08:08:43 np0005538513.localdomain sudo[56518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:44 np0005538513.localdomain python3[56520]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:44 np0005538513.localdomain sudo[56518]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:44 np0005538513.localdomain sudo[56534]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naawublfiqzvzgxqgueeabgweuqmosjy ; /usr/bin/python3
Nov 28 08:08:44 np0005538513.localdomain sudo[56534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:44 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 58 pg[7.f( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=1 lpr=58 pi=[50,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:44 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 58 pg[7.7( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=1 lpr=58 pi=[50,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:44 np0005538513.localdomain sudo[56534]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:44 np0005538513.localdomain sudo[56621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvmmklksgimmccneoclaegncvntevoji ; /usr/bin/python3
Nov 28 08:08:44 np0005538513.localdomain sudo[56621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:45 np0005538513.localdomain python3[56623]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:08:45 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 60 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=9.058769226s) [3,4,5] r=2 lpr=60 pi=[44,60)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1214.625488281s@ mbc={}] start_peering_interval up [1,5,3] -> [3,4,5], acting [1,5,3] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:45 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 60 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=9.058665276s) [3,4,5] r=2 lpr=60 pi=[44,60)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1214.625488281s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:45 np0005538513.localdomain sudo[56621]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:45 np0005538513.localdomain sudo[56640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxaumwbhxvejjxlghaabigcnokeiodpp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:45 np0005538513.localdomain sudo[56640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:45 np0005538513.localdomain python3[56642]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:45 np0005538513.localdomain sudo[56640]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:46 np0005538513.localdomain sudo[56656]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzairfsqymavhrbkelvsdgsvwjsaoicw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:46 np0005538513.localdomain sudo[56656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:46 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Nov 28 08:08:46 np0005538513.localdomain sudo[56656]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:46 np0005538513.localdomain sudo[56672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtkffbyeltpcdeunfqutdnztzwohbbjw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:46 np0005538513.localdomain sudo[56672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:46 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Nov 28 08:08:46 np0005538513.localdomain python3[56674]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:08:46 np0005538513.localdomain sudo[56672]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:47 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Nov 28 08:08:47 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Nov 28 08:08:47 np0005538513.localdomain sudo[56722]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spyddxsvdchtykrzbicexilktkldvbym ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:47 np0005538513.localdomain sudo[56722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:47 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 62 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=15.181772232s) [0,2,4] r=1 lpr=62 pi=[46,62)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1227.784179688s@ mbc={}] start_peering_interval up [4,2,3] -> [0,2,4], acting [4,2,3] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:47 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 62 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=15.181417465s) [0,2,4] r=1 lpr=62 pi=[46,62)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1227.784179688s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:47 np0005538513.localdomain python3[56724]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:47 np0005538513.localdomain sudo[56722]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:47 np0005538513.localdomain sudo[56740]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weowddtgglxxyacjdyszmcgkcqtjgixf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:47 np0005538513.localdomain sudo[56740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:47 np0005538513.localdomain python3[56742]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:47 np0005538513.localdomain sudo[56740]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:47 np0005538513.localdomain sudo[56802]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gasiqjdzgthkucddrkoyonijfizvrvbz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:47 np0005538513.localdomain sudo[56802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:48 np0005538513.localdomain python3[56804]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:48 np0005538513.localdomain sudo[56802]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:48 np0005538513.localdomain sudo[56820]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwgssejjbmohrvtmyqpffaricponkpux ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:48 np0005538513.localdomain sudo[56820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:48 np0005538513.localdomain python3[56822]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:48 np0005538513.localdomain sudo[56820]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:48 np0005538513.localdomain sudo[56882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qctqfdyrrpccnyljqhdzytdkefqmulzh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:48 np0005538513.localdomain sudo[56882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:48 np0005538513.localdomain python3[56884]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:48 np0005538513.localdomain sudo[56882]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:49 np0005538513.localdomain sudo[56900]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnuvtocrpbcobgmkzxwocfdulfiehopl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:49 np0005538513.localdomain sudo[56900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:49 np0005538513.localdomain python3[56902]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:49 np0005538513.localdomain sudo[56900]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:49 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 64 pg[7.a( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64) [2,0,4] r=0 lpr=64 pi=[48,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:49 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 64 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=15.188872337s) [2,0,4] r=-1 lpr=64 pi=[48,64)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1224.851440430s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:49 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 64 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=15.188785553s) [2,0,4] r=-1 lpr=64 pi=[48,64)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1224.851440430s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:49 np0005538513.localdomain sudo[56962]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqdnuogsirqbcruxpructtvnxpjggdzf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:49 np0005538513.localdomain sudo[56962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:49 np0005538513.localdomain python3[56964]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:49 np0005538513.localdomain sudo[56962]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:49 np0005538513.localdomain sudo[56980]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdxovzgdyfgzzzbwfubrjibutcngkniu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:49 np0005538513.localdomain sudo[56980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:50 np0005538513.localdomain python3[56982]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:50 np0005538513.localdomain sudo[56980]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:50 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 65 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=64/65 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64) [2,0,4] r=0 lpr=64 pi=[48,64)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:50 np0005538513.localdomain sudo[57010]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jukrdplgyvmufbzauxjkfpstccshoxhs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:50 np0005538513.localdomain sudo[57010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:50 np0005538513.localdomain python3[57012]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:08:50 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:08:50 np0005538513.localdomain systemd-rc-local-generator[57035]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:08:50 np0005538513.localdomain systemd-sysv-generator[57041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:08:50 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:08:50 np0005538513.localdomain sudo[57010]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:51 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.d scrub starts
Nov 28 08:08:51 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.d scrub ok
Nov 28 08:08:51 np0005538513.localdomain sudo[57095]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mydfrpugiocwyyidxnmwwxucpfmzyado ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:51 np0005538513.localdomain sudo[57095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:51 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 66 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=13.378162384s) [3,1,2] r=2 lpr=66 pi=[50,66)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1230.071655273s@ mbc={}] start_peering_interval up [3,4,2] -> [3,1,2], acting [3,4,2] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:51 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 66 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=13.378087044s) [3,1,2] r=2 lpr=66 pi=[50,66)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1230.071655273s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:51 np0005538513.localdomain python3[57097]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:51 np0005538513.localdomain sudo[57095]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:51 np0005538513.localdomain sudo[57113]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qymevwnwdwkdhlzkuohrldrcrpmoxdsh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:51 np0005538513.localdomain sudo[57113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:51 np0005538513.localdomain python3[57115]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:51 np0005538513.localdomain sudo[57113]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:52 np0005538513.localdomain sudo[57175]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbmvrfossezaezyhyuwwwoyroaxvevfo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:52 np0005538513.localdomain sudo[57175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:52 np0005538513.localdomain python3[57177]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:52 np0005538513.localdomain sudo[57175]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:52 np0005538513.localdomain sudo[57193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pinzkpuojwfrbfsqnzzppictqbekyogi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:52 np0005538513.localdomain sudo[57193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:52 np0005538513.localdomain python3[57195]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:52 np0005538513.localdomain sudo[57193]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:52 np0005538513.localdomain sudo[57223]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pejnmaagmdrlklnkwprkoqutljhudjfp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:52 np0005538513.localdomain sudo[57223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:53 np0005538513.localdomain python3[57225]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:08:53 np0005538513.localdomain systemd-sysv-generator[57267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:08:53 np0005538513.localdomain systemd-rc-local-generator[57264]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:08:53 np0005538513.localdomain podman[57227]: 2025-11-28 08:08:53.191597834 +0000 UTC m=+0.108058007 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:08:53 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Nov 28 08:08:53 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Nov 28 08:08:53 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 68 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=52/53 n=1 ec=44/33 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=13.965437889s) [1,3,2] r=2 lpr=68 pi=[52,68)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1232.697998047s@ mbc={}] start_peering_interval up [0,1,2] -> [1,3,2], acting [0,1,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:53 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 68 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=52/53 n=1 ec=44/33 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=13.965130806s) [1,3,2] r=2 lpr=68 pi=[52,68)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1232.697998047s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:08:53 np0005538513.localdomain podman[57227]: 2025-11-28 08:08:53.438571377 +0000 UTC m=+0.355031590 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044)
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:08:53 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:08:53 np0005538513.localdomain sudo[57223]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:53 np0005538513.localdomain sudo[57312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abmvybgcegbhlftpridnsziruqxzczmw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:53 np0005538513.localdomain sudo[57312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:53 np0005538513.localdomain python3[57314]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:08:54 np0005538513.localdomain sudo[57312]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:54 np0005538513.localdomain sudo[57328]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzmzxsrjcbobilqzbnpcuhygntbrqizm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:54 np0005538513.localdomain sudo[57328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:54 np0005538513.localdomain sudo[57328]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:55 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Nov 28 08:08:55 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Nov 28 08:08:55 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 70 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70 pruub=10.763648033s) [1,3,5] r=-1 lpr=70 pi=[54,70)/1 crt=36'39 mlcod 0'0 active pruub 1231.589477539s@ mbc={255={}}] start_peering_interval up [2,0,4] -> [1,3,5], acting [2,0,4] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:55 np0005538513.localdomain ceph-osd[31557]: osd.2 pg_epoch: 70 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70 pruub=10.763505936s) [1,3,5] r=-1 lpr=70 pi=[54,70)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1231.589477539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:55 np0005538513.localdomain sudo[57370]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcouuturkvbcnqkeaeduixdxiscwzfqn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:55 np0005538513.localdomain sudo[57370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:56 np0005538513.localdomain python3[57372]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:08:56 np0005538513.localdomain podman[57448]: 2025-11-28 08:08:56.417422853 +0000 UTC m=+0.082531379 container create ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 28 08:08:56 np0005538513.localdomain podman[57449]: 2025-11-28 08:08:56.446815398 +0000 UTC m=+0.104110048 container create ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute_init_log, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step2, com.redhat.component=openstack-nova-compute-container, release=1761123044, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: Started libpod-conmon-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb.scope.
Nov 28 08:08:56 np0005538513.localdomain podman[57448]: 2025-11-28 08:08:56.372694435 +0000 UTC m=+0.037803051 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: Started libpod-conmon-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a.scope.
Nov 28 08:08:56 np0005538513.localdomain podman[57449]: 2025-11-28 08:08:56.375874291 +0000 UTC m=+0.033168971 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:56 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1223dc0c4e426b2bfe915961a21ef517e6d9229ac1981a406c76a3ee9d07ac7d/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:56 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc38caeb419d32ecfc51d7c0645586de2c8ada37c45a7d9cd6676aef2fc0ca1f/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:56 np0005538513.localdomain podman[57449]: 2025-11-28 08:08:56.497434454 +0000 UTC m=+0.154729104 container init ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:08:56 np0005538513.localdomain podman[57449]: 2025-11-28 08:08:56.504684373 +0000 UTC m=+0.161979013 container start ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:08:56 np0005538513.localdomain python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: libpod-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a.scope: Deactivated successfully.
Nov 28 08:08:56 np0005538513.localdomain podman[57448]: 2025-11-28 08:08:56.547218774 +0000 UTC m=+0.212327320 container init ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:08:56 np0005538513.localdomain podman[57485]: 2025-11-28 08:08:56.563770633 +0000 UTC m=+0.041260114 container died ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step2, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log)
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: libpod-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb.scope: Deactivated successfully.
Nov 28 08:08:56 np0005538513.localdomain podman[57485]: 2025-11-28 08:08:56.594125998 +0000 UTC m=+0.071615439 container cleanup ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: libpod-conmon-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a.scope: Deactivated successfully.
Nov 28 08:08:56 np0005538513.localdomain podman[57448]: 2025-11-28 08:08:56.606567003 +0000 UTC m=+0.271675519 container start ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, container_name=nova_virtqemud_init_logs, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:08:56 np0005538513.localdomain python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Nov 28 08:08:56 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 70 pg[7.d( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70) [1,3,5] r=2 lpr=70 pi=[54,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:56 np0005538513.localdomain podman[57507]: 2025-11-28 08:08:56.64264168 +0000 UTC m=+0.064138004 container died ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtqemud_init_logs, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z)
Nov 28 08:08:56 np0005538513.localdomain podman[57507]: 2025-11-28 08:08:56.766155053 +0000 UTC m=+0.187651347 container cleanup ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step2, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, container_name=nova_virtqemud_init_logs)
Nov 28 08:08:56 np0005538513.localdomain systemd[1]: libpod-conmon-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb.scope: Deactivated successfully.
Nov 28 08:08:57 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Nov 28 08:08:57 np0005538513.localdomain podman[57630]: 2025-11-28 08:08:57.165708354 +0000 UTC m=+0.087931580 container create ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, architecture=x86_64, config_id=tripleo_step2, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: Started libpod-conmon-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope.
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:57 np0005538513.localdomain podman[57629]: 2025-11-28 08:08:57.112872902 +0000 UTC m=+0.043504712 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:08:57 np0005538513.localdomain podman[57630]: 2025-11-28 08:08:57.115750259 +0000 UTC m=+0.037973545 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:08:57 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ffc9017578f246ce8b4d7e2db3f29d809acca5f23dc5c91669a411567041b2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:57 np0005538513.localdomain podman[57630]: 2025-11-28 08:08:57.224342372 +0000 UTC m=+0.146565638 container init ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1761123044, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true)
Nov 28 08:08:57 np0005538513.localdomain podman[57630]: 2025-11-28 08:08:57.233809507 +0000 UTC m=+0.156032743 container start ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_step2, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:08:57 np0005538513.localdomain podman[57630]: 2025-11-28 08:08:57.234224769 +0000 UTC m=+0.156448025 container attach ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']})
Nov 28 08:08:57 np0005538513.localdomain podman[57629]: 2025-11-28 08:08:57.275614657 +0000 UTC m=+0.206246417 container create 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, version=17.1.12, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2)
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: Started libpod-conmon-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope.
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:57 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:57 np0005538513.localdomain podman[57629]: 2025-11-28 08:08:57.335870433 +0000 UTC m=+0.266502223 container init 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z)
Nov 28 08:08:57 np0005538513.localdomain podman[57629]: 2025-11-28 08:08:57.344129722 +0000 UTC m=+0.274761502 container start 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=create_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z)
Nov 28 08:08:57 np0005538513.localdomain podman[57629]: 2025-11-28 08:08:57.34440121 +0000 UTC m=+0.275033030 container attach 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, container_name=create_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com)
Nov 28 08:08:57 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cc38caeb419d32ecfc51d7c0645586de2c8ada37c45a7d9cd6676aef2fc0ca1f-merged.mount: Deactivated successfully.
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a-userdata-shm.mount: Deactivated successfully.
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1223dc0c4e426b2bfe915961a21ef517e6d9229ac1981a406c76a3ee9d07ac7d-merged.mount: Deactivated successfully.
Nov 28 08:08:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb-userdata-shm.mount: Deactivated successfully.
Nov 28 08:08:58 np0005538513.localdomain ovs-vsctl[57756]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Nov 28 08:08:59 np0005538513.localdomain systemd[1]: libpod-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope: Deactivated successfully.
Nov 28 08:08:59 np0005538513.localdomain systemd[1]: libpod-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope: Consumed 2.016s CPU time.
Nov 28 08:08:59 np0005538513.localdomain podman[57629]: 2025-11-28 08:08:59.353108928 +0000 UTC m=+2.283740708 container died 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, tcib_managed=true, container_name=create_virtlogd_wrapper, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, batch=17.1_20251118.1, distribution-scope=public)
Nov 28 08:08:59 np0005538513.localdomain systemd[1]: tmp-crun.rg4dLc.mount: Deactivated successfully.
Nov 28 08:08:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71-userdata-shm.mount: Deactivated successfully.
Nov 28 08:08:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274-merged.mount: Deactivated successfully.
Nov 28 08:08:59 np0005538513.localdomain podman[57883]: 2025-11-28 08:08:59.451138672 +0000 UTC m=+0.088473698 container cleanup 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, 
container_name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:08:59 np0005538513.localdomain systemd[1]: libpod-conmon-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope: Deactivated successfully.
Nov 28 08:08:59 np0005538513.localdomain python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Nov 28 08:09:00 np0005538513.localdomain systemd[1]: libpod-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope: Deactivated successfully.
Nov 28 08:09:00 np0005538513.localdomain systemd[1]: libpod-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope: Consumed 2.132s CPU time.
Nov 28 08:09:00 np0005538513.localdomain podman[57630]: 2025-11-28 08:09:00.274673462 +0000 UTC m=+3.196896698 container died ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step2, io.openshift.expose-services=, container_name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:09:00 np0005538513.localdomain podman[57923]: 2025-11-28 08:09:00.347629291 +0000 UTC m=+0.063980260 container cleanup ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=create_haproxy_wrapper, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:09:00 np0005538513.localdomain systemd[1]: libpod-conmon-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope: Deactivated successfully.
Nov 28 08:09:00 np0005538513.localdomain python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Nov 28 08:09:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0ffc9017578f246ce8b4d7e2db3f29d809acca5f23dc5c91669a411567041b2c-merged.mount: Deactivated successfully.
Nov 28 08:09:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531-userdata-shm.mount: Deactivated successfully.
Nov 28 08:09:00 np0005538513.localdomain sudo[57370]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:00 np0005538513.localdomain sudo[57976]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdelhxgxgxrlvdljnxtbnhnyeiiaiged ; /usr/bin/python3
Nov 28 08:09:00 np0005538513.localdomain sudo[57976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:01 np0005538513.localdomain python3[57978]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:09:01 np0005538513.localdomain sudo[57976]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:01 np0005538513.localdomain sudo[58024]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nagdoskkgadqtoixqcsqbbgqiymukpek ; /usr/bin/python3
Nov 28 08:09:01 np0005538513.localdomain sudo[58024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:01 np0005538513.localdomain sudo[58024]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:01 np0005538513.localdomain sudo[58067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfzzduhxdemfrhmdbigmrhtcgvffgusr ; /usr/bin/python3
Nov 28 08:09:01 np0005538513.localdomain sudo[58067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:01 np0005538513.localdomain sudo[58067]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:02 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Nov 28 08:09:02 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Nov 28 08:09:02 np0005538513.localdomain sudo[58097]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zprriczzsfusywhvuqpxuitbnakpjdow ; /usr/bin/python3
Nov 28 08:09:02 np0005538513.localdomain sudo[58097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:02 np0005538513.localdomain python3[58099]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005538513 step=2 update_config_hash_only=False
Nov 28 08:09:02 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Nov 28 08:09:02 np0005538513.localdomain sudo[58097]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:02 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Nov 28 08:09:02 np0005538513.localdomain sudo[58113]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynimjcpjscbsdqakfaunrxnsitnkxtfj ; /usr/bin/python3
Nov 28 08:09:02 np0005538513.localdomain sudo[58113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:02 np0005538513.localdomain python3[58115]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:09:02 np0005538513.localdomain sudo[58113]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:03 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Nov 28 08:09:03 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Nov 28 08:09:03 np0005538513.localdomain sudo[58129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwjfdmzvanyveewcitglhqvkvkrmarhp ; /usr/bin/python3
Nov 28 08:09:03 np0005538513.localdomain sudo[58129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:03 np0005538513.localdomain python3[58131]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:09:03 np0005538513.localdomain sudo[58129]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:04 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Nov 28 08:09:04 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 72 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=56/57 n=1 ec=44/33 lis/c=56/56 les/c/f=57/57/0 sis=72 pruub=12.112273216s) [3,5,1] r=1 lpr=72 pi=[56,72)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1236.578369141s@ mbc={}] start_peering_interval up [0,4,5] -> [3,5,1], acting [0,4,5] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:09:04 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 72 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=56/57 n=1 ec=44/33 lis/c=56/56 les/c/f=57/57/0 sis=72 pruub=12.112176895s) [3,5,1] r=1 lpr=72 pi=[56,72)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1236.578369141s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:09:04 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Nov 28 08:09:05 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Nov 28 08:09:05 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Nov 28 08:09:05 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 74 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=58/58 les/c/f=59/59/0 sis=74 pruub=10.864244461s) [0,5,1] r=1 lpr=74 pi=[58,74)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1236.879638672s@ mbc={}] start_peering_interval up [1,5,3] -> [0,5,1], acting [1,5,3] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:09:05 np0005538513.localdomain ceph-osd[32506]: osd.5 pg_epoch: 74 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=58/58 les/c/f=59/59/0 sis=74 pruub=10.864125252s) [0,5,1] r=1 lpr=74 pi=[58,74)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1236.879638672s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:09:06 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Nov 28 08:09:06 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Nov 28 08:09:08 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Nov 28 08:09:08 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Nov 28 08:09:10 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Nov 28 08:09:10 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Nov 28 08:09:11 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.b scrub starts
Nov 28 08:09:11 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.b scrub ok
Nov 28 08:09:12 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Nov 28 08:09:12 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Nov 28 08:09:13 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Nov 28 08:09:13 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Nov 28 08:09:14 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Nov 28 08:09:14 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Nov 28 08:09:16 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.a scrub starts
Nov 28 08:09:16 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.a scrub ok
Nov 28 08:09:19 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.c scrub starts
Nov 28 08:09:19 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.c scrub ok
Nov 28 08:09:20 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Nov 28 08:09:20 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Nov 28 08:09:22 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Nov 28 08:09:22 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Nov 28 08:09:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:09:23 np0005538513.localdomain podman[58132]: 2025-11-28 08:09:23.851616382 +0000 UTC m=+0.084286301 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:09:24 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Nov 28 08:09:24 np0005538513.localdomain podman[58132]: 2025-11-28 08:09:24.060560209 +0000 UTC m=+0.293230108 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:09:24 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Nov 28 08:09:24 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:09:24 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Nov 28 08:09:24 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Nov 28 08:09:25 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Nov 28 08:09:26 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Nov 28 08:09:26 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Nov 28 08:09:27 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.f scrub starts
Nov 28 08:09:27 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.f scrub ok
Nov 28 08:09:28 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Nov 28 08:09:28 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Nov 28 08:09:29 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Nov 28 08:09:29 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Nov 28 08:09:30 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Nov 28 08:09:30 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Nov 28 08:09:30 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Nov 28 08:09:30 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Nov 28 08:09:31 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.a deep-scrub starts
Nov 28 08:09:31 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.a deep-scrub ok
Nov 28 08:09:32 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Nov 28 08:09:32 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Nov 28 08:09:33 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Nov 28 08:09:33 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Nov 28 08:09:35 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Nov 28 08:09:35 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Nov 28 08:09:36 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Nov 28 08:09:36 np0005538513.localdomain ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Nov 28 08:09:38 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Nov 28 08:09:38 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Nov 28 08:09:39 np0005538513.localdomain sudo[58162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:09:39 np0005538513.localdomain sudo[58162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:09:39 np0005538513.localdomain sudo[58162]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:39 np0005538513.localdomain sudo[58177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:09:39 np0005538513.localdomain sudo[58177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:09:40 np0005538513.localdomain sudo[58177]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:41 np0005538513.localdomain sudo[58223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:09:41 np0005538513.localdomain sudo[58223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:09:41 np0005538513.localdomain sudo[58223]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:41 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.b scrub starts
Nov 28 08:09:41 np0005538513.localdomain ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.b scrub ok
Nov 28 08:09:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:09:54 np0005538513.localdomain podman[58238]: 2025-11-28 08:09:54.8428472 +0000 UTC m=+0.080390804 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, 
container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:09:55 np0005538513.localdomain podman[58238]: 2025-11-28 08:09:55.075494622 +0000 UTC m=+0.313038276 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1)
Nov 28 08:09:55 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:10:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:10:25 np0005538513.localdomain podman[58267]: 2025-11-28 08:10:25.831164212 +0000 UTC m=+0.071056002 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git)
Nov 28 08:10:26 np0005538513.localdomain podman[58267]: 2025-11-28 08:10:26.025054267 +0000 UTC m=+0.264946017 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 08:10:26 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:10:41 np0005538513.localdomain sudo[58297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:10:41 np0005538513.localdomain sudo[58297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:41 np0005538513.localdomain sudo[58297]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:41 np0005538513.localdomain sudo[58312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:10:41 np0005538513.localdomain sudo[58312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:42 np0005538513.localdomain systemd[1]: tmp-crun.k5fw0k.mount: Deactivated successfully.
Nov 28 08:10:42 np0005538513.localdomain podman[58396]: 2025-11-28 08:10:42.130077347 +0000 UTC m=+0.097372170 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, architecture=x86_64, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=)
Nov 28 08:10:42 np0005538513.localdomain podman[58396]: 2025-11-28 08:10:42.242461058 +0000 UTC m=+0.209755921 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, release=553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public)
Nov 28 08:10:42 np0005538513.localdomain sudo[58312]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:42 np0005538513.localdomain sudo[58465]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:10:42 np0005538513.localdomain sudo[58465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:42 np0005538513.localdomain sudo[58465]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:42 np0005538513.localdomain sudo[58480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:10:42 np0005538513.localdomain sudo[58480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:43 np0005538513.localdomain sudo[58480]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:43 np0005538513.localdomain sudo[58527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:10:43 np0005538513.localdomain sudo[58527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:43 np0005538513.localdomain sudo[58527]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:10:56 np0005538513.localdomain podman[58542]: 2025-11-28 08:10:56.839001578 +0000 UTC m=+0.077449695 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:10:57 np0005538513.localdomain podman[58542]: 2025-11-28 08:10:57.029214463 +0000 UTC m=+0.267662580 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:10:57 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:11:03 np0005538513.localdomain sshd[58571]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:11:03 np0005538513.localdomain sshd[58571]: Invalid user ubuntu from 193.32.162.146 port 47850
Nov 28 08:11:04 np0005538513.localdomain sshd[58571]: Connection closed by invalid user ubuntu 193.32.162.146 port 47850 [preauth]
Nov 28 08:11:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:11:27 np0005538513.localdomain podman[58573]: 2025-11-28 08:11:27.845790017 +0000 UTC m=+0.079321142 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, url=https://www.redhat.com, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:11:28 np0005538513.localdomain podman[58573]: 2025-11-28 08:11:28.05943892 +0000 UTC m=+0.292970085 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1)
Nov 28 08:11:28 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:11:44 np0005538513.localdomain sudo[58602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:11:44 np0005538513.localdomain sudo[58602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:11:44 np0005538513.localdomain sudo[58602]: pam_unix(sudo:session): session closed for user root
Nov 28 08:11:44 np0005538513.localdomain sudo[58617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:11:44 np0005538513.localdomain sudo[58617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:11:44 np0005538513.localdomain sudo[58617]: pam_unix(sudo:session): session closed for user root
Nov 28 08:11:45 np0005538513.localdomain sudo[58664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:11:45 np0005538513.localdomain sudo[58664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:11:45 np0005538513.localdomain sudo[58664]: pam_unix(sudo:session): session closed for user root
Nov 28 08:11:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:11:58 np0005538513.localdomain podman[58680]: 2025-11-28 08:11:58.843821895 +0000 UTC m=+0.086025084 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 28 08:11:59 np0005538513.localdomain podman[58680]: 2025-11-28 08:11:59.033789312 +0000 UTC m=+0.275992551 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:11:59 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:12:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:12:29 np0005538513.localdomain podman[58707]: 2025-11-28 08:12:29.843325468 +0000 UTC m=+0.079154438 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, container_name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
build-date=2025-11-18T22:49:46Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 28 08:12:30 np0005538513.localdomain podman[58707]: 2025-11-28 08:12:30.016936732 +0000 UTC m=+0.252765712 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr)
Nov 28 08:12:30 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:12:45 np0005538513.localdomain sudo[58736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:12:45 np0005538513.localdomain sudo[58736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:12:45 np0005538513.localdomain sudo[58736]: pam_unix(sudo:session): session closed for user root
Nov 28 08:12:45 np0005538513.localdomain sudo[58751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:12:45 np0005538513.localdomain sudo[58751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:12:46 np0005538513.localdomain sudo[58751]: pam_unix(sudo:session): session closed for user root
Nov 28 08:12:46 np0005538513.localdomain sudo[58797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:12:46 np0005538513.localdomain sudo[58797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:12:46 np0005538513.localdomain sudo[58797]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:13:00 np0005538513.localdomain systemd[1]: tmp-crun.9wsZIe.mount: Deactivated successfully.
Nov 28 08:13:00 np0005538513.localdomain podman[58813]: 2025-11-28 08:13:00.848829917 +0000 UTC m=+0.079963962 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, 
batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z)
Nov 28 08:13:01 np0005538513.localdomain podman[58813]: 2025-11-28 08:13:01.062631506 +0000 UTC m=+0.293765531 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible)
Nov 28 08:13:01 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:13:17 np0005538513.localdomain sshd[58845]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:13:18 np0005538513.localdomain sshd[58845]: Invalid user validator from 193.32.162.146 port 34434
Nov 28 08:13:18 np0005538513.localdomain sshd[58845]: Connection closed by invalid user validator 193.32.162.146 port 34434 [preauth]
Nov 28 08:13:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:13:31 np0005538513.localdomain systemd[1]: tmp-crun.aRlRUu.mount: Deactivated successfully.
Nov 28 08:13:31 np0005538513.localdomain podman[58847]: 2025-11-28 08:13:31.856500614 +0000 UTC m=+0.091045058 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-11-18T22:49:46Z, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 08:13:32 np0005538513.localdomain podman[58847]: 2025-11-28 08:13:32.075971074 +0000 UTC m=+0.310515508 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git)
Nov 28 08:13:32 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:13:33 np0005538513.localdomain sudo[58922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phtjjfyvercgvdjlwwxhgvcqmshmbkwo ; /usr/bin/python3
Nov 28 08:13:33 np0005538513.localdomain sudo[58922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:33 np0005538513.localdomain python3[58924]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:33 np0005538513.localdomain sudo[58922]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:34 np0005538513.localdomain sudo[58967]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnoudperztmjvkvlqwkvtrqhoaigukaq ; /usr/bin/python3
Nov 28 08:13:34 np0005538513.localdomain sudo[58967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:34 np0005538513.localdomain python3[58969]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317613.451003-99573-86393723704210/source _original_basename=tmpok_1_q9e follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:34 np0005538513.localdomain sudo[58967]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:35 np0005538513.localdomain sudo[58997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uveivlbocrzvmpvyzthegasarvmrilxt ; /usr/bin/python3
Nov 28 08:13:35 np0005538513.localdomain sudo[58997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:35 np0005538513.localdomain python3[58999]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:13:35 np0005538513.localdomain sudo[58997]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:35 np0005538513.localdomain sudo[59047]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmtzqcuoqjecdjldbfudvgpeggpruasr ; /usr/bin/python3
Nov 28 08:13:35 np0005538513.localdomain sudo[59047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:36 np0005538513.localdomain sudo[59047]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:36 np0005538513.localdomain sudo[59065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnlsgxktvgzhfmevdzpyvsumcnbqqlap ; /usr/bin/python3
Nov 28 08:13:36 np0005538513.localdomain sudo[59065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:36 np0005538513.localdomain sudo[59065]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:36 np0005538513.localdomain sudo[59169]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwmsmasaowtlhdvspdtrykphpwiknivf ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317616.509703-99749-159616059690828/async_wrapper.py 906942319155 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317616.509703-99749-159616059690828/AnsiballZ_command.py _
Nov 28 08:13:36 np0005538513.localdomain sudo[59169]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:13:37 np0005538513.localdomain ansible-async_wrapper.py[59171]: Invoked with 906942319155 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317616.509703-99749-159616059690828/AnsiballZ_command.py _
Nov 28 08:13:37 np0005538513.localdomain ansible-async_wrapper.py[59174]: Starting module and watcher
Nov 28 08:13:37 np0005538513.localdomain ansible-async_wrapper.py[59174]: Start watching 59175 (3600)
Nov 28 08:13:37 np0005538513.localdomain ansible-async_wrapper.py[59175]: Start module (59175)
Nov 28 08:13:37 np0005538513.localdomain ansible-async_wrapper.py[59171]: Return async_wrapper task started.
Nov 28 08:13:37 np0005538513.localdomain sudo[59169]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:37 np0005538513.localdomain sudo[59190]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgjhtaqgbxuzmkzetvucjfzyovetdjbm ; /usr/bin/python3
Nov 28 08:13:37 np0005538513.localdomain sudo[59190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:37 np0005538513.localdomain python3[59195]: ansible-ansible.legacy.async_status Invoked with jid=906942319155.59171 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:13:37 np0005538513.localdomain sudo[59190]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    (file & line not available)
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    (file & line not available)
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.11 seconds
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Notice: Applied catalog in 0.04 seconds
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Application:
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    Initial environment: production
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    Converged environment: production
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:          Run mode: user
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Changes:
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Events:
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Resources:
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:             Total: 10
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Time:
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:          Schedule: 0.00
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:              File: 0.00
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:              Exec: 0.01
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:            Augeas: 0.01
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    Transaction evaluation: 0.03
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    Catalog application: 0.04
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:    Config retrieval: 0.15
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:          Last run: 1764317620
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:        Filebucket: 0.00
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:             Total: 0.04
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]: Version:
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:            Config: 1764317620
Nov 28 08:13:40 np0005538513.localdomain puppet-user[59194]:            Puppet: 7.10.0
Nov 28 08:13:40 np0005538513.localdomain ansible-async_wrapper.py[59175]: Module complete (59175)
Nov 28 08:13:42 np0005538513.localdomain ansible-async_wrapper.py[59174]: Done in kid B.
Nov 28 08:13:47 np0005538513.localdomain sudo[59307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:13:47 np0005538513.localdomain sudo[59307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:13:47 np0005538513.localdomain sudo[59307]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:47 np0005538513.localdomain sudo[59322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:13:47 np0005538513.localdomain sudo[59322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:13:47 np0005538513.localdomain sudo[59367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jewkyyeuhktfmqounspswcxfpszjtlnt ; /usr/bin/python3
Nov 28 08:13:47 np0005538513.localdomain sudo[59367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:47 np0005538513.localdomain sudo[59322]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:47 np0005538513.localdomain python3[59371]: ansible-ansible.legacy.async_status Invoked with jid=906942319155.59171 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:13:47 np0005538513.localdomain sudo[59367]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:48 np0005538513.localdomain sudo[59384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:13:48 np0005538513.localdomain sudo[59384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:13:48 np0005538513.localdomain sudo[59384]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:48 np0005538513.localdomain sudo[59412]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaatkewrolkrbrsnkhbhxmptqooudhgv ; /usr/bin/python3
Nov 28 08:13:48 np0005538513.localdomain sudo[59412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:48 np0005538513.localdomain python3[59414]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:13:48 np0005538513.localdomain sudo[59412]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:48 np0005538513.localdomain sudo[59428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuukzbxyuehmvnifjgdrcvmxtkkmmutl ; /usr/bin/python3
Nov 28 08:13:48 np0005538513.localdomain sudo[59428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:48 np0005538513.localdomain python3[59430]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:13:48 np0005538513.localdomain sudo[59428]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:49 np0005538513.localdomain sudo[59478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wokiogitqvoqumafbeayyjzlxggikzpr ; /usr/bin/python3
Nov 28 08:13:49 np0005538513.localdomain sudo[59478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:49 np0005538513.localdomain python3[59480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:49 np0005538513.localdomain sudo[59478]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:49 np0005538513.localdomain sudo[59496]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohomhzgdwdilfwcejtsdijvaqsdhrppq ; /usr/bin/python3
Nov 28 08:13:49 np0005538513.localdomain sudo[59496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:49 np0005538513.localdomain python3[59498]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp8xnmvue_ recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:13:49 np0005538513.localdomain sudo[59496]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:49 np0005538513.localdomain sudo[59526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzfbthqytvbnqcqfywlxukjztxkningd ; /usr/bin/python3
Nov 28 08:13:49 np0005538513.localdomain sudo[59526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:49 np0005538513.localdomain python3[59528]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:49 np0005538513.localdomain sudo[59526]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:50 np0005538513.localdomain sudo[59542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aporivrbsoibvmjwgovgtkeqwdqtujbe ; /usr/bin/python3
Nov 28 08:13:50 np0005538513.localdomain sudo[59542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:50 np0005538513.localdomain sudo[59542]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:50 np0005538513.localdomain sudo[59629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcnhanqcdymzjktmkcmztxxpepmcdqpa ; /usr/bin/python3
Nov 28 08:13:50 np0005538513.localdomain sudo[59629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:51 np0005538513.localdomain python3[59631]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:13:51 np0005538513.localdomain sudo[59629]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:51 np0005538513.localdomain sudo[59648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sonbtwquaccosrorejioxwzqcagakjsk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:51 np0005538513.localdomain sudo[59648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:51 np0005538513.localdomain python3[59650]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:51 np0005538513.localdomain sudo[59648]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:52 np0005538513.localdomain sudo[59664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hajtuvqauftgwbiqpznxehestkwzhsow ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:52 np0005538513.localdomain sudo[59664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:52 np0005538513.localdomain sudo[59664]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:52 np0005538513.localdomain sudo[59680]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uskcnmebbdkawfeayflfbpjnoquykxtp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:52 np0005538513.localdomain sudo[59680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:52 np0005538513.localdomain python3[59682]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:13:52 np0005538513.localdomain sudo[59680]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:53 np0005538513.localdomain sudo[59730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suscgotinuzwqreiuzamvxsqdhnjjksn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:53 np0005538513.localdomain sudo[59730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:53 np0005538513.localdomain python3[59732]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:53 np0005538513.localdomain sudo[59730]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:54 np0005538513.localdomain sudo[59748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxqrlghjdaphrumjznucsahjfoikmbcx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:54 np0005538513.localdomain sudo[59748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:54 np0005538513.localdomain python3[59750]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:54 np0005538513.localdomain sudo[59748]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:54 np0005538513.localdomain sudo[59810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-seylrykmfawpndrkrygasclxirxiacnc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:54 np0005538513.localdomain sudo[59810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:54 np0005538513.localdomain python3[59812]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:54 np0005538513.localdomain sudo[59810]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:54 np0005538513.localdomain sudo[59828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwqcetpmsivrtmmzgqmxpzuvvaxitrml ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:54 np0005538513.localdomain sudo[59828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:55 np0005538513.localdomain python3[59830]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:55 np0005538513.localdomain sudo[59828]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:55 np0005538513.localdomain sudo[59890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfldljqysilhlozrujunwktavqpnegpv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:55 np0005538513.localdomain sudo[59890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:55 np0005538513.localdomain python3[59892]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:55 np0005538513.localdomain sudo[59890]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:55 np0005538513.localdomain sudo[59908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fteheywvikefylaikbwtnspqoxuwsanw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:55 np0005538513.localdomain sudo[59908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:55 np0005538513.localdomain python3[59910]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:55 np0005538513.localdomain sudo[59908]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:56 np0005538513.localdomain sudo[59970]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrkavylasecobiazepamwgilfxroycwy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:56 np0005538513.localdomain sudo[59970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:56 np0005538513.localdomain python3[59972]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:56 np0005538513.localdomain sudo[59970]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:56 np0005538513.localdomain sudo[59988]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmmnqgqnwigoofttnvxllqmwbxgmqvqy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:56 np0005538513.localdomain sudo[59988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:56 np0005538513.localdomain python3[59990]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:56 np0005538513.localdomain sudo[59988]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:56 np0005538513.localdomain sudo[60018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srsdidnqtsfcuvupglhzxzocaigsufae ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:56 np0005538513.localdomain sudo[60018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:57 np0005538513.localdomain python3[60020]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:13:57 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:13:57 np0005538513.localdomain systemd-rc-local-generator[60038]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:13:57 np0005538513.localdomain systemd-sysv-generator[60046]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:13:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:13:57 np0005538513.localdomain sudo[60018]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:57 np0005538513.localdomain sudo[60104]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkmamnwfzmhxovghlitexuidwavjyale ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:57 np0005538513.localdomain sudo[60104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:57 np0005538513.localdomain python3[60106]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:57 np0005538513.localdomain sudo[60104]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:58 np0005538513.localdomain sudo[60122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neeymyvmkczssmmuwbeosbxxppbhnhfw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:58 np0005538513.localdomain sudo[60122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:58 np0005538513.localdomain python3[60124]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:58 np0005538513.localdomain sudo[60122]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:58 np0005538513.localdomain sudo[60184]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvkpojasuzguxykoxflctjapmrscqayw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:58 np0005538513.localdomain sudo[60184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:58 np0005538513.localdomain python3[60186]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:58 np0005538513.localdomain sudo[60184]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:58 np0005538513.localdomain sudo[60202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxpjuqumkcvthbnkqqceinqztsiivoqf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:58 np0005538513.localdomain sudo[60202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:59 np0005538513.localdomain python3[60204]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:59 np0005538513.localdomain sudo[60202]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:59 np0005538513.localdomain sudo[60232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atqvhwghntxrzvmujpirewufaadiqlsa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:59 np0005538513.localdomain sudo[60232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:59 np0005538513.localdomain python3[60234]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:13:59 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:13:59 np0005538513.localdomain systemd-rc-local-generator[60258]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:13:59 np0005538513.localdomain systemd-sysv-generator[60262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:13:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:13:59 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:13:59 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:13:59 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:13:59 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:13:59 np0005538513.localdomain sudo[60232]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:00 np0005538513.localdomain sudo[60289]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdkvstfdrrvkxduagshrvjqjqjihhkbx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:00 np0005538513.localdomain sudo[60289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:00 np0005538513.localdomain python3[60291]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:14:00 np0005538513.localdomain sudo[60289]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:00 np0005538513.localdomain sudo[60305]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tebftkxkbdhgpzhjquyhosoqekskinmv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:00 np0005538513.localdomain sudo[60305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:01 np0005538513.localdomain sudo[60305]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:02 np0005538513.localdomain sudo[60348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwmabyidktguctsbybkzpawktjlugdhd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:02 np0005538513.localdomain sudo[60348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:02 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:14:02 np0005538513.localdomain podman[60379]: 2025-11-28 08:14:02.43925302 +0000 UTC m=+0.131910227 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:14:02 np0005538513.localdomain podman[60526]: 2025-11-28 08:14:02.552186627 +0000 UTC m=+0.067797203 container create d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, config_id=tripleo_step3, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_statedir_owner, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:14:02 np0005538513.localdomain podman[60548]: 2025-11-28 08:14:02.583079977 +0000 UTC m=+0.076139330 container create 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libpod-conmon-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1.scope.
Nov 28 08:14:02 np0005538513.localdomain podman[60552]: 2025-11-28 08:14:02.597506989 +0000 UTC m=+0.086307165 container create 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libpod-conmon-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope.
Nov 28 08:14:02 np0005538513.localdomain podman[60563]: 2025-11-28 08:14:02.618301555 +0000 UTC m=+0.098036391 container create 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_init_log, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3)
Nov 28 08:14:02 np0005538513.localdomain podman[60526]: 2025-11-28 08:14:02.523364414 +0000 UTC m=+0.038975030 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:14:02 np0005538513.localdomain podman[60526]: 2025-11-28 08:14:02.625538596 +0000 UTC m=+0.141149182 container init d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_statedir_owner, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266/merged/scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain podman[60564]: 2025-11-28 08:14:02.63159123 +0000 UTC m=+0.105755588 container create 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044)
Nov 28 08:14:02 np0005538513.localdomain podman[60548]: 2025-11-28 08:14:02.544784789 +0000 UTC m=+0.037844142 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:14:02 np0005538513.localdomain podman[60552]: 2025-11-28 08:14:02.543924602 +0000 UTC m=+0.032724798 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:02 np0005538513.localdomain podman[60563]: 2025-11-28 08:14:02.551242157 +0000 UTC m=+0.030976983 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:14:02 np0005538513.localdomain podman[60564]: 2025-11-28 08:14:02.574689598 +0000 UTC m=+0.048853976 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libpod-conmon-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be.scope.
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libpod-conmon-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope.
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libpod-conmon-33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e.scope.
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: libpod-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1.scope: Deactivated successfully.
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78b48dbff0a2fe76e018f9048f1970d44652db7588437c79ac71691eb45c0ad0/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:02 np0005538513.localdomain podman[60379]: 2025-11-28 08:14:02.704403493 +0000 UTC m=+0.397060700 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain podman[60548]: 2025-11-28 08:14:02.70682296 +0000 UTC m=+0.199882313 container init 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc.)
Nov 28 08:14:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:02 np0005538513.localdomain podman[60564]: 2025-11-28 08:14:02.714174716 +0000 UTC m=+0.188339084 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 rsyslog, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:14:02 np0005538513.localdomain podman[60564]: 2025-11-28 08:14:02.72117744 +0000 UTC m=+0.195341808 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, name=rhosp17/openstack-rsyslog, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:02 np0005538513.localdomain sudo[60639]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:02 np0005538513.localdomain sudo[60645]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:02 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=138ccb6252fd89d73a6c37a3f993f3eb --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:14:02 np0005538513.localdomain sudo[60645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:14:02 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:02 np0005538513.localdomain podman[60548]: 2025-11-28 08:14:02.739532448 +0000 UTC m=+0.232591781 container start 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 08:14:02 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=da9a0dc7b40588672419e3ce10063e21 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 08:14:02 np0005538513.localdomain podman[60563]: 2025-11-28 08:14:02.763954911 +0000 UTC m=+0.243689777 container init 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 08:14:02 np0005538513.localdomain podman[60563]: 2025-11-28 08:14:02.771952227 +0000 UTC m=+0.251687063 container start 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, container_name=ceilometer_init_log, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 08:14:02 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: libpod-33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e.scope: Deactivated successfully.
Nov 28 08:14:02 np0005538513.localdomain sudo[60645]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:02 np0005538513.localdomain podman[60526]: 2025-11-28 08:14:02.787887697 +0000 UTC m=+0.303498283 container start d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:14:02 np0005538513.localdomain podman[60526]: 2025-11-28 08:14:02.792298319 +0000 UTC m=+0.307908895 container attach d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO 
Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:02 np0005538513.localdomain podman[60552]: 2025-11-28 08:14:02.812971491 +0000 UTC m=+0.301771667 container init 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:35:22Z, container_name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12)
Nov 28 08:14:02 np0005538513.localdomain podman[60689]: 2025-11-28 08:14:02.832450805 +0000 UTC m=+0.038538766 container died 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, container_name=ceilometer_init_log, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044)
Nov 28 08:14:02 np0005538513.localdomain sudo[60731]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:02 np0005538513.localdomain podman[60526]: 2025-11-28 08:14:02.845211994 +0000 UTC m=+0.360822580 container died d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, distribution-scope=public, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 08:14:02 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:02 np0005538513.localdomain podman[60707]: 2025-11-28 08:14:02.860281706 +0000 UTC m=+0.048436992 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:02 np0005538513.localdomain podman[60552]: 2025-11-28 08:14:02.871543567 +0000 UTC m=+0.360343743 container start 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, container_name=nova_virtlogd_wrapper, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:02 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:02 np0005538513.localdomain podman[60707]: 2025-11-28 08:14:02.89003885 +0000 UTC m=+0.078194106 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: libpod-conmon-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Queued start job for default target Main User Target.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Created slice User Application Slice.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Reached target Paths.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Reached target Timers.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Starting D-Bus User Message Bus Socket...
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Starting Create User's Volatile Files and Directories...
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Finished Create User's Volatile Files and Directories.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Listening on D-Bus User Message Bus Socket.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Reached target Sockets.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Reached target Basic System.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Reached target Main User Target.
Nov 28 08:14:02 np0005538513.localdomain systemd[60679]: Startup finished in 142ms.
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started Session c1 of User root.
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: Started Session c2 of User root.
Nov 28 08:14:02 np0005538513.localdomain podman[60630]: 2025-11-28 08:14:02.967991017 +0000 UTC m=+0.261692304 container cleanup d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=nova_statedir_owner, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z)
Nov 28 08:14:02 np0005538513.localdomain sudo[60639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:02 np0005538513.localdomain systemd[1]: libpod-conmon-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1.scope: Deactivated successfully.
Nov 28 08:14:02 np0005538513.localdomain sudo[60731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:02 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Nov 28 08:14:03 np0005538513.localdomain podman[60689]: 2025-11-28 08:14:03.013385221 +0000 UTC m=+0.219473182 container cleanup 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, container_name=ceilometer_init_log, build-date=2025-11-19T00:12:45Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step3, distribution-scope=public)
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: libpod-conmon-33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e.scope: Deactivated successfully.
Nov 28 08:14:03 np0005538513.localdomain podman[60649]: 2025-11-28 08:14:02.981051745 +0000 UTC m=+0.237521630 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:03 np0005538513.localdomain sudo[60731]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Nov 28 08:14:03 np0005538513.localdomain sudo[60639]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Nov 28 08:14:03 np0005538513.localdomain podman[60649]: 2025-11-28 08:14:03.067422342 +0000 UTC m=+0.323892237 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, 
build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64)
Nov 28 08:14:03 np0005538513.localdomain podman[60649]: unhealthy
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed with result 'exit-code'.
Nov 28 08:14:03 np0005538513.localdomain podman[60903]: 2025-11-28 08:14:03.298948788 +0000 UTC m=+0.079831378 container create 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, tcib_managed=true)
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: Started libpod-conmon-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a.scope.
Nov 28 08:14:03 np0005538513.localdomain podman[60903]: 2025-11-28 08:14:03.255815326 +0000 UTC m=+0.036697926 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain podman[60903]: 2025-11-28 08:14:03.369771467 +0000 UTC m=+0.150654017 container init 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:14:03 np0005538513.localdomain podman[60903]: 2025-11-28 08:14:03.379424365 +0000 UTC m=+0.160306905 container start 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public)
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2-merged.mount: Deactivated successfully.
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1-userdata-shm.mount: Deactivated successfully.
Nov 28 08:14:03 np0005538513.localdomain podman[60955]: 2025-11-28 08:14:03.452210667 +0000 UTC m=+0.070528700 container create 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-nova-libvirt, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container)
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: Started libpod-conmon-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76.scope.
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:03 np0005538513.localdomain podman[60955]: 2025-11-28 08:14:03.419251032 +0000 UTC m=+0.037569095 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:03 np0005538513.localdomain podman[60955]: 2025-11-28 08:14:03.520239687 +0000 UTC m=+0.138557720 container init 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:14:03 np0005538513.localdomain podman[60955]: 2025-11-28 08:14:03.52969682 +0000 UTC m=+0.148014863 container start 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 28 08:14:03 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:03 np0005538513.localdomain sudo[60977]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:03 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: Started Session c3 of User root.
Nov 28 08:14:03 np0005538513.localdomain sudo[60977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:03 np0005538513.localdomain sudo[60977]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:03 np0005538513.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Nov 28 08:14:03 np0005538513.localdomain podman[61099]: 2025-11-28 08:14:03.974451927 +0000 UTC m=+0.077147693 container create 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:14:04 np0005538513.localdomain podman[61112]: 2025-11-28 08:14:04.011961968 +0000 UTC m=+0.084827658 container create 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-type=git, architecture=x86_64, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible)
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started libpod-conmon-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope.
Nov 28 08:14:04 np0005538513.localdomain podman[61099]: 2025-11-28 08:14:03.928740283 +0000 UTC m=+0.031436079 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started libpod-conmon-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope.
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain podman[61112]: 2025-11-28 08:14:03.967827264 +0000 UTC m=+0.040693024 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:04 np0005538513.localdomain podman[61112]: 2025-11-28 08:14:04.073108797 +0000 UTC m=+0.145974487 container init 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-type=git, config_id=tripleo_step3, version=17.1.12, container_name=nova_virtnodedevd, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044)
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:14:04 np0005538513.localdomain podman[61099]: 2025-11-28 08:14:04.078266322 +0000 UTC m=+0.180962088 container init 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:14:04 np0005538513.localdomain podman[61112]: 2025-11-28 08:14:04.082440565 +0000 UTC m=+0.155306275 container start 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, container_name=nova_virtnodedevd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 08:14:04 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:04 np0005538513.localdomain sudo[61142]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:14:04 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:04 np0005538513.localdomain podman[61099]: 2025-11-28 08:14:04.11598144 +0000 UTC m=+0.218677206 container start 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1)
Nov 28 08:14:04 np0005538513.localdomain sudo[61139]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:04 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c9c242145d21d40ef98889981c05ca84 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:14:04 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started Session c4 of User root.
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started Session c5 of User root.
Nov 28 08:14:04 np0005538513.localdomain sudo[61142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:04 np0005538513.localdomain sudo[61139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:04 np0005538513.localdomain sudo[61142]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:04 np0005538513.localdomain sudo[61139]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:04 np0005538513.localdomain podman[61149]: 2025-11-28 08:14:04.234111585 +0000 UTC m=+0.095837922 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack 
Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, distribution-scope=public)
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538513.localdomain kernel: Loading iSCSI transport class v2.0-870.
Nov 28 08:14:04 np0005538513.localdomain podman[61149]: 2025-11-28 08:14:04.286471291 +0000 UTC m=+0.148197658 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.buildah.version=1.41.4, vcs-type=git, release=1761123044, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:14:04 np0005538513.localdomain podman[61274]: 2025-11-28 08:14:04.736149756 +0000 UTC m=+0.082520285 container create 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, container_name=nova_virtstoraged, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true)
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started libpod-conmon-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951.scope.
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538513.localdomain podman[61274]: 2025-11-28 08:14:04.696865448 +0000 UTC m=+0.043235957 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:04 np0005538513.localdomain podman[61274]: 2025-11-28 08:14:04.800336272 +0000 UTC m=+0.146706771 container init 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, container_name=nova_virtstoraged, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:04 np0005538513.localdomain podman[61274]: 2025-11-28 08:14:04.814030141 +0000 UTC m=+0.160400640 container start 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Nov 28 08:14:04 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:04 np0005538513.localdomain sudo[61293]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:04 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: Started Session c6 of User root.
Nov 28 08:14:04 np0005538513.localdomain sudo[61293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:04 np0005538513.localdomain sudo[61293]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:04 np0005538513.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Nov 28 08:14:05 np0005538513.localdomain podman[61378]: 2025-11-28 08:14:05.288981405 +0000 UTC m=+0.085195340 container create 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, container_name=nova_virtqemud, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12)
Nov 28 08:14:05 np0005538513.localdomain systemd[1]: Started libpod-conmon-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope.
Nov 28 08:14:05 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:05 np0005538513.localdomain podman[61378]: 2025-11-28 08:14:05.247877088 +0000 UTC m=+0.044091063 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain podman[61378]: 2025-11-28 08:14:05.360531347 +0000 UTC m=+0.156745322 container init 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:05 np0005538513.localdomain podman[61378]: 2025-11-28 08:14:05.370534657 +0000 UTC m=+0.166748612 container start 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com)
Nov 28 08:14:05 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:05 np0005538513.localdomain sudo[61398]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:05 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:05 np0005538513.localdomain systemd[1]: Started Session c7 of User root.
Nov 28 08:14:05 np0005538513.localdomain sudo[61398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:05 np0005538513.localdomain sudo[61398]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:05 np0005538513.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Nov 28 08:14:05 np0005538513.localdomain podman[61486]: 2025-11-28 08:14:05.771739078 +0000 UTC m=+0.069586819 container create 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:14:05 np0005538513.localdomain systemd[1]: Started libpod-conmon-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50.scope.
Nov 28 08:14:05 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:05 np0005538513.localdomain podman[61486]: 2025-11-28 08:14:05.73431287 +0000 UTC m=+0.032160661 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538513.localdomain podman[61486]: 2025-11-28 08:14:05.844333414 +0000 UTC m=+0.142181185 container init 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., container_name=nova_virtproxyd, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:14:05 np0005538513.localdomain podman[61486]: 2025-11-28 08:14:05.856191385 +0000 UTC m=+0.154039156 container start 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:14:05 np0005538513.localdomain python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:05 np0005538513.localdomain sudo[61506]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:05 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:05 np0005538513.localdomain systemd[1]: Started Session c8 of User root.
Nov 28 08:14:05 np0005538513.localdomain sudo[61506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:06 np0005538513.localdomain sudo[61506]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:06 np0005538513.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Nov 28 08:14:06 np0005538513.localdomain sudo[60348]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:06 np0005538513.localdomain sudo[61565]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwspjivtixfrbnvugrzskbonmehtugvc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:06 np0005538513.localdomain sudo[61565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:06 np0005538513.localdomain python3[61567]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:06 np0005538513.localdomain sudo[61565]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:06 np0005538513.localdomain sudo[61581]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcjsuloisphykvpwbhzlgmxfbwsfqmph ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:06 np0005538513.localdomain sudo[61581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:06 np0005538513.localdomain python3[61583]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:06 np0005538513.localdomain sudo[61581]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:06 np0005538513.localdomain sudo[61597]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvbktjbluxhogjcmbpmurcrgvxeehadz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:06 np0005538513.localdomain sudo[61597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:06 np0005538513.localdomain python3[61599]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:06 np0005538513.localdomain sudo[61597]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:07 np0005538513.localdomain sudo[61613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crtjlldxhdtkrusjqloejqpkrrlixhzb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:07 np0005538513.localdomain sudo[61613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:07 np0005538513.localdomain python3[61615]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:07 np0005538513.localdomain sudo[61613]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:07 np0005538513.localdomain sudo[61629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gddlsvkochlbngswhgpfjadyxwbeafxd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:07 np0005538513.localdomain sudo[61629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:07 np0005538513.localdomain python3[61631]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:07 np0005538513.localdomain sudo[61629]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:07 np0005538513.localdomain sudo[61645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oeodakdimzfgbfktfewrfnxznjrzubhv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:07 np0005538513.localdomain sudo[61645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:07 np0005538513.localdomain python3[61647]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:07 np0005538513.localdomain sudo[61645]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:07 np0005538513.localdomain sudo[61661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tydwyusovreglgipipvoedkfdxlxkqnj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:07 np0005538513.localdomain sudo[61661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:08 np0005538513.localdomain python3[61663]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:08 np0005538513.localdomain sudo[61661]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:08 np0005538513.localdomain sudo[61677]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjzcnnlluxcwkynwofqnyrrvsqrteons ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:08 np0005538513.localdomain sudo[61677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:08 np0005538513.localdomain python3[61679]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:08 np0005538513.localdomain sudo[61677]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:08 np0005538513.localdomain sudo[61693]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbaotkgijyyqlwhceucfghigonwacqng ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:08 np0005538513.localdomain sudo[61693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:08 np0005538513.localdomain python3[61695]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:08 np0005538513.localdomain sudo[61693]: pam_unix(sudo:session): session closed for user root
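The `ansible-file ... state=absent` invocations above are removing leftover `tripleo_*.requires` drop-in directories from `/etc/systemd/system`. As a rough sketch of what `state=absent` does on the target host (a hypothetical helper, not the actual Ansible module source):

```python
import os
import shutil

def remove_path(path):
    """Sketch of ansible-file with state=absent: delete the path if it
    exists (file, symlink, or directory tree) and report whether anything
    changed. Calling it again on the same path is a no-op (idempotent)."""
    if os.path.islink(path) or os.path.isfile(path):
        os.unlink(path)
        return True  # changed
    if os.path.isdir(path):
        shutil.rmtree(path)
        return True  # changed
    return False  # already absent, nothing to do
```

The idempotency is why Ansible can safely re-run this task across all nodes: a second pass simply reports `changed: false`.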
Nov 28 08:14:08 np0005538513.localdomain sudo[61709]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eadzjyhtanuarbxywcnpaixceamwuthn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:08 np0005538513.localdomain sudo[61709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:08 np0005538513.localdomain python3[61711]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:08 np0005538513.localdomain sudo[61709]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:08 np0005538513.localdomain sudo[61725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfopnoslujhdxdqaiymulskwgstizovu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:08 np0005538513.localdomain sudo[61725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538513.localdomain python3[61727]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:09 np0005538513.localdomain sudo[61725]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538513.localdomain sudo[61741]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzzdeonoqtvzzkttrxfofwgabzsklkfu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538513.localdomain sudo[61741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538513.localdomain python3[61743]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:09 np0005538513.localdomain sudo[61741]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538513.localdomain sudo[61757]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldcrljetufitnjakhbqlqeoruopeccfi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538513.localdomain sudo[61757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538513.localdomain python3[61759]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:09 np0005538513.localdomain sudo[61757]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538513.localdomain sudo[61773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmkambtnhyajguucmiigbdkggyxgpenk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538513.localdomain sudo[61773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538513.localdomain python3[61775]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:09 np0005538513.localdomain sudo[61773]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538513.localdomain sudo[61789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmwtlospfccyxrezrslqxznrvybudvoo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538513.localdomain sudo[61789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538513.localdomain python3[61791]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:09 np0005538513.localdomain sudo[61789]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:10 np0005538513.localdomain sudo[61805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvhmgtdvkludhfogufimzrmthobhurtb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:10 np0005538513.localdomain sudo[61805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:10 np0005538513.localdomain python3[61807]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:10 np0005538513.localdomain sudo[61805]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:10 np0005538513.localdomain sudo[61821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amjtkfxmnfhsblespcqxjwolnhkrszpb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:10 np0005538513.localdomain sudo[61821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:10 np0005538513.localdomain python3[61823]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:10 np0005538513.localdomain sudo[61821]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:10 np0005538513.localdomain sudo[61837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqtxchjdncbvuapkgjnwgdecgqhxpcwj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:10 np0005538513.localdomain sudo[61837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:10 np0005538513.localdomain python3[61839]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:10 np0005538513.localdomain sudo[61837]: pam_unix(sudo:session): session closed for user root
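The `ansible-stat` calls above probe each `tripleo_*_healthcheck.timer` unit with `get_checksum=True checksum_algorithm=sha1`, so the playbook can decide whether the timer files need to be (re)deployed. A minimal sketch of that check (hypothetical helper, not the real module):

```python
import hashlib
import os

def stat_unit(path):
    """Sketch of ansible-stat with checksum_algorithm=sha1: report whether
    the path exists and, for regular files, its SHA-1 checksum."""
    if not os.path.exists(path):
        return {"exists": False}
    info = {"exists": True, "isdir": os.path.isdir(path)}
    if os.path.isfile(path):
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        info["checksum"] = h.hexdigest()
    return info
```

Comparing this checksum against the rendered template is what lets the deploy step skip unchanged units.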
Nov 28 08:14:11 np0005538513.localdomain sudo[61898]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tafwjvkqvndbiawurrzsxuqymjzrxewt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:11 np0005538513.localdomain sudo[61898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:11 np0005538513.localdomain python3[61900]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:11 np0005538513.localdomain sudo[61898]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:11 np0005538513.localdomain sudo[61928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldbvypepjtoowxeohnzzegvodxycstbp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:11 np0005538513.localdomain sudo[61928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:11 np0005538513.localdomain python3[61930]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:11 np0005538513.localdomain sudo[61928]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:12 np0005538513.localdomain sudo[61957]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mofbkkjenxpgwmssmzkaluplfhsdgsvy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:12 np0005538513.localdomain sudo[61957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:12 np0005538513.localdomain python3[61959]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:12 np0005538513.localdomain sudo[61957]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:12 np0005538513.localdomain sudo[61986]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjhfowpjnbuhokrruxfvbuttnmlfzuku ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:12 np0005538513.localdomain sudo[61986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:12 np0005538513.localdomain python3[61988]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:12 np0005538513.localdomain sudo[61986]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:13 np0005538513.localdomain sudo[62015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwsldrfxslvjkijucgocmxyhjwvqlnrm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:13 np0005538513.localdomain sudo[62015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:13 np0005538513.localdomain python3[62017]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:13 np0005538513.localdomain sudo[62015]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:13 np0005538513.localdomain sudo[62044]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwvjajzckqfdpgkfarzkqkahnzvywptc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:13 np0005538513.localdomain sudo[62044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:13 np0005538513.localdomain python3[62046]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:13 np0005538513.localdomain sudo[62044]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:14 np0005538513.localdomain sudo[62073]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjboaymyqstoyuxqgeqgoyoujiouypqh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:14 np0005538513.localdomain sudo[62073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:14 np0005538513.localdomain python3[62075]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:14 np0005538513.localdomain sudo[62073]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:14 np0005538513.localdomain sudo[62102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkbrbyxrekhpocejvualwyolqcfcwiao ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:14 np0005538513.localdomain sudo[62102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:14 np0005538513.localdomain python3[62104]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:14 np0005538513.localdomain sudo[62102]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:15 np0005538513.localdomain sudo[62131]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agnnakpgyhayyuknsqoonmoykwqalobb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:15 np0005538513.localdomain sudo[62131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:15 np0005538513.localdomain python3[62133]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:15 np0005538513.localdomain sudo[62131]: pam_unix(sudo:session): session closed for user root
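The `ansible-copy` tasks above install each rendered `tripleo_*.service` unit into `/etc/systemd/system` with `mode=0644 owner=root group=root`. Ansible writes to a temporary file and renames it into place so a crash mid-copy never leaves a truncated unit file. A sketch of that pattern (assumed behavior, simplified from the real module, ownership handling omitted):

```python
import os
import shutil
import tempfile

def deploy_unit(src, dest, mode=0o644):
    """Sketch of ansible-copy: stage the content in a temp file in the
    destination directory, set the mode, then atomically rename over dest."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(dest) or ".")
    try:
        with os.fdopen(fd, "wb") as out, open(src, "rb") as inp:
            shutil.copyfileobj(inp, out)
        os.chmod(tmp, mode)
        os.replace(tmp, dest)  # atomic on the same filesystem
    except BaseException:
        os.unlink(tmp)
        raise
```

The rename-into-place step is why systemd never observes a half-written unit, even though `daemon-reload` runs immediately afterwards.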
Nov 28 08:14:15 np0005538513.localdomain sudo[62147]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qecpoaeqwfygckkifaewotreozomdrji ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:15 np0005538513.localdomain sudo[62147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:15 np0005538513.localdomain python3[62149]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:14:15 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:15 np0005538513.localdomain systemd-sysv-generator[62179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:15 np0005538513.localdomain systemd-rc-local-generator[62176]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:15 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Activating special unit Exit the Session...
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Stopped target Main User Target.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Stopped target Basic System.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Stopped target Paths.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Stopped target Sockets.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Stopped target Timers.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Closed D-Bus User Message Bus Socket.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Removed slice User Application Slice.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Reached target Shutdown.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Finished Exit the Session.
Nov 28 08:14:16 np0005538513.localdomain systemd[60679]: Reached target Exit the Session.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 08:14:16 np0005538513.localdomain sudo[62147]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 0.
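After copying the unit files, the playbook issues one `ansible-systemd` task with `daemon_reload=True` (the `systemd[1]: Reloading.` above), then per-service tasks with `state=restarted enabled=True`, which is why a `Reloading.` line precedes each `Starting ... container` below. A sketch of the `systemctl` calls such a task maps to (assumed ordering, not the actual module internals):

```python
def systemd_commands(daemon_reload=False, name=None, state=None, enabled=None):
    """Sketch of how ansible-systemd parameters translate into systemctl
    invocations, returned as argument lists in execution order."""
    cmds = []
    if daemon_reload:
        cmds.append(["systemctl", "daemon-reload"])
    if name and enabled is True:
        cmds.append(["systemctl", "enable", name])
    if name and state == "restarted":
        cmds.append(["systemctl", "restart", name])
    return cmds
```

For example, the `tripleo_collectd.service` task below would reduce to an enable followed by a restart, with the daemon-reload already done by the preceding task.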
Nov 28 08:14:16 np0005538513.localdomain sudo[62200]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlfzbljlawcggtdjgaiomlfxnzlsnabs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:16 np0005538513.localdomain sudo[62200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:16 np0005538513.localdomain python3[62202]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:16 np0005538513.localdomain systemd-sysv-generator[62231]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:16 np0005538513.localdomain systemd-rc-local-generator[62227]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:17 np0005538513.localdomain systemd[1]: Starting collectd container...
Nov 28 08:14:17 np0005538513.localdomain systemd[1]: Started collectd container.
Nov 28 08:14:17 np0005538513.localdomain sudo[62200]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:17 np0005538513.localdomain sudo[62267]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqweenppfjrfgrggmhbnyjqloxfoicsb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:17 np0005538513.localdomain sudo[62267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:17 np0005538513.localdomain python3[62269]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:17 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:17 np0005538513.localdomain systemd-sysv-generator[62298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:17 np0005538513.localdomain systemd-rc-local-generator[62294]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:18 np0005538513.localdomain systemd[1]: Starting iscsid container...
Nov 28 08:14:18 np0005538513.localdomain systemd[1]: Started iscsid container.
Nov 28 08:14:18 np0005538513.localdomain sudo[62267]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:18 np0005538513.localdomain sudo[62334]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddhqnyawonwtuklgpbekwdeoigpstrsf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:18 np0005538513.localdomain sudo[62334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:18 np0005538513.localdomain python3[62336]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:18 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:18 np0005538513.localdomain systemd-sysv-generator[62364]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:18 np0005538513.localdomain systemd-rc-local-generator[62360]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:18 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:19 np0005538513.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Nov 28 08:14:19 np0005538513.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Nov 28 08:14:19 np0005538513.localdomain sudo[62334]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:19 np0005538513.localdomain sudo[62401]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyywihtedhubvjjvlazjglmoupcnnkeh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:19 np0005538513.localdomain sudo[62401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:19 np0005538513.localdomain python3[62403]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:20 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:20 np0005538513.localdomain systemd-sysv-generator[62436]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:20 np0005538513.localdomain systemd-rc-local-generator[62432]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:21 np0005538513.localdomain systemd[1]: Starting nova_virtnodedevd container...
Nov 28 08:14:21 np0005538513.localdomain tripleo-start-podman-container[62443]: Creating additional drop-in dependency for "nova_virtnodedevd" (6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265)
Nov 28 08:14:21 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:21 np0005538513.localdomain systemd-rc-local-generator[62501]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:21 np0005538513.localdomain systemd-sysv-generator[62504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:21 np0005538513.localdomain systemd[1]: Started nova_virtnodedevd container.
Nov 28 08:14:21 np0005538513.localdomain sudo[62401]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:21 np0005538513.localdomain sudo[62526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwsblnxqogrmwjaqcbtuexewrjruujpu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:21 np0005538513.localdomain sudo[62526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:22 np0005538513.localdomain python3[62528]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:22 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:22 np0005538513.localdomain systemd-rc-local-generator[62553]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:22 np0005538513.localdomain systemd-sysv-generator[62556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:22 np0005538513.localdomain systemd[1]: Starting nova_virtproxyd container...
Nov 28 08:14:22 np0005538513.localdomain tripleo-start-podman-container[62567]: Creating additional drop-in dependency for "nova_virtproxyd" (76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50)
Nov 28 08:14:22 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:22 np0005538513.localdomain systemd-sysv-generator[62628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:22 np0005538513.localdomain systemd-rc-local-generator[62625]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:22 np0005538513.localdomain systemd[1]: Started nova_virtproxyd container.
Nov 28 08:14:22 np0005538513.localdomain sudo[62526]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:23 np0005538513.localdomain sudo[62650]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpzoiyzncmnqsdsszmghhipnkwwedgic ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:23 np0005538513.localdomain sudo[62650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:23 np0005538513.localdomain python3[62652]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:23 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:23 np0005538513.localdomain systemd-sysv-generator[62682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:23 np0005538513.localdomain systemd-rc-local-generator[62677]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:23 np0005538513.localdomain systemd[1]: Starting nova_virtqemud container...
Nov 28 08:14:23 np0005538513.localdomain tripleo-start-podman-container[62692]: Creating additional drop-in dependency for "nova_virtqemud" (60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057)
Nov 28 08:14:23 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:24 np0005538513.localdomain systemd-sysv-generator[62753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:24 np0005538513.localdomain systemd-rc-local-generator[62749]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:24 np0005538513.localdomain systemd[1]: Started nova_virtqemud container.
Nov 28 08:14:24 np0005538513.localdomain sudo[62650]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:24 np0005538513.localdomain sudo[62773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcqugazessqujhgesrreskmojqsyexpf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:24 np0005538513.localdomain sudo[62773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:25 np0005538513.localdomain python3[62775]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:25 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:25 np0005538513.localdomain systemd-sysv-generator[62806]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:25 np0005538513.localdomain systemd-rc-local-generator[62800]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:25 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:25 np0005538513.localdomain systemd[1]: Starting nova_virtsecretd container...
Nov 28 08:14:25 np0005538513.localdomain tripleo-start-podman-container[62814]: Creating additional drop-in dependency for "nova_virtsecretd" (2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76)
Nov 28 08:14:25 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:25 np0005538513.localdomain systemd-rc-local-generator[62868]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:25 np0005538513.localdomain systemd-sysv-generator[62874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:25 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:25 np0005538513.localdomain systemd[1]: Started nova_virtsecretd container.
Nov 28 08:14:25 np0005538513.localdomain sudo[62773]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:26 np0005538513.localdomain sudo[62895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adibupytfycxpjdjkoltlgympapkwcjv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:26 np0005538513.localdomain sudo[62895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:26 np0005538513.localdomain python3[62897]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:26 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:26 np0005538513.localdomain systemd-sysv-generator[62925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:26 np0005538513.localdomain systemd-rc-local-generator[62921]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:26 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:26 np0005538513.localdomain systemd[1]: Starting nova_virtstoraged container...
Nov 28 08:14:27 np0005538513.localdomain tripleo-start-podman-container[62937]: Creating additional drop-in dependency for "nova_virtstoraged" (635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951)
Nov 28 08:14:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:27 np0005538513.localdomain systemd-rc-local-generator[62995]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:27 np0005538513.localdomain systemd-sysv-generator[62998]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:27 np0005538513.localdomain systemd[1]: Started nova_virtstoraged container.
Nov 28 08:14:27 np0005538513.localdomain sudo[62895]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:27 np0005538513.localdomain sudo[63020]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfecmbjgwuchfrejopflhxwwnikiqjwd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:27 np0005538513.localdomain sudo[63020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:27 np0005538513.localdomain python3[63022]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:14:28 np0005538513.localdomain systemd-sysv-generator[63053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:28 np0005538513.localdomain systemd-rc-local-generator[63049]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:28 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:28 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:28 np0005538513.localdomain podman[63061]: 2025-11-28 08:14:28.454550936 +0000 UTC m=+0.118281470 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red 
Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, release=1761123044, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, config_id=tripleo_step3)
Nov 28 08:14:28 np0005538513.localdomain podman[63061]: 2025-11-28 08:14:28.465828027 +0000 UTC m=+0.129558551 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, 
version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 08:14:28 np0005538513.localdomain podman[63061]: rsyslog
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:28 np0005538513.localdomain sudo[63081]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:28 np0005538513.localdomain sudo[63081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:28 np0005538513.localdomain sudo[63020]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:28 np0005538513.localdomain sudo[63081]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully.
Nov 28 08:14:28 np0005538513.localdomain podman[63098]: 2025-11-28 08:14:28.634159219 +0000 UTC m=+0.050897231 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, vcs-type=git)
Nov 28 08:14:28 np0005538513.localdomain podman[63098]: 2025-11-28 08:14:28.660991279 +0000 UTC m=+0.077729261 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, tcib_managed=true, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, name=rhosp17/openstack-rsyslog, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:28 np0005538513.localdomain podman[63110]: 2025-11-28 08:14:28.747010214 +0000 UTC m=+0.061713607 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=rsyslog, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container)
Nov 28 08:14:28 np0005538513.localdomain podman[63110]: rsyslog
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:28 np0005538513.localdomain sudo[63134]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cklxqcbprhkxsroezbsjupoglzwvysbj ; /usr/bin/python3
Nov 28 08:14:28 np0005538513.localdomain sudo[63134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:28 np0005538513.localdomain python3[63137]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:28 np0005538513.localdomain sudo[63134]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Nov 28 08:14:28 np0005538513.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:29 np0005538513.localdomain podman[63138]: 2025-11-28 08:14:29.094445474 +0000 UTC m=+0.083399033 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, vcs-type=git, container_name=rsyslog, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 08:14:29 np0005538513.localdomain podman[63138]: 2025-11-28 08:14:29.10306956 +0000 UTC m=+0.092023159 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:29 np0005538513.localdomain podman[63138]: rsyslog
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:29 np0005538513.localdomain sudo[63158]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:29 np0005538513.localdomain sudo[63158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:29 np0005538513.localdomain sudo[63158]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully.
Nov 28 08:14:29 np0005538513.localdomain podman[63161]: 2025-11-28 08:14:29.2619651 +0000 UTC m=+0.050376124 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:14:29 np0005538513.localdomain podman[63161]: 2025-11-28 08:14:29.289973197 +0000 UTC m=+0.078384191 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:29 np0005538513.localdomain podman[63187]: 2025-11-28 08:14:29.374640579 +0000 UTC m=+0.058084941 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 28 08:14:29 np0005538513.localdomain podman[63187]: rsyslog
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17-merged.mount: Deactivated successfully.
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f-userdata-shm.mount: Deactivated successfully.
Nov 28 08:14:29 np0005538513.localdomain sudo[63231]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugkrurtesltdxoqnyjjyeeuldjyycxcf ; /usr/bin/python3
Nov 28 08:14:29 np0005538513.localdomain sudo[63231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:29 np0005538513.localdomain sudo[63231]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:29 np0005538513.localdomain podman[63234]: 2025-11-28 08:14:29.624627857 +0000 UTC m=+0.108969382 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:29 np0005538513.localdomain podman[63234]: 2025-11-28 08:14:29.633172891 +0000 UTC m=+0.117514456 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, release=1761123044, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git)
Nov 28 08:14:29 np0005538513.localdomain podman[63234]: rsyslog
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:29 np0005538513.localdomain sudo[63253]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:29 np0005538513.localdomain sudo[63253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:29 np0005538513.localdomain sudo[63253]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully.
Nov 28 08:14:29 np0005538513.localdomain podman[63280]: 2025-11-28 08:14:29.776507522 +0000 UTC m=+0.050921682 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:14:29 np0005538513.localdomain podman[63280]: 2025-11-28 08:14:29.799346673 +0000 UTC m=+0.073760793 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:29 np0005538513.localdomain sudo[63309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsfkuvpxyprtgkhxcxbovzhzfiqcuzci ; /usr/bin/python3
Nov 28 08:14:29 np0005538513.localdomain sudo[63309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:29 np0005538513.localdomain podman[63312]: 2025-11-28 08:14:29.894724659 +0000 UTC m=+0.063242037 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=)
Nov 28 08:14:29 np0005538513.localdomain podman[63312]: rsyslog
Nov 28 08:14:29 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:29 np0005538513.localdomain sudo[63309]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:30 np0005538513.localdomain sudo[63353]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egcqyayxqmlvybxgpdjjkhbcurlelduv ; /usr/bin/python3
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:30 np0005538513.localdomain sudo[63353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:30 np0005538513.localdomain podman[63356]: 2025-11-28 08:14:30.366316996 +0000 UTC m=+0.111553305 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4)
Nov 28 08:14:30 np0005538513.localdomain podman[63356]: 2025-11-28 08:14:30.375127128 +0000 UTC m=+0.120363427 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container)
Nov 28 08:14:30 np0005538513.localdomain podman[63356]: rsyslog
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:30 np0005538513.localdomain sudo[63376]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:30 np0005538513.localdomain sudo[63376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:30 np0005538513.localdomain python3[63355]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005538513 step=3 update_config_hash_only=False
Nov 28 08:14:30 np0005538513.localdomain sudo[63353]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:30 np0005538513.localdomain sudo[63376]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully.
Nov 28 08:14:30 np0005538513.localdomain podman[63379]: 2025-11-28 08:14:30.516700842 +0000 UTC m=+0.040275870 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: tmp-crun.t8EM5O.mount: Deactivated successfully.
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f-userdata-shm.mount: Deactivated successfully.
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17-merged.mount: Deactivated successfully.
Nov 28 08:14:30 np0005538513.localdomain podman[63379]: 2025-11-28 08:14:30.549441851 +0000 UTC m=+0.073016829 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, container_name=rsyslog, name=rhosp17/openstack-rsyslog, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:30 np0005538513.localdomain podman[63393]: 2025-11-28 08:14:30.627552454 +0000 UTC m=+0.054713043 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, container_name=rsyslog, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, build-date=2025-11-18T22:49:49Z)
Nov 28 08:14:30 np0005538513.localdomain podman[63393]: rsyslog
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:30 np0005538513.localdomain podman[63405]: 2025-11-28 08:14:30.848321086 +0000 UTC m=+0.093311720 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, release=1761123044)
Nov 28 08:14:30 np0005538513.localdomain podman[63405]: 2025-11-28 08:14:30.857290703 +0000 UTC m=+0.102281327 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 rsyslog, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, container_name=rsyslog)
Nov 28 08:14:30 np0005538513.localdomain podman[63405]: rsyslog
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:30 np0005538513.localdomain sudo[63424]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:30 np0005538513.localdomain sudo[63424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:30 np0005538513.localdomain sudo[63424]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:30 np0005538513.localdomain systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully.
Nov 28 08:14:30 np0005538513.localdomain sudo[63441]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uchotrektsljibnmnbywdpatversxzuk ; /usr/bin/python3
Nov 28 08:14:30 np0005538513.localdomain sudo[63441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:31 np0005538513.localdomain podman[63440]: 2025-11-28 08:14:31.009942223 +0000 UTC m=+0.054200398 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, version=17.1.12, 
io.openshift.expose-services=, url=https://www.redhat.com, container_name=rsyslog, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, release=1761123044)
Nov 28 08:14:31 np0005538513.localdomain podman[63440]: 2025-11-28 08:14:31.035429039 +0000 UTC m=+0.079687184 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, version=17.1.12, release=1761123044, url=https://www.redhat.com, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true)
Nov 28 08:14:31 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:31 np0005538513.localdomain podman[63456]: 2025-11-28 08:14:31.145859016 +0000 UTC m=+0.078742563 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, 
container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, distribution-scope=public)
Nov 28 08:14:31 np0005538513.localdomain podman[63456]: rsyslog
Nov 28 08:14:31 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:31 np0005538513.localdomain python3[63454]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:31 np0005538513.localdomain sudo[63441]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:31 np0005538513.localdomain sudo[63481]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdipstocwnkmkyuikbderjreufsozzaf ; /usr/bin/python3
Nov 28 08:14:31 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Nov 28 08:14:31 np0005538513.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:31 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Nov 28 08:14:31 np0005538513.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:31 np0005538513.localdomain systemd[1]: Failed to start rsyslog container.
Nov 28 08:14:31 np0005538513.localdomain sudo[63481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:31 np0005538513.localdomain python3[63483]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:14:31 np0005538513.localdomain sudo[63481]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:14:32 np0005538513.localdomain podman[63484]: 2025-11-28 08:14:32.83641877 +0000 UTC m=+0.070350255 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:14:33 np0005538513.localdomain podman[63484]: 2025-11-28 08:14:33.031154368 +0000 UTC m=+0.265085823 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044)
Nov 28 08:14:33 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:14:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:14:33 np0005538513.localdomain podman[63513]: 2025-11-28 08:14:33.843738807 +0000 UTC m=+0.082619797 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64)
Nov 28 08:14:33 np0005538513.localdomain podman[63513]: 2025-11-28 08:14:33.85941667 +0000 UTC m=+0.098297660 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 28 08:14:33 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:14:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:14:34 np0005538513.localdomain systemd[1]: tmp-crun.yo1hfP.mount: Deactivated successfully.
Nov 28 08:14:34 np0005538513.localdomain podman[63533]: 2025-11-28 08:14:34.879357891 +0000 UTC m=+0.086201652 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:14:34 np0005538513.localdomain podman[63533]: 2025-11-28 08:14:34.920409536 +0000 UTC m=+0.127253257 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, container_name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:14:34 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:14:48 np0005538513.localdomain sudo[63553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:14:48 np0005538513.localdomain sudo[63553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:14:48 np0005538513.localdomain sudo[63553]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:48 np0005538513.localdomain sudo[63568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:14:48 np0005538513.localdomain sudo[63568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:14:49 np0005538513.localdomain sudo[63568]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:49 np0005538513.localdomain sudo[63616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:14:49 np0005538513.localdomain sudo[63616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:14:49 np0005538513.localdomain sudo[63616]: pam_unix(sudo:session): session closed for user root
Nov 28 08:15:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:15:03 np0005538513.localdomain systemd[1]: tmp-crun.PhwKBI.mount: Deactivated successfully.
Nov 28 08:15:03 np0005538513.localdomain podman[63631]: 2025-11-28 08:15:03.863100305 +0000 UTC m=+0.096959789 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:15:04 np0005538513.localdomain podman[63631]: 2025-11-28 08:15:04.073683399 +0000 UTC m=+0.307542903 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=)
Nov 28 08:15:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:15:04 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:15:04 np0005538513.localdomain systemd[1]: tmp-crun.1kaJ0p.mount: Deactivated successfully.
Nov 28 08:15:04 np0005538513.localdomain podman[63660]: 2025-11-28 08:15:04.193255481 +0000 UTC m=+0.087343560 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container)
Nov 28 08:15:04 np0005538513.localdomain podman[63660]: 2025-11-28 08:15:04.205405578 +0000 UTC m=+0.099493647 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.component=openstack-collectd-container, vcs-type=git, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:15:04 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:15:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:15:05 np0005538513.localdomain podman[63681]: 2025-11-28 08:15:05.841343462 +0000 UTC m=+0.079284788 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:15:05 np0005538513.localdomain podman[63681]: 2025-11-28 08:15:05.878471437 +0000 UTC m=+0.116412733 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12)
Nov 28 08:15:05 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:15:24 np0005538513.localdomain sshd[63699]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:15:25 np0005538513.localdomain sshd[63699]: Invalid user node from 193.32.162.146 port 49234
Nov 28 08:15:25 np0005538513.localdomain sshd[63699]: Connection closed by invalid user node 193.32.162.146 port 49234 [preauth]
Nov 28 08:15:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:15:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:15:35 np0005538513.localdomain podman[63701]: 2025-11-28 08:15:35.378177356 +0000 UTC m=+0.066781229 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:15:35 np0005538513.localdomain podman[63701]: 2025-11-28 08:15:35.387983041 +0000 UTC m=+0.076586884 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, release=1761123044)
Nov 28 08:15:35 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:15:35 np0005538513.localdomain podman[63702]: 2025-11-28 08:15:35.392815371 +0000 UTC m=+0.073895800 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044)
Nov 28 08:15:35 np0005538513.localdomain podman[63702]: 2025-11-28 08:15:35.585682744 +0000 UTC m=+0.266763233 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, release=1761123044)
Nov 28 08:15:35 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:15:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:15:36 np0005538513.localdomain systemd[1]: tmp-crun.j22Ket.mount: Deactivated successfully.
Nov 28 08:15:36 np0005538513.localdomain podman[63752]: 2025-11-28 08:15:36.845215272 +0000 UTC m=+0.080153705 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:15:36 np0005538513.localdomain podman[63752]: 2025-11-28 08:15:36.852916872 +0000 UTC m=+0.087855315 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Nov 28 08:15:36 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:15:50 np0005538513.localdomain sudo[63771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:15:50 np0005538513.localdomain sudo[63771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:15:50 np0005538513.localdomain sudo[63771]: pam_unix(sudo:session): session closed for user root
Nov 28 08:15:50 np0005538513.localdomain sudo[63786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:15:50 np0005538513.localdomain sudo[63786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:15:50 np0005538513.localdomain sudo[63786]: pam_unix(sudo:session): session closed for user root
Nov 28 08:15:51 np0005538513.localdomain sudo[63834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:15:51 np0005538513.localdomain sudo[63834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:15:51 np0005538513.localdomain sudo[63834]: pam_unix(sudo:session): session closed for user root
Nov 28 08:16:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:16:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:16:05 np0005538513.localdomain podman[63849]: 2025-11-28 08:16:05.829801189 +0000 UTC m=+0.073801366 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, 
container_name=collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:16:05 np0005538513.localdomain systemd[1]: tmp-crun.VJV5i6.mount: Deactivated successfully.
Nov 28 08:16:05 np0005538513.localdomain podman[63850]: 2025-11-28 08:16:05.864390036 +0000 UTC m=+0.099409994 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, release=1761123044, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 28 08:16:05 np0005538513.localdomain podman[63849]: 2025-11-28 08:16:05.869364051 +0000 UTC m=+0.113364188 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:16:05 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:16:06 np0005538513.localdomain podman[63850]: 2025-11-28 08:16:06.057618269 +0000 UTC m=+0.292638197 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:16:06 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:16:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:16:07 np0005538513.localdomain podman[63897]: 2025-11-28 08:16:07.845664226 +0000 UTC m=+0.084280823 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, architecture=x86_64, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:16:07 np0005538513.localdomain podman[63897]: 2025-11-28 08:16:07.857485545 +0000 UTC m=+0.096102142 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public)
Nov 28 08:16:07 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:16:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:16:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:16:36 np0005538513.localdomain podman[63918]: 2025-11-28 08:16:36.8410722 +0000 UTC m=+0.075998535 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, 
com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, vcs-type=git, url=https://www.redhat.com)
Nov 28 08:16:36 np0005538513.localdomain podman[63917]: 2025-11-28 08:16:36.896985521 +0000 UTC m=+0.132928449 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:16:36 np0005538513.localdomain podman[63917]: 2025-11-28 08:16:36.90565693 +0000 UTC m=+0.141599888 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container)
Nov 28 08:16:36 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:16:37 np0005538513.localdomain podman[63918]: 2025-11-28 08:16:37.032369794 +0000 UTC m=+0.267296129 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:16:37 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:16:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:16:38 np0005538513.localdomain podman[63966]: 2025-11-28 08:16:38.824987433 +0000 UTC m=+0.066447699 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step3)
Nov 28 08:16:38 np0005538513.localdomain podman[63966]: 2025-11-28 08:16:38.862377687 +0000 UTC m=+0.103837943 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:16:38 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:16:51 np0005538513.localdomain sudo[63985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:16:51 np0005538513.localdomain sudo[63985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:16:51 np0005538513.localdomain sudo[63985]: pam_unix(sudo:session): session closed for user root
Nov 28 08:16:51 np0005538513.localdomain sudo[64000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:16:51 np0005538513.localdomain sudo[64000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:16:52 np0005538513.localdomain sudo[64000]: pam_unix(sudo:session): session closed for user root
Nov 28 08:16:55 np0005538513.localdomain sudo[64047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:16:55 np0005538513.localdomain sudo[64047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:16:55 np0005538513.localdomain sudo[64047]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:17:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:17:07 np0005538513.localdomain systemd[1]: tmp-crun.AjsOuO.mount: Deactivated successfully.
Nov 28 08:17:07 np0005538513.localdomain podman[64062]: 2025-11-28 08:17:07.857637266 +0000 UTC m=+0.095167523 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:17:07 np0005538513.localdomain podman[64062]: 2025-11-28 08:17:07.866853253 +0000 UTC m=+0.104383460 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Nov 28 08:17:07 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:17:07 np0005538513.localdomain podman[64063]: 2025-11-28 08:17:07.958122674 +0000 UTC m=+0.194538616 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, 
maintainer=OpenStack TripleO Team, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:17:08 np0005538513.localdomain podman[64063]: 2025-11-28 08:17:08.184443456 +0000 UTC m=+0.420859388 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:17:08 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:17:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:17:09 np0005538513.localdomain podman[64111]: 2025-11-28 08:17:09.851404256 +0000 UTC m=+0.081061574 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:17:09 np0005538513.localdomain podman[64111]: 2025-11-28 08:17:09.8643839 +0000 UTC m=+0.094041198 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4)
Nov 28 08:17:09 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:17:28 np0005538513.localdomain sshd[64130]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:17:28 np0005538513.localdomain sshd[64130]: Invalid user solana from 193.32.162.146 port 35816
Nov 28 08:17:28 np0005538513.localdomain sshd[64130]: Connection closed by invalid user solana 193.32.162.146 port 35816 [preauth]
Nov 28 08:17:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:17:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:17:38 np0005538513.localdomain podman[64132]: 2025-11-28 08:17:38.8297433 +0000 UTC m=+0.065817659 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, maintainer=OpenStack 
TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, release=1761123044, version=17.1.12)
Nov 28 08:17:38 np0005538513.localdomain podman[64132]: 2025-11-28 08:17:38.842339821 +0000 UTC m=+0.078414170 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Nov 28 08:17:38 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:17:38 np0005538513.localdomain podman[64133]: 2025-11-28 08:17:38.896148345 +0000 UTC m=+0.128919632 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:17:39 np0005538513.localdomain podman[64133]: 2025-11-28 08:17:39.114310345 +0000 UTC m=+0.347081602 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:17:39 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:17:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:17:40 np0005538513.localdomain podman[64179]: 2025-11-28 08:17:40.849631582 +0000 UTC m=+0.087791934 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, summary=Red Hat 
OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3)
Nov 28 08:17:40 np0005538513.localdomain podman[64179]: 2025-11-28 08:17:40.859364665 +0000 UTC m=+0.097525007 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:17:40 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:17:56 np0005538513.localdomain sudo[64198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:17:56 np0005538513.localdomain sudo[64198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:56 np0005538513.localdomain sudo[64198]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:56 np0005538513.localdomain sudo[64213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:17:56 np0005538513.localdomain sudo[64213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:56 np0005538513.localdomain sudo[64213]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:56 np0005538513.localdomain sudo[64260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:17:56 np0005538513.localdomain sudo[64260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:56 np0005538513.localdomain sudo[64260]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:57 np0005538513.localdomain sudo[64275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 08:17:57 np0005538513.localdomain sudo[64275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:57 np0005538513.localdomain sudo[64275]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:02 np0005538513.localdomain sudo[64308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:18:02 np0005538513.localdomain sudo[64308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:18:02 np0005538513.localdomain sudo[64308]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:18:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:18:09 np0005538513.localdomain podman[64325]: 2025-11-28 08:18:09.839867725 +0000 UTC m=+0.077106810 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, architecture=x86_64, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4)
Nov 28 08:18:09 np0005538513.localdomain systemd[1]: tmp-crun.USzITd.mount: Deactivated successfully.
Nov 28 08:18:09 np0005538513.localdomain podman[64324]: 2025-11-28 08:18:09.873722758 +0000 UTC m=+0.109864590 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Nov 28 08:18:09 np0005538513.localdomain podman[64324]: 2025-11-28 08:18:09.914647022 +0000 UTC m=+0.150788904 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:18:09 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:18:10 np0005538513.localdomain podman[64325]: 2025-11-28 08:18:10.039446606 +0000 UTC m=+0.276685671 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:18:10 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:18:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:18:11 np0005538513.localdomain systemd[1]: tmp-crun.1VIiWK.mount: Deactivated successfully.
Nov 28 08:18:11 np0005538513.localdomain podman[64374]: 2025-11-28 08:18:11.823724416 +0000 UTC m=+0.062878118 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, architecture=x86_64, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Nov 28 08:18:11 np0005538513.localdomain podman[64374]: 2025-11-28 08:18:11.858129067 +0000 UTC m=+0.097282839 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:18:11 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:18:28 np0005538513.localdomain sudo[64439]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kveuqvyzakvrzuupzrpdxiripwcyduym ; /usr/bin/python3
Nov 28 08:18:28 np0005538513.localdomain sudo[64439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:28 np0005538513.localdomain python3[64441]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:28 np0005538513.localdomain sudo[64439]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:28 np0005538513.localdomain sudo[64484]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxtruaibnpkxfitnrrscuecpaocpyjsl ; /usr/bin/python3
Nov 28 08:18:28 np0005538513.localdomain sudo[64484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:28 np0005538513.localdomain python3[64486]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317908.1401827-108167-57177670405838/source _original_basename=tmpq7dykwvm follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:28 np0005538513.localdomain sudo[64484]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:29 np0005538513.localdomain sudo[64546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhhtwnzgmuvvlwetficirwwcgffbxyhr ; /usr/bin/python3
Nov 28 08:18:29 np0005538513.localdomain sudo[64546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:30 np0005538513.localdomain python3[64548]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:30 np0005538513.localdomain sudo[64546]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:30 np0005538513.localdomain sudo[64589]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-firyyxhhstfylcmsjxvjhdtauysrhgvb ; /usr/bin/python3
Nov 28 08:18:30 np0005538513.localdomain sudo[64589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:30 np0005538513.localdomain python3[64591]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317909.744724-108363-151236680935108/source _original_basename=tmpymj_pfvr follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:30 np0005538513.localdomain sudo[64589]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:30 np0005538513.localdomain sudo[64651]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixjrzqquvgjicvxggbrltlhltaxxxbat ; /usr/bin/python3
Nov 28 08:18:30 np0005538513.localdomain sudo[64651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:31 np0005538513.localdomain python3[64653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:31 np0005538513.localdomain sudo[64651]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:31 np0005538513.localdomain sudo[64694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boaxbwlqpgkfdkwmaihxutgczlywndic ; /usr/bin/python3
Nov 28 08:18:31 np0005538513.localdomain sudo[64694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:31 np0005538513.localdomain python3[64696]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317910.7036264-108421-207814106630004/source _original_basename=tmp3_ou5n1o follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:31 np0005538513.localdomain sudo[64694]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:31 np0005538513.localdomain sudo[64756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-walnlfwoxnuthlichotbplazdqidfowy ; /usr/bin/python3
Nov 28 08:18:31 np0005538513.localdomain sudo[64756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:32 np0005538513.localdomain python3[64758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:32 np0005538513.localdomain sudo[64756]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:32 np0005538513.localdomain sudo[64799]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njdbdqjicsjutelatjefmqcmnmnddson ; /usr/bin/python3
Nov 28 08:18:32 np0005538513.localdomain sudo[64799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:32 np0005538513.localdomain python3[64801]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317911.7271552-108483-211511023349089/source _original_basename=tmpa1i1_8_v follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:32 np0005538513.localdomain sudo[64799]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:33 np0005538513.localdomain sudo[64829]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsmrddlmqtkcdldirchnsujlhwxdjfwq ; /usr/bin/python3
Nov 28 08:18:33 np0005538513.localdomain sudo[64829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:33 np0005538513.localdomain python3[64831]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 08:18:33 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:33 np0005538513.localdomain systemd-sysv-generator[64861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:33 np0005538513.localdomain systemd-rc-local-generator[64858]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:33 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:33 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:33 np0005538513.localdomain systemd-sysv-generator[64898]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:33 np0005538513.localdomain systemd-rc-local-generator[64894]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:33 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:34 np0005538513.localdomain sudo[64829]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:34 np0005538513.localdomain sudo[64918]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emhbeaeibpshaaabtrhmgbiteqskpfcw ; /usr/bin/python3
Nov 28 08:18:34 np0005538513.localdomain sudo[64918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:18:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4541 writes, 20K keys, 4541 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4541 writes, 459 syncs, 9.89 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 297 writes, 643 keys, 297 commit groups, 1.0 writes per commit group, ingest: 0.53 MB, 0.00 MB/s
                                                          Interval WAL: 297 writes, 148 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:18:34 np0005538513.localdomain python3[64920]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:18:34 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:34 np0005538513.localdomain systemd-rc-local-generator[64943]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:34 np0005538513.localdomain systemd-sysv-generator[64946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:34 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:35 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:35 np0005538513.localdomain systemd-sysv-generator[64987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:35 np0005538513.localdomain systemd-rc-local-generator[64984]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:35 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:35 np0005538513.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Nov 28 08:18:35 np0005538513.localdomain sudo[64918]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:35 np0005538513.localdomain sudo[65008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnudqxwvvqzxkildpxplctwkbyoqygzy ; /usr/bin/python3
Nov 28 08:18:35 np0005538513.localdomain sudo[65008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:35 np0005538513.localdomain python3[65010]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 08:18:35 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:35 np0005538513.localdomain systemd-rc-local-generator[65035]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:35 np0005538513.localdomain systemd-sysv-generator[65038]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:35 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:35 np0005538513.localdomain sudo[65008]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:36 np0005538513.localdomain sudo[65092]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txjpqjmxbtmnphlglpslpnleoqkyinwl ; /usr/bin/python3
Nov 28 08:18:36 np0005538513.localdomain sudo[65092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:36 np0005538513.localdomain python3[65094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:36 np0005538513.localdomain sudo[65092]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:36 np0005538513.localdomain sudo[65135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbhirpsdhfilqrtloausjlqftkewkkrr ; /usr/bin/python3
Nov 28 08:18:36 np0005538513.localdomain sudo[65135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:36 np0005538513.localdomain python3[65137]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317916.1270037-108624-93954046964457/source _original_basename=tmp4murbgv2 follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:36 np0005538513.localdomain sudo[65135]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:37 np0005538513.localdomain sudo[65165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzqcahbswkfgadjhzpslezlianjbvgba ; /usr/bin/python3
Nov 28 08:18:37 np0005538513.localdomain sudo[65165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:37 np0005538513.localdomain python3[65167]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:18:37 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:37 np0005538513.localdomain systemd-rc-local-generator[65192]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:37 np0005538513.localdomain systemd-sysv-generator[65197]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:37 np0005538513.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Nov 28 08:18:37 np0005538513.localdomain sudo[65165]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:37 np0005538513.localdomain sudo[65220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rocwkjwyvlxbiyyxyaxeytlyngdihqbt ; /usr/bin/python3
Nov 28 08:18:37 np0005538513.localdomain sudo[65220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:38 np0005538513.localdomain python3[65222]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:18:38 np0005538513.localdomain sudo[65220]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:38 np0005538513.localdomain sudo[65270]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlbpakcxygadfrreyncekesiaswrbbdo ; /usr/bin/python3
Nov 28 08:18:38 np0005538513.localdomain sudo[65270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:38 np0005538513.localdomain sudo[65270]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:38 np0005538513.localdomain sudo[65288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgdhbxorqtrtiplxcoedmiomlndwtbxy ; /usr/bin/python3
Nov 28 08:18:38 np0005538513.localdomain sudo[65288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:39 np0005538513.localdomain sudo[65288]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:39 np0005538513.localdomain sudo[65392]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzmjkmrnjsygeuxflqyjnevkofeszhbf ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317919.2468598-108755-100979962503665/async_wrapper.py 771929046149 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317919.2468598-108755-100979962503665/AnsiballZ_command.py _
Nov 28 08:18:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:18:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.3 total, 600.0 interval
                                                          Cumulative writes: 5030 writes, 22K keys, 5030 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5030 writes, 563 syncs, 8.93 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 293 writes, 666 keys, 293 commit groups, 1.0 writes per commit group, ingest: 0.52 MB, 0.00 MB/s
                                                          Interval WAL: 293 writes, 146 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:18:39 np0005538513.localdomain sudo[65392]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:18:39 np0005538513.localdomain ansible-async_wrapper.py[65394]: Invoked with 771929046149 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317919.2468598-108755-100979962503665/AnsiballZ_command.py _
Nov 28 08:18:39 np0005538513.localdomain ansible-async_wrapper.py[65397]: Starting module and watcher
Nov 28 08:18:39 np0005538513.localdomain ansible-async_wrapper.py[65397]: Start watching 65398 (3600)
Nov 28 08:18:39 np0005538513.localdomain ansible-async_wrapper.py[65398]: Start module (65398)
Nov 28 08:18:39 np0005538513.localdomain ansible-async_wrapper.py[65394]: Return async_wrapper task started.
Nov 28 08:18:39 np0005538513.localdomain sudo[65392]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:39 np0005538513.localdomain sudo[65416]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrujyglevabpqkaegrkslsgtcviuvlxs ; /usr/bin/python3
Nov 28 08:18:39 np0005538513.localdomain sudo[65416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:18:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:18:40 np0005538513.localdomain podman[65418]: 2025-11-28 08:18:40.10168311 +0000 UTC m=+0.111343666 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, container_name=collectd, io.openshift.expose-services=, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 08:18:40 np0005538513.localdomain podman[65418]: 2025-11-28 08:18:40.113416345 +0000 UTC m=+0.123076951 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:18:40 np0005538513.localdomain python3[65419]: ansible-ansible.legacy.async_status Invoked with jid=771929046149.65394 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:18:40 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:18:40 np0005538513.localdomain sudo[65416]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:40 np0005538513.localdomain podman[65438]: 2025-11-28 08:18:40.194611722 +0000 UTC m=+0.083380046 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:18:40 np0005538513.localdomain podman[65438]: 2025-11-28 08:18:40.386336059 +0000 UTC m=+0.275104383 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:46Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com)
Nov 28 08:18:40 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:18:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:18:42 np0005538513.localdomain podman[65521]: 2025-11-28 08:18:42.462359489 +0000 UTC m=+0.078136163 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:18:42 np0005538513.localdomain podman[65521]: 2025-11-28 08:18:42.47654839 +0000 UTC m=+0.092325134 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:18:42 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (file & line not available)
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (file & line not available)
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:18:43 np0005538513.localdomain puppet-user[65404]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.20 seconds
Nov 28 08:18:44 np0005538513.localdomain ansible-async_wrapper.py[65397]: 65398 still running (3600)
Nov 28 08:18:49 np0005538513.localdomain ansible-async_wrapper.py[65397]: 65398 still running (3595)
Nov 28 08:18:50 np0005538513.localdomain sudo[65687]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfebrxzlzrncipxwxzadtqjgqwqawekv ; /usr/bin/python3
Nov 28 08:18:50 np0005538513.localdomain sudo[65687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:50 np0005538513.localdomain python3[65689]: ansible-ansible.legacy.async_status Invoked with jid=771929046149.65394 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:18:50 np0005538513.localdomain sudo[65687]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:51 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 08:18:51 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 08:18:51 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:51 np0005538513.localdomain systemd-sysv-generator[65837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:51 np0005538513.localdomain systemd-rc-local-generator[65833]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:51 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:51 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 08:18:52 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 08:18:52 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 08:18:52 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.155s CPU time.
Nov 28 08:18:52 np0005538513.localdomain systemd[1]: run-rc7fa4f6dd09f43ee9e8d0ddd10c187d0.service: Deactivated successfully.
Nov 28 08:18:52 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Nov 28 08:18:52 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}2ccb0e8433cdc7a879c38994ca54c4dea57b8ee81d75485f6ddcf513209d34ed'
Nov 28 08:18:52 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Nov 28 08:18:52 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Nov 28 08:18:52 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Nov 28 08:18:52 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Nov 28 08:18:54 np0005538513.localdomain ansible-async_wrapper.py[65397]: 65398 still running (3590)
Nov 28 08:18:57 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:58 np0005538513.localdomain systemd-sysv-generator[66824]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:58 np0005538513.localdomain systemd-rc-local-generator[66821]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Nov 28 08:18:58 np0005538513.localdomain snmpd[66832]: Can't find directory of RPM packages
Nov 28 08:18:58 np0005538513.localdomain snmpd[66832]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:58 np0005538513.localdomain systemd-rc-local-generator[66859]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:58 np0005538513.localdomain systemd-sysv-generator[66863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:18:58 np0005538513.localdomain systemd-rc-local-generator[66891]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:58 np0005538513.localdomain systemd-sysv-generator[66896]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:58 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Notice: Applied catalog in 15.38 seconds
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Application:
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:    Initial environment: production
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:    Converged environment: production
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:          Run mode: user
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Changes:
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:             Total: 8
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Events:
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:           Success: 8
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:             Total: 8
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Resources:
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:         Restarted: 1
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:           Changed: 8
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:       Out of sync: 8
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:             Total: 19
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Time:
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:        Filebucket: 0.00
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:          Schedule: 0.00
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:            Augeas: 0.01
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:              File: 0.08
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:    Config retrieval: 0.26
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:           Service: 1.22
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:    Transaction evaluation: 15.37
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:    Catalog application: 15.38
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:          Last run: 1764317939
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:              Exec: 5.07
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:           Package: 8.82
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:             Total: 15.38
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]: Version:
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:            Config: 1764317923
Nov 28 08:18:59 np0005538513.localdomain puppet-user[65404]:            Puppet: 7.10.0
Nov 28 08:18:59 np0005538513.localdomain ansible-async_wrapper.py[65398]: Module complete (65398)
Nov 28 08:18:59 np0005538513.localdomain ansible-async_wrapper.py[65397]: Done in kid B.
Nov 28 08:19:00 np0005538513.localdomain sudo[66918]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raisghnjpaeomrafmxxuyeriuwrgszpr ; /usr/bin/python3
Nov 28 08:19:00 np0005538513.localdomain sudo[66918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:00 np0005538513.localdomain python3[66920]: ansible-ansible.legacy.async_status Invoked with jid=771929046149.65394 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:19:00 np0005538513.localdomain sudo[66918]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:01 np0005538513.localdomain sudo[66934]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luupwsnwqjfwsuhbvrjnesxxyapudgjf ; /usr/bin/python3
Nov 28 08:19:01 np0005538513.localdomain sudo[66934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:01 np0005538513.localdomain python3[66936]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:19:01 np0005538513.localdomain sudo[66934]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:01 np0005538513.localdomain sudo[66950]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gylmazaaaqtygokxhhrzwsckipikkoxg ; /usr/bin/python3
Nov 28 08:19:01 np0005538513.localdomain sudo[66950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:01 np0005538513.localdomain python3[66952]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:01 np0005538513.localdomain sudo[66950]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538513.localdomain sudo[67000]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oegyjyzhqiomfpdbuwavybkvrfayfswc ; /usr/bin/python3
Nov 28 08:19:02 np0005538513.localdomain sudo[67000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:02 np0005538513.localdomain python3[67002]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:02 np0005538513.localdomain sudo[67000]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538513.localdomain sudo[67018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxkyejztcdrlijwtqkugxhlcfmrnkotz ; /usr/bin/python3
Nov 28 08:19:02 np0005538513.localdomain sudo[67018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:02 np0005538513.localdomain python3[67020]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpa0718uv6 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:19:02 np0005538513.localdomain sudo[67018]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538513.localdomain sudo[67035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:19:02 np0005538513.localdomain sudo[67035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:02 np0005538513.localdomain sudo[67035]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538513.localdomain sudo[67062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mphqpqsvadjpuvvxdjqpvqggdsizlsgs ; /usr/bin/python3
Nov 28 08:19:02 np0005538513.localdomain sudo[67062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:02 np0005538513.localdomain sudo[67065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:19:02 np0005538513.localdomain sudo[67065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:02 np0005538513.localdomain python3[67067]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:02 np0005538513.localdomain sudo[67062]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:03 np0005538513.localdomain sudo[67101]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wniynwmfxxdfzwezwyukkqvyjicldctb ; /usr/bin/python3
Nov 28 08:19:03 np0005538513.localdomain sudo[67101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:03 np0005538513.localdomain sudo[67065]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:03 np0005538513.localdomain sudo[67119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:19:03 np0005538513.localdomain sudo[67119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:03 np0005538513.localdomain sudo[67119]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:03 np0005538513.localdomain sudo[67135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:19:03 np0005538513.localdomain sudo[67135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:03 np0005538513.localdomain sudo[67101]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:03 np0005538513.localdomain sudo[67249]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wboemshsqgbnsgrpvhbyofbsoanumeaq ; /usr/bin/python3
Nov 28 08:19:03 np0005538513.localdomain sudo[67249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:03 np0005538513.localdomain sudo[67135]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538513.localdomain python3[67253]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:19:04 np0005538513.localdomain sudo[67268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:19:04 np0005538513.localdomain sudo[67268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:04 np0005538513.localdomain sudo[67268]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538513.localdomain sudo[67249]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538513.localdomain sudo[67284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 08:19:04 np0005538513.localdomain sudo[67284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:04 np0005538513.localdomain sudo[67335]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcikrwssnfbgqnhrwsykwsypkygdnwqr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:04 np0005538513.localdomain sudo[67335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:04 np0005538513.localdomain python3[67341]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:04 np0005538513.localdomain sudo[67335]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538513.localdomain podman[67356]: 
Nov 28 08:19:04 np0005538513.localdomain podman[67356]: 2025-11-28 08:19:04.889490985 +0000 UTC m=+0.080224158 container create b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55)
Nov 28 08:19:04 np0005538513.localdomain systemd[1]: Started libpod-conmon-b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f.scope.
Nov 28 08:19:04 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:04 np0005538513.localdomain podman[67356]: 2025-11-28 08:19:04.855490787 +0000 UTC m=+0.046224010 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:19:04 np0005538513.localdomain podman[67356]: 2025-11-28 08:19:04.964378306 +0000 UTC m=+0.155111489 container init b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, vendor=Red Hat, Inc., release=553, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=)
Nov 28 08:19:04 np0005538513.localdomain systemd[1]: tmp-crun.fWxGSB.mount: Deactivated successfully.
Nov 28 08:19:04 np0005538513.localdomain podman[67356]: 2025-11-28 08:19:04.977787802 +0000 UTC m=+0.168520975 container start b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, io.openshift.expose-services=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, distribution-scope=public)
Nov 28 08:19:04 np0005538513.localdomain podman[67356]: 2025-11-28 08:19:04.978206476 +0000 UTC m=+0.168939659 container attach b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, CEPH_POINT_RELEASE=)
Nov 28 08:19:04 np0005538513.localdomain sleepy_blackwell[67372]: 167 167
Nov 28 08:19:04 np0005538513.localdomain systemd[1]: libpod-b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f.scope: Deactivated successfully.
Nov 28 08:19:04 np0005538513.localdomain podman[67356]: 2025-11-28 08:19:04.982837979 +0000 UTC m=+0.173571162 container died b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, GIT_BRANCH=main, name=rhceph, version=7, vcs-type=git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Nov 28 08:19:05 np0005538513.localdomain podman[67377]: 2025-11-28 08:19:05.083212124 +0000 UTC m=+0.088480196 container remove b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph)
Nov 28 08:19:05 np0005538513.localdomain sudo[67402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcbmqqzhzhfsakybfxqjlyzinpbbzkdr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:05 np0005538513.localdomain systemd[1]: libpod-conmon-b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f.scope: Deactivated successfully.
Nov 28 08:19:05 np0005538513.localdomain sudo[67402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:05 np0005538513.localdomain sudo[67402]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:05 np0005538513.localdomain podman[67416]: 
Nov 28 08:19:05 np0005538513.localdomain podman[67416]: 2025-11-28 08:19:05.300768345 +0000 UTC m=+0.083646815 container create a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 08:19:05 np0005538513.localdomain systemd[1]: Started libpod-conmon-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope.
Nov 28 08:19:05 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:05 np0005538513.localdomain podman[67416]: 2025-11-28 08:19:05.264283389 +0000 UTC m=+0.047161929 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:19:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:05 np0005538513.localdomain podman[67416]: 2025-11-28 08:19:05.371668481 +0000 UTC m=+0.154546951 container init a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 08:19:05 np0005538513.localdomain podman[67416]: 2025-11-28 08:19:05.383446397 +0000 UTC m=+0.166324877 container start a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, RELEASE=main, build-date=2025-09-24T08:57:55, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=)
Nov 28 08:19:05 np0005538513.localdomain podman[67416]: 2025-11-28 08:19:05.383767327 +0000 UTC m=+0.166645817 container attach a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 08:19:05 np0005538513.localdomain sudo[67450]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocaxbsqttyapwfbwwiatpctwopzkvonm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:05 np0005538513.localdomain sudo[67450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:05 np0005538513.localdomain python3[67453]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:05 np0005538513.localdomain sudo[67450]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f85fc9d759a4c1c69629c73a27514be5577ab0598fee5b9a7fba520b058de24e-merged.mount: Deactivated successfully.
Nov 28 08:19:06 np0005538513.localdomain sudo[68643]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obaqpboydfnbcacwtgfjkahrrogeesjd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:06 np0005538513.localdomain sudo[68643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]: [
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:     {
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "available": false,
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "ceph_device": false,
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "lsm_data": {},
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "lvs": [],
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "path": "/dev/sr0",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "rejected_reasons": [
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "Has a FileSystem",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "Insufficient space (<5GB)"
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         ],
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         "sys_api": {
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "actuators": null,
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "device_nodes": "sr0",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "human_readable_size": "482.00 KB",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "id_bus": "ata",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "model": "QEMU DVD-ROM",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "nr_requests": "2",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "partitions": {},
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "path": "/dev/sr0",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "removable": "1",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "rev": "2.5+",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "ro": "0",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "rotational": "1",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "sas_address": "",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "sas_device_handle": "",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "scheduler_mode": "mq-deadline",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "sectors": 0,
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "sectorsize": "2048",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "size": 493568.0,
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "support_discard": "0",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "type": "disk",
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:             "vendor": "QEMU"
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:         }
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]:     }
Nov 28 08:19:06 np0005538513.localdomain exciting_mclaren[67431]: ]
Nov 28 08:19:06 np0005538513.localdomain python3[68766]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:06 np0005538513.localdomain systemd[1]: libpod-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope: Deactivated successfully.
Nov 28 08:19:06 np0005538513.localdomain systemd[1]: libpod-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope: Consumed 1.017s CPU time.
Nov 28 08:19:06 np0005538513.localdomain podman[67416]: 2025-11-28 08:19:06.383748389 +0000 UTC m=+1.166626899 container died a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=553, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 08:19:06 np0005538513.localdomain sudo[68643]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13-merged.mount: Deactivated successfully.
Nov 28 08:19:06 np0005538513.localdomain podman[69067]: 2025-11-28 08:19:06.48723778 +0000 UTC m=+0.089599580 container remove a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553)
Nov 28 08:19:06 np0005538513.localdomain sudo[69095]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-behwhtgzihigtzmclqhbskdxsjowdqyf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:06 np0005538513.localdomain systemd[1]: libpod-conmon-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope: Deactivated successfully.
Nov 28 08:19:06 np0005538513.localdomain sudo[69095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:06 np0005538513.localdomain sudo[67284]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:06 np0005538513.localdomain python3[69099]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:06 np0005538513.localdomain sudo[69095]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:07 np0005538513.localdomain sudo[69159]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsbhbqpljamlhcuejkwqvxbpvjwzmvrb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:07 np0005538513.localdomain sudo[69159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:07 np0005538513.localdomain sudo[69162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:19:07 np0005538513.localdomain sudo[69162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:07 np0005538513.localdomain sudo[69162]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:07 np0005538513.localdomain python3[69161]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:07 np0005538513.localdomain sudo[69159]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:07 np0005538513.localdomain sudo[69192]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhgddsiolxhlnthizttvdlfbciawfnss ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:07 np0005538513.localdomain sudo[69192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:07 np0005538513.localdomain python3[69194]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:07 np0005538513.localdomain sudo[69192]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:07 np0005538513.localdomain sudo[69254]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkffcaktozrperaimzrkxlngrfkaowxz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:07 np0005538513.localdomain sudo[69254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:08 np0005538513.localdomain python3[69256]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:08 np0005538513.localdomain sudo[69254]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:08 np0005538513.localdomain sudo[69272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enjclzivtrzskpuxwrfmanldllghdwou ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:08 np0005538513.localdomain sudo[69272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:08 np0005538513.localdomain python3[69274]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:08 np0005538513.localdomain sudo[69272]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:08 np0005538513.localdomain sudo[69334]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlmnumnhdqlropofmooabkmtsylfukwz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:08 np0005538513.localdomain sudo[69334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:08 np0005538513.localdomain python3[69336]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:08 np0005538513.localdomain sudo[69334]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:09 np0005538513.localdomain sudo[69352]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zinrjltftxerbnvkqijpqalipoabyixe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:09 np0005538513.localdomain sudo[69352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:09 np0005538513.localdomain python3[69354]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:09 np0005538513.localdomain sudo[69352]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:09 np0005538513.localdomain sudo[69382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxioarxhrnrczsiyopmghbvluokouxsi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:09 np0005538513.localdomain sudo[69382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:09 np0005538513.localdomain python3[69384]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:09 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:09 np0005538513.localdomain systemd-sysv-generator[69409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:09 np0005538513.localdomain systemd-rc-local-generator[69405]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:10 np0005538513.localdomain sudo[69382]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:10 np0005538513.localdomain sudo[69467]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ceajtenocbssbwdbqnasaxfirlhcixre ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:10 np0005538513.localdomain sudo[69467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:19:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:19:10 np0005538513.localdomain systemd[1]: tmp-crun.z4ehy0.mount: Deactivated successfully.
Nov 28 08:19:10 np0005538513.localdomain podman[69469]: 2025-11-28 08:19:10.60714836 +0000 UTC m=+0.114402122 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:10 np0005538513.localdomain python3[69470]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:10 np0005538513.localdomain podman[69469]: 2025-11-28 08:19:10.642147558 +0000 UTC m=+0.149401320 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, release=1761123044, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, name=rhosp17/openstack-collectd, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:19:10 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:19:10 np0005538513.localdomain sudo[69467]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:10 np0005538513.localdomain podman[69471]: 2025-11-28 08:19:10.689219344 +0000 UTC m=+0.196229819 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com)
Nov 28 08:19:10 np0005538513.localdomain sudo[69535]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncmruupjemrkiqaglbriadpdyilqlmrj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:10 np0005538513.localdomain sudo[69535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:10 np0005538513.localdomain podman[69471]: 2025-11-28 08:19:10.885489472 +0000 UTC m=+0.392499947 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:19:10 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:19:10 np0005538513.localdomain python3[69537]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:10 np0005538513.localdomain sudo[69535]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:11 np0005538513.localdomain sudo[69597]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rutxgldbgtcalisgpzehaipprpacgtiu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:11 np0005538513.localdomain sudo[69597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:11 np0005538513.localdomain python3[69599]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:11 np0005538513.localdomain sudo[69597]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:11 np0005538513.localdomain sudo[69615]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsboixphlpfiyfmmcrcjlxppezxpcxlo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:11 np0005538513.localdomain sudo[69615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:11 np0005538513.localdomain systemd[1]: tmp-crun.Ep8ky5.mount: Deactivated successfully.
Nov 28 08:19:11 np0005538513.localdomain python3[69617]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:11 np0005538513.localdomain sudo[69615]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:11 np0005538513.localdomain sudo[69645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwredgcikryohhjmwhnivqavjjcbalsx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:11 np0005538513.localdomain sudo[69645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:12 np0005538513.localdomain python3[69647]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:12 np0005538513.localdomain systemd-rc-local-generator[69673]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:12 np0005538513.localdomain systemd-sysv-generator[69677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:19:12 np0005538513.localdomain sudo[69645]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: tmp-crun.WwZGJa.mount: Deactivated successfully.
Nov 28 08:19:12 np0005538513.localdomain podman[69685]: 2025-11-28 08:19:12.61572202 +0000 UTC m=+0.091086156 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:19:12 np0005538513.localdomain podman[69685]: 2025-11-28 08:19:12.627208078 +0000 UTC m=+0.102572214 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1)
Nov 28 08:19:12 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:19:12 np0005538513.localdomain sudo[69720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpkwbmlovowaeutnjrlafqxeojhybpxt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:12 np0005538513.localdomain sudo[69720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:13 np0005538513.localdomain python3[69722]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:19:13 np0005538513.localdomain sudo[69720]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:13 np0005538513.localdomain sudo[69736]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwwclzacjngalsdfdftbnltklgspjumi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:13 np0005538513.localdomain sudo[69736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:13 np0005538513.localdomain sudo[69736]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538513.localdomain sudo[69779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uewgcrgnywxanyokybnscsonrgyuwlob ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:15 np0005538513.localdomain sudo[69779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:15 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:19:15 np0005538513.localdomain podman[69952]: 2025-11-28 08:19:15.472570981 +0000 UTC m=+0.061953839 container create b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=configure_cms_options, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:19:15 np0005538513.localdomain podman[69953]: 2025-11-28 08:19:15.499826569 +0000 UTC m=+0.083719416 container create f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red 
Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:19:15 np0005538513.localdomain podman[69972]: 2025-11-28 08:19:15.510363667 +0000 UTC m=+0.079212086 container create 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add.scope.
Nov 28 08:19:15 np0005538513.localdomain podman[69973]: 2025-11-28 08:19:15.534309402 +0000 UTC m=+0.099989553 container create bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com)
Nov 28 08:19:15 np0005538513.localdomain podman[69952]: 2025-11-28 08:19:15.436910051 +0000 UTC m=+0.026292919 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope.
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope.
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538513.localdomain podman[69953]: 2025-11-28 08:19:15.448990137 +0000 UTC m=+0.032882994 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9124600c137d24812fa12ae9a3723386ce2301b24a72f8d26d11f6206571e1d/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f3450f8dbd8f52854977f74bd961373e3aeac1471ae57db291ae89b64fa40dd/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538513.localdomain podman[69972]: 2025-11-28 08:19:15.463636823 +0000 UTC m=+0.032485242 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 28 08:19:15 np0005538513.localdomain podman[69989]: 2025-11-28 08:19:15.564182942 +0000 UTC m=+0.115461945 container create 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:19:15 np0005538513.localdomain podman[69989]: 2025-11-28 08:19:15.475182302 +0000 UTC m=+0.026461325 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope.
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1.scope.
Nov 28 08:19:15 np0005538513.localdomain podman[69973]: 2025-11-28 08:19:15.486639189 +0000 UTC m=+0.052319350 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:19:15 np0005538513.localdomain podman[69953]: 2025-11-28 08:19:15.594248307 +0000 UTC m=+0.178141154 container init f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Nov 28 08:19:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538513.localdomain podman[69989]: 2025-11-28 08:19:15.605861469 +0000 UTC m=+0.157140482 container init 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, release=1761123044, version=17.1.12, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z)
Nov 28 08:19:15 np0005538513.localdomain podman[69952]: 2025-11-28 08:19:15.609566055 +0000 UTC m=+0.198948933 container init b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:19:15 np0005538513.localdomain sudo[70042]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:15 np0005538513.localdomain sudo[70042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 08:19:15 np0005538513.localdomain podman[69989]: 2025-11-28 08:19:15.613312901 +0000 UTC m=+0.164591904 container start 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', 
'/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, version=17.1.12, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 28 08:19:15 np0005538513.localdomain podman[69989]: 2025-11-28 08:19:15.613584819 +0000 UTC m=+0.164863832 container attach 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Nov 28 08:19:15 np0005538513.localdomain podman[69952]: 2025-11-28 08:19:15.619745351 +0000 UTC m=+0.209128219 container start b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:19:15 np0005538513.localdomain podman[69952]: 2025-11-28 08:19:15.621048562 +0000 UTC m=+0.210431490 container attach b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:19:15 np0005538513.localdomain podman[69953]: 2025-11-28 08:19:15.633097997 +0000 UTC m=+0.216990844 container start f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:19:15 np0005538513.localdomain podman[69973]: 2025-11-28 08:19:15.636445091 +0000 UTC m=+0.202125282 container init bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-cron-container)
Nov 28 08:19:15 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=684be86bd5476b8c779d4769a9adf982 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:19:15 np0005538513.localdomain sudo[70065]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:15 np0005538513.localdomain sudo[70065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:19:15 np0005538513.localdomain sudo[70042]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:19:15 np0005538513.localdomain podman[69972]: 2025-11-28 08:19:15.696411768 +0000 UTC m=+0.265260177 container init 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Nov 28 08:19:15 np0005538513.localdomain ovs-vsctl[70101]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Nov 28 08:19:15 np0005538513.localdomain sudo[70103]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:15 np0005538513.localdomain sudo[70103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:15 np0005538513.localdomain sudo[70065]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538513.localdomain podman[69973]: 2025-11-28 08:19:15.724315956 +0000 UTC m=+0.289996107 container start bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:19:15 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:19:15 np0005538513.localdomain crond[70064]: (CRON) STARTUP (1.5.7)
Nov 28 08:19:15 np0005538513.localdomain crond[70064]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 20% if used.)
Nov 28 08:19:15 np0005538513.localdomain crond[70064]: (CRON) INFO (running with inotify support)
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: libpod-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add.scope: Deactivated successfully.
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:19:15 np0005538513.localdomain podman[69972]: 2025-11-28 08:19:15.76269704 +0000 UTC m=+0.331545429 container start 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, version=17.1.12, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Nov 28 08:19:15 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=684be86bd5476b8c779d4769a9adf982 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 28 08:19:15 np0005538513.localdomain podman[70055]: 2025-11-28 08:19:15.772079342 +0000 UTC m=+0.138357857 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, tcib_managed=true, release=1761123044, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:19:15 np0005538513.localdomain sudo[70103]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: libpod-4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1.scope: Deactivated successfully.
Nov 28 08:19:15 np0005538513.localdomain podman[69952]: 2025-11-28 08:19:15.800255019 +0000 UTC m=+0.389637897 container died b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=configure_cms_options, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 28 08:19:15 np0005538513.localdomain podman[70129]: 2025-11-28 08:19:15.832790361 +0000 UTC m=+0.061651639 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1)
Nov 28 08:19:15 np0005538513.localdomain podman[69989]: 2025-11-28 08:19:15.853203227 +0000 UTC m=+0.404482240 container died 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_libvirt_init_secret, distribution-scope=public, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 28 08:19:15 np0005538513.localdomain podman[70129]: 2025-11-28 08:19:15.868249295 +0000 UTC m=+0.097110553 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:19:15 np0005538513.localdomain podman[70129]: unhealthy
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'.
Nov 28 08:19:15 np0005538513.localdomain podman[70077]: 2025-11-28 08:19:15.93909018 +0000 UTC m=+0.264465752 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, 
vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1761123044, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 28 08:19:15 np0005538513.localdomain podman[70077]: 2025-11-28 08:19:15.948300007 +0000 UTC m=+0.273675799 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, release=1761123044)
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:19:15 np0005538513.localdomain podman[70055]: 2025-11-28 08:19:15.966380709 +0000 UTC m=+0.332659224 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12)
Nov 28 08:19:15 np0005538513.localdomain podman[70055]: unhealthy
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:15 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 08:19:16 np0005538513.localdomain podman[70126]: 2025-11-28 08:19:16.03712193 +0000 UTC m=+0.274229915 container cleanup b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, container_name=configure_cms_options, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO 
Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4)
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: libpod-conmon-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add.scope: Deactivated successfully.
Nov 28 08:19:16 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Nov 28 08:19:16 np0005538513.localdomain podman[70168]: 2025-11-28 08:19:16.133410688 +0000 UTC m=+0.319308829 container cleanup 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, 
distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: libpod-conmon-4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1.scope: Deactivated successfully.
Nov 28 08:19:16 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Nov 28 08:19:16 np0005538513.localdomain podman[70261]: 2025-11-28 08:19:16.211184758 +0000 UTC m=+0.144300292 container create 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, 
tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: Started libpod-conmon-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope.
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:16 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f1ef42e70cf0e1a0a8a92baa1944c6e077a6987018321471d4c334fae1280ec/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:16 np0005538513.localdomain podman[70261]: 2025-11-28 08:19:16.175272651 +0000 UTC m=+0.108388245 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:19:16 np0005538513.localdomain podman[70261]: 2025-11-28 08:19:16.282932581 +0000 UTC m=+0.216048155 container init 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Nov 28 08:19:16 np0005538513.localdomain sudo[70359]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:16 np0005538513.localdomain sudo[70359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:19:16 np0005538513.localdomain podman[70261]: 2025-11-28 08:19:16.311409697 +0000 UTC m=+0.244525241 container start 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4)
Nov 28 08:19:16 np0005538513.localdomain podman[70325]: 2025-11-28 08:19:16.319447397 +0000 UTC m=+0.122478593 container create b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=setup_ovs_manager, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=)
Nov 28 08:19:16 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:19:16 np0005538513.localdomain podman[70325]: 2025-11-28 08:19:16.237217468 +0000 UTC m=+0.040248664 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:19:16 np0005538513.localdomain sudo[70359]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:16 np0005538513.localdomain sshd[70379]: Server listening on 0.0.0.0 port 2022.
Nov 28 08:19:16 np0005538513.localdomain sshd[70379]: Server listening on :: port 2022.
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: Started libpod-conmon-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope.
Nov 28 08:19:16 np0005538513.localdomain podman[70361]: 2025-11-28 08:19:16.406224548 +0000 UTC m=+0.084792650 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:16 np0005538513.localdomain podman[70325]: 2025-11-28 08:19:16.419970836 +0000 UTC m=+0.223002072 container init b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-6bc7f6cf2f2d6fa477b99c9c15d2f85320ca24368fa72b515260201c2b251c67-merged.mount: Deactivated successfully.
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add-userdata-shm.mount: Deactivated successfully.
Nov 28 08:19:16 np0005538513.localdomain podman[70325]: 2025-11-28 08:19:16.483960088 +0000 UTC m=+0.286991304 container start b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=setup_ovs_manager, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:19:16 np0005538513.localdomain podman[70325]: 2025-11-28 08:19:16.484459393 +0000 UTC m=+0.287490629 container attach b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:19:16 np0005538513.localdomain sudo[70420]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpugdlzly0/privsep.sock
Nov 28 08:19:16 np0005538513.localdomain sudo[70420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 08:19:16 np0005538513.localdomain podman[70361]: 2025-11-28 08:19:16.756142658 +0000 UTC m=+0.434710770 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:19:16 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:19:17 np0005538513.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 28 08:19:17 np0005538513.localdomain sudo[70420]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:19 np0005538513.localdomain ovs-vsctl[70546]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: libpod-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope: Deactivated successfully.
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: libpod-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope: Consumed 2.917s CPU time.
Nov 28 08:19:19 np0005538513.localdomain podman[70325]: 2025-11-28 08:19:19.364358711 +0000 UTC m=+3.167389937 container died b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, architecture=x86_64, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42-userdata-shm.mount: Deactivated successfully.
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-be016c2b2baa2373c75faceeaeaba806753f973ba55e627044f9b5780da52a4c-merged.mount: Deactivated successfully.
Nov 28 08:19:19 np0005538513.localdomain podman[70547]: 2025-11-28 08:19:19.47099875 +0000 UTC m=+0.092477459 container cleanup b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=setup_ovs_manager, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: libpod-conmon-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope: Deactivated successfully.
Nov 28 08:19:19 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Nov 28 08:19:19 np0005538513.localdomain podman[70654]: 2025-11-28 08:19:19.905175512 +0000 UTC m=+0.072774546 container create 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044)
Nov 28 08:19:19 np0005538513.localdomain podman[70667]: 2025-11-28 08:19:19.947161689 +0000 UTC m=+0.069726111 container create 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:19 np0005538513.localdomain podman[70654]: 2025-11-28 08:19:19.865859869 +0000 UTC m=+0.033458983 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: Started libpod-conmon-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope.
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: Started libpod-conmon-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope.
Nov 28 08:19:19 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:20 np0005538513.localdomain podman[70667]: 2025-11-28 08:19:19.911983334 +0000 UTC m=+0.034547826 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:19:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:19:20 np0005538513.localdomain podman[70667]: 2025-11-28 08:19:20.039154672 +0000 UTC m=+0.161719194 container init 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent)
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:19:20 np0005538513.localdomain podman[70654]: 2025-11-28 08:19:20.056535583 +0000 UTC m=+0.224134687 container init 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team)
Nov 28 08:19:20 np0005538513.localdomain sudo[70694]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:20 np0005538513.localdomain sudo[70694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:19:20 np0005538513.localdomain podman[70667]: 2025-11-28 08:19:20.090719486 +0000 UTC m=+0.213283958 container start 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:19:20 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:19:20 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=dfc67f7a8d1f67548a53836c6db3b704 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:19:20 np0005538513.localdomain podman[70654]: 2025-11-28 08:19:20.142838829 +0000 UTC m=+0.310437893 container start 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, 
vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 08:19:20 np0005538513.localdomain python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 08:19:20 np0005538513.localdomain sudo[70694]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:20 np0005538513.localdomain podman[70699]: 2025-11-28 08:19:20.214686234 +0000 UTC m=+0.114588126 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 28 08:19:20 np0005538513.localdomain podman[70699]: 2025-11-28 08:19:20.255593468 +0000 UTC m=+0.155495380 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, version=17.1.12, build-date=2025-11-19T00:14:25Z, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:19:20 np0005538513.localdomain podman[70699]: unhealthy
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Queued start job for default target Main User Target.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Created slice User Application Slice.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Reached target Paths.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Reached target Timers.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Starting D-Bus User Message Bus Socket...
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Starting Create User's Volatile Files and Directories...
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Listening on D-Bus User Message Bus Socket.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Reached target Sockets.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Finished Create User's Volatile Files and Directories.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Reached target Basic System.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Reached target Main User Target.
Nov 28 08:19:20 np0005538513.localdomain systemd[70718]: Startup finished in 121ms.
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: Started Session c9 of User root.
Nov 28 08:19:20 np0005538513.localdomain podman[70714]: 2025-11-28 08:19:20.356104546 +0000 UTC m=+0.196161806 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true)
Nov 28 08:19:20 np0005538513.localdomain sudo[69779]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:20 np0005538513.localdomain podman[70714]: 2025-11-28 08:19:20.417186167 +0000 UTC m=+0.257243427 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:20 np0005538513.localdomain podman[70714]: unhealthy
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:20 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:19:20 np0005538513.localdomain kernel: device br-int entered promiscuous mode
Nov 28 08:19:20 np0005538513.localdomain NetworkManager[5967]: <info>  [1764317960.4865] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Nov 28 08:19:20 np0005538513.localdomain systemd-udevd[70804]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 08:19:20 np0005538513.localdomain sudo[70822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntwmmtkcudwbboumrwpjmdymdxbjhxdj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:20 np0005538513.localdomain sudo[70822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:20 np0005538513.localdomain python3[70824]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:20 np0005538513.localdomain sudo[70822]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:20 np0005538513.localdomain sudo[70838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhmioknxyrjygpiuzfeyxunfkndrqxeg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:20 np0005538513.localdomain sudo[70838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538513.localdomain python3[70840]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538513.localdomain sudo[70838]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538513.localdomain sudo[70854]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swrwzwwdptirxjwnnlfsqphumjjidrvr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538513.localdomain sudo[70854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538513.localdomain python3[70856]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538513.localdomain sudo[70854]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538513.localdomain sudo[70870]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxafxqfkidgxhwaxcclxuzpkrfwgqfzc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538513.localdomain sudo[70870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538513.localdomain python3[70872]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538513.localdomain sudo[70870]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538513.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Nov 28 08:19:21 np0005538513.localdomain systemd-udevd[70806]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 08:19:21 np0005538513.localdomain NetworkManager[5967]: <info>  [1764317961.5516] device (genev_sys_6081): carrier: link connected
Nov 28 08:19:21 np0005538513.localdomain NetworkManager[5967]: <info>  [1764317961.5525] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Nov 28 08:19:21 np0005538513.localdomain sudo[70889]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofqdjzlwhvbpwfudbvchjufyaslgxaux ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538513.localdomain sudo[70889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538513.localdomain sudo[70893]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmp87tvfyva/privsep.sock
Nov 28 08:19:21 np0005538513.localdomain sudo[70893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 28 08:19:21 np0005538513.localdomain python3[70891]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538513.localdomain sudo[70889]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538513.localdomain sudo[70908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nslyggakrzfycevwemnybvgcxucubtej ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538513.localdomain sudo[70908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538513.localdomain python3[70910]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:22 np0005538513.localdomain sudo[70908]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538513.localdomain sudo[70925]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekcxugrgtibnqvzbqlxatukkrnlvijiu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538513.localdomain sudo[70925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538513.localdomain python3[70927]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:22 np0005538513.localdomain sudo[70925]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538513.localdomain sudo[70942]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npzzhtffigxlpgnnniclppboiikbtbqz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538513.localdomain sudo[70942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538513.localdomain sudo[70893]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538513.localdomain python3[70944]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:22 np0005538513.localdomain sudo[70942]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538513.localdomain sudo[70959]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoxaqbvabqxuywkvpozrsuuoikdxhxse ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538513.localdomain sudo[70959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538513.localdomain python3[70961]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:22 np0005538513.localdomain sudo[70959]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538513.localdomain sudo[70977]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pispxwbbryhbmhuuyoufmmczycubkxgq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538513.localdomain sudo[70977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:23 np0005538513.localdomain python3[70979]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:23 np0005538513.localdomain sudo[70977]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:23 np0005538513.localdomain sudo[70993]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbmxvklwbgcbhdoebrymjpkjdaubtmaz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:23 np0005538513.localdomain sudo[70993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:23 np0005538513.localdomain python3[70995]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:23 np0005538513.localdomain sudo[70993]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:23 np0005538513.localdomain sudo[71009]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwyoxdcxigysfgqwpxrbchlomllrsfpr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:23 np0005538513.localdomain sudo[71009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:23 np0005538513.localdomain python3[71011]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:23 np0005538513.localdomain sudo[71009]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:23 np0005538513.localdomain sudo[71070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxiyqkgdlwbihhvpbupbyqixertiputj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:23 np0005538513.localdomain sudo[71070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:24 np0005538513.localdomain python3[71072]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:24 np0005538513.localdomain sudo[71070]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:24 np0005538513.localdomain sudo[71099]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeyyktcdocgdxdziioayvnibzobwxxgh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:24 np0005538513.localdomain sudo[71099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:24 np0005538513.localdomain python3[71101]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:24 np0005538513.localdomain sudo[71099]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:24 np0005538513.localdomain sudo[71128]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvqhkzptxiieeayqjwjeorlnwblglkfv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:24 np0005538513.localdomain sudo[71128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:25 np0005538513.localdomain python3[71130]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:25 np0005538513.localdomain sudo[71128]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:25 np0005538513.localdomain sudo[71157]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfxdlxjvghgpzqrjzjeckicuxhltqfal ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:25 np0005538513.localdomain sudo[71157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:25 np0005538513.localdomain python3[71159]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:25 np0005538513.localdomain sudo[71157]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:25 np0005538513.localdomain sudo[71186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixxmmwoqoqzmkunsbgwswxffqnpnypuw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:25 np0005538513.localdomain sudo[71186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:26 np0005538513.localdomain python3[71188]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:26 np0005538513.localdomain sudo[71186]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:26 np0005538513.localdomain sudo[71215]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeiezkkvbpqaipxhdmnqpjesuvsprxmz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:26 np0005538513.localdomain sudo[71215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:26 np0005538513.localdomain python3[71217]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:26 np0005538513.localdomain sudo[71215]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:26 np0005538513.localdomain sudo[71231]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bslaukgbtznpiimjhgvilmfdtndsyoqh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:26 np0005538513.localdomain sudo[71231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:26 np0005538513.localdomain python3[71233]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:19:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:27 np0005538513.localdomain systemd-rc-local-generator[71258]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:27 np0005538513.localdomain systemd-sysv-generator[71263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:27 np0005538513.localdomain sudo[71231]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:27 np0005538513.localdomain sudo[71282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llinqvohkiyepbbizsibijnucyaxscsq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:27 np0005538513.localdomain sudo[71282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:27 np0005538513.localdomain python3[71284]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:28 np0005538513.localdomain systemd-sysv-generator[71311]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:28 np0005538513.localdomain systemd-rc-local-generator[71308]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:28 np0005538513.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 08:19:28 np0005538513.localdomain tripleo-start-podman-container[71323]: Creating additional drop-in dependency for "ceilometer_agent_compute" (4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5)
Nov 28 08:19:28 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:28 np0005538513.localdomain systemd-sysv-generator[71387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:28 np0005538513.localdomain systemd-rc-local-generator[71384]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:28 np0005538513.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 28 08:19:28 np0005538513.localdomain sudo[71282]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:29 np0005538513.localdomain sudo[71407]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prbvhzaaieqmqeuujzegbqwuiokbrbpl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:29 np0005538513.localdomain sudo[71407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:29 np0005538513.localdomain python3[71409]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:29 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:29 np0005538513.localdomain systemd-rc-local-generator[71433]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:29 np0005538513.localdomain systemd-sysv-generator[71437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:29 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:29 np0005538513.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Nov 28 08:19:29 np0005538513.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Nov 28 08:19:29 np0005538513.localdomain sudo[71407]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:30 np0005538513.localdomain sudo[71475]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbytkksyigrvowujjlsevjibkrhjnxwl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:30 np0005538513.localdomain sudo[71475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:30 np0005538513.localdomain python3[71477]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:30 np0005538513.localdomain systemd-rc-local-generator[71501]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:30 np0005538513.localdomain systemd-sysv-generator[71505]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Activating special unit Exit the Session...
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Stopped target Main User Target.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Stopped target Basic System.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Stopped target Paths.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Stopped target Sockets.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Stopped target Timers.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Closed D-Bus User Message Bus Socket.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Removed slice User Application Slice.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Reached target Shutdown.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Finished Exit the Session.
Nov 28 08:19:30 np0005538513.localdomain systemd[70718]: Reached target Exit the Session.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Starting logrotate_crond container...
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 08:19:30 np0005538513.localdomain systemd[1]: Started logrotate_crond container.
Nov 28 08:19:30 np0005538513.localdomain sudo[71475]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:31 np0005538513.localdomain sudo[71543]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbwfqmclxcvwfkiqguxuieeltkziobye ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:31 np0005538513.localdomain sudo[71543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:31 np0005538513.localdomain python3[71545]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:31 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:31 np0005538513.localdomain systemd-rc-local-generator[71573]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:31 np0005538513.localdomain systemd-sysv-generator[71578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:32 np0005538513.localdomain systemd[1]: Starting nova_migration_target container...
Nov 28 08:19:32 np0005538513.localdomain systemd[1]: Started nova_migration_target container.
Nov 28 08:19:32 np0005538513.localdomain sudo[71543]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:32 np0005538513.localdomain sudo[71611]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpjphetjruppzovdgxviyfgwojhhwoet ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:32 np0005538513.localdomain sudo[71611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:32 np0005538513.localdomain python3[71613]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:32 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:32 np0005538513.localdomain systemd-rc-local-generator[71637]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:32 np0005538513.localdomain systemd-sysv-generator[71643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:33 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:33 np0005538513.localdomain systemd[1]: Starting ovn_controller container...
Nov 28 08:19:33 np0005538513.localdomain tripleo-start-podman-container[71653]: Creating additional drop-in dependency for "ovn_controller" (3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e)
Nov 28 08:19:33 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:33 np0005538513.localdomain systemd-rc-local-generator[71711]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:33 np0005538513.localdomain systemd-sysv-generator[71714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:33 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:33 np0005538513.localdomain systemd[1]: Started ovn_controller container.
Nov 28 08:19:33 np0005538513.localdomain sudo[71611]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:34 np0005538513.localdomain sudo[71737]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqhuabmfoczkokopjpnnjovccwlititd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:34 np0005538513.localdomain sudo[71737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:34 np0005538513.localdomain sshd[71740]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:19:34 np0005538513.localdomain python3[71739]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:34 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:19:34 np0005538513.localdomain systemd-sysv-generator[71772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:34 np0005538513.localdomain systemd-rc-local-generator[71769]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:34 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:34 np0005538513.localdomain sshd[71740]: Invalid user sol from 193.32.162.146 port 50642
Nov 28 08:19:34 np0005538513.localdomain systemd[1]: Starting ovn_metadata_agent container...
Nov 28 08:19:34 np0005538513.localdomain systemd[1]: Started ovn_metadata_agent container.
Nov 28 08:19:34 np0005538513.localdomain sshd[71740]: Connection closed by invalid user sol 193.32.162.146 port 50642 [preauth]
Nov 28 08:19:34 np0005538513.localdomain sudo[71737]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:35 np0005538513.localdomain sudo[71821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeouajywqwcybrwylzzjuxykvcknrfty ; /usr/bin/python3
Nov 28 08:19:35 np0005538513.localdomain sudo[71821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:35 np0005538513.localdomain python3[71823]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:35 np0005538513.localdomain sudo[71821]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:35 np0005538513.localdomain sudo[71869]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clnfosmrtbulzvzbhptblbfushrthvcn ; /usr/bin/python3
Nov 28 08:19:35 np0005538513.localdomain sudo[71869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:35 np0005538513.localdomain sudo[71869]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:36 np0005538513.localdomain sudo[71912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdtaoancbzyapjbnbidixxenoxsnlzva ; /usr/bin/python3
Nov 28 08:19:36 np0005538513.localdomain sudo[71912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:36 np0005538513.localdomain sudo[71912]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:36 np0005538513.localdomain sudo[71942]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvxyjyyqiapmhzmumzvsmvcamzmfxadp ; /usr/bin/python3
Nov 28 08:19:36 np0005538513.localdomain sudo[71942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:36 np0005538513.localdomain python3[71944]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005538513 step=4 update_config_hash_only=False
Nov 28 08:19:36 np0005538513.localdomain sudo[71942]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:37 np0005538513.localdomain sudo[71958]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubcdsvblbeyhnryommgsbcgkhschigbl ; /usr/bin/python3
Nov 28 08:19:37 np0005538513.localdomain sudo[71958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:37 np0005538513.localdomain python3[71960]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:37 np0005538513.localdomain sudo[71958]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:37 np0005538513.localdomain sudo[71974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bndjftitjvfgdenaqmonnkxgsnfxhkyv ; /usr/bin/python3
Nov 28 08:19:37 np0005538513.localdomain sudo[71974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:37 np0005538513.localdomain python3[71976]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:19:37 np0005538513.localdomain sudo[71974]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:19:40 np0005538513.localdomain podman[71978]: 2025-11-28 08:19:40.830528867 +0000 UTC m=+0.068982630 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=)
Nov 28 08:19:40 np0005538513.localdomain podman[71978]: 2025-11-28 08:19:40.841969017 +0000 UTC m=+0.080422780 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:19:40 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:19:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:19:41 np0005538513.localdomain podman[71999]: 2025-11-28 08:19:41.871169908 +0000 UTC m=+0.075870166 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:19:42 np0005538513.localdomain podman[71999]: 2025-11-28 08:19:42.042625999 +0000 UTC m=+0.247326297 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true)
Nov 28 08:19:42 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:19:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:19:42 np0005538513.localdomain podman[72029]: 2025-11-28 08:19:42.847726696 +0000 UTC m=+0.083125236 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:19:42 np0005538513.localdomain podman[72029]: 2025-11-28 08:19:42.855940974 +0000 UTC m=+0.091339474 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:19:42 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:19:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:19:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:19:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:19:46 np0005538513.localdomain systemd[1]: tmp-crun.MHizjA.mount: Deactivated successfully.
Nov 28 08:19:46 np0005538513.localdomain podman[72052]: 2025-11-28 08:19:46.87439841 +0000 UTC m=+0.110823246 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:19:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:19:46 np0005538513.localdomain podman[72050]: 2025-11-28 08:19:46.925771715 +0000 UTC m=+0.164822523 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:19:46 np0005538513.localdomain podman[72052]: 2025-11-28 08:19:46.931628489 +0000 UTC m=+0.168053335 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4)
Nov 28 08:19:46 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:19:46 np0005538513.localdomain podman[72050]: 2025-11-28 08:19:46.980488646 +0000 UTC m=+0.219539504 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z)
Nov 28 08:19:46 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:19:47 np0005538513.localdomain podman[72088]: 2025-11-28 08:19:47.065165419 +0000 UTC m=+0.171363800 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, config_id=tripleo_step4)
Nov 28 08:19:47 np0005538513.localdomain podman[72051]: 2025-11-28 08:19:46.988395185 +0000 UTC m=+0.223144208 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public)
Nov 28 08:19:47 np0005538513.localdomain podman[72051]: 2025-11-28 08:19:47.121581322 +0000 UTC m=+0.356330365 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, release=1761123044, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, 
Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:19:47 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:19:47 np0005538513.localdomain podman[72088]: 2025-11-28 08:19:47.460555141 +0000 UTC m=+0.566753512 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Nov 28 08:19:47 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:19:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:19:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:19:50 np0005538513.localdomain podman[72145]: 2025-11-28 08:19:50.847697397 +0000 UTC m=+0.084520939 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:50 np0005538513.localdomain podman[72145]: 2025-11-28 08:19:50.88722753 +0000 UTC m=+0.124051062 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Nov 28 08:19:50 np0005538513.localdomain podman[72146]: 2025-11-28 08:19:50.902113268 +0000 UTC m=+0.133765297 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:19:50 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:19:50 np0005538513.localdomain podman[72146]: 2025-11-28 08:19:50.94732737 +0000 UTC m=+0.178979389 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.12, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:19:50 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:19:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 08:19:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 08:20:07 np0005538513.localdomain sudo[72194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:20:07 np0005538513.localdomain sudo[72194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:20:07 np0005538513.localdomain sudo[72194]: pam_unix(sudo:session): session closed for user root
Nov 28 08:20:07 np0005538513.localdomain sudo[72209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:20:07 np0005538513.localdomain sudo[72209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:20:07 np0005538513.localdomain sudo[72209]: pam_unix(sudo:session): session closed for user root
Nov 28 08:20:08 np0005538513.localdomain sudo[72257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:20:08 np0005538513.localdomain sudo[72257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:20:08 np0005538513.localdomain sudo[72257]: pam_unix(sudo:session): session closed for user root
Nov 28 08:20:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:20:11 np0005538513.localdomain podman[72272]: 2025-11-28 08:20:11.852964427 +0000 UTC m=+0.087294547 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Nov 28 08:20:11 np0005538513.localdomain podman[72272]: 2025-11-28 08:20:11.890360022 +0000 UTC m=+0.124690092 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:20:11 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:20:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:20:12 np0005538513.localdomain systemd[1]: tmp-crun.DJAMnU.mount: Deactivated successfully.
Nov 28 08:20:12 np0005538513.localdomain podman[72290]: 2025-11-28 08:20:12.844952688 +0000 UTC m=+0.079589004 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, version=17.1.12, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:20:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:20:13 np0005538513.localdomain podman[72290]: 2025-11-28 08:20:13.055510719 +0000 UTC m=+0.290146985 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:20:13 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:20:13 np0005538513.localdomain systemd[1]: tmp-crun.ITc6ed.mount: Deactivated successfully.
Nov 28 08:20:13 np0005538513.localdomain podman[72319]: 2025-11-28 08:20:13.134598516 +0000 UTC m=+0.077232300 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, container_name=iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z)
Nov 28 08:20:13 np0005538513.localdomain podman[72319]: 2025-11-28 08:20:13.146345195 +0000 UTC m=+0.088978959 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com)
Nov 28 08:20:13 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:20:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:20:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:20:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:20:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:20:17 np0005538513.localdomain podman[72339]: 2025-11-28 08:20:17.856094217 +0000 UTC m=+0.092631843 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, batch=17.1_20251118.1, 
config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:20:17 np0005538513.localdomain podman[72339]: 2025-11-28 08:20:17.911389636 +0000 UTC m=+0.147927302 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z)
Nov 28 08:20:17 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:20:17 np0005538513.localdomain podman[72341]: 2025-11-28 08:20:17.913512383 +0000 UTC m=+0.142978547 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git)
Nov 28 08:20:17 np0005538513.localdomain podman[72341]: 2025-11-28 08:20:17.996306446 +0000 UTC m=+0.225772610 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:20:18 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:20:18 np0005538513.localdomain podman[72347]: 2025-11-28 08:20:18.009286615 +0000 UTC m=+0.232887614 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:20:18 np0005538513.localdomain podman[72340]: 2025-11-28 08:20:17.966760598 +0000 UTC m=+0.198214304 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:20:18 np0005538513.localdomain podman[72347]: 2025-11-28 08:20:18.052276696 +0000 UTC m=+0.275877715 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:20:18 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:20:18 np0005538513.localdomain podman[72340]: 2025-11-28 08:20:18.369363967 +0000 UTC m=+0.600817643 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, 
com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Nov 28 08:20:18 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:20:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:20:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:20:21 np0005538513.localdomain systemd[1]: tmp-crun.zBS8H1.mount: Deactivated successfully.
Nov 28 08:20:21 np0005538513.localdomain podman[72432]: 2025-11-28 08:20:21.907131777 +0000 UTC m=+0.148664755 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent)
Nov 28 08:20:21 np0005538513.localdomain podman[72433]: 2025-11-28 08:20:21.872765117 +0000 UTC m=+0.111179586 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:20:21 np0005538513.localdomain podman[72432]: 2025-11-28 08:20:21.953400713 +0000 UTC m=+0.194933731 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:20:21 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:20:21 np0005538513.localdomain podman[72433]: 2025-11-28 08:20:21.963238333 +0000 UTC m=+0.201652862 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, container_name=ovn_controller)
Nov 28 08:20:22 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:20:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:20:42 np0005538513.localdomain podman[72477]: 2025-11-28 08:20:42.847215863 +0000 UTC m=+0.087079059 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, config_id=tripleo_step3)
Nov 28 08:20:42 np0005538513.localdomain podman[72477]: 2025-11-28 08:20:42.882228185 +0000 UTC m=+0.122091391 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Nov 28 08:20:42 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:20:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:20:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:20:43 np0005538513.localdomain podman[72497]: 2025-11-28 08:20:43.82506024 +0000 UTC m=+0.058927064 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, 
batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, tcib_managed=true, container_name=iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:20:43 np0005538513.localdomain podman[72497]: 2025-11-28 08:20:43.863404646 +0000 UTC m=+0.097271510 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, 
container_name=iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:20:43 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:20:43 np0005538513.localdomain podman[72498]: 2025-11-28 08:20:43.920482461 +0000 UTC m=+0.147466518 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git)
Nov 28 08:20:44 np0005538513.localdomain podman[72498]: 2025-11-28 08:20:44.109728822 +0000 UTC m=+0.336712859 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:20:44 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:20:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:20:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:20:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:20:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:20:48 np0005538513.localdomain podman[72550]: 2025-11-28 08:20:48.841070674 +0000 UTC m=+0.074132363 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Nov 28 08:20:48 np0005538513.localdomain podman[72550]: 2025-11-28 08:20:48.865541503 +0000 UTC m=+0.098603192 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:20:48 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:20:48 np0005538513.localdomain podman[72547]: 2025-11-28 08:20:48.95639543 +0000 UTC m=+0.194048463 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1761123044)
Nov 28 08:20:48 np0005538513.localdomain podman[72547]: 2025-11-28 08:20:48.986842167 +0000 UTC m=+0.224495230 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:20:48 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:20:49 np0005538513.localdomain podman[72549]: 2025-11-28 08:20:49.055536818 +0000 UTC m=+0.288793643 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-cron-container, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Nov 28 08:20:49 np0005538513.localdomain podman[72548]: 2025-11-28 08:20:49.060621847 +0000 UTC m=+0.295138421 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, architecture=x86_64)
Nov 28 08:20:49 np0005538513.localdomain podman[72549]: 2025-11-28 08:20:49.069443984 +0000 UTC m=+0.302700809 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Nov 28 08:20:49 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:20:49 np0005538513.localdomain podman[72548]: 2025-11-28 08:20:49.420481213 +0000 UTC m=+0.654997797 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Nov 28 08:20:49 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:20:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:20:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:20:52 np0005538513.localdomain podman[72643]: 2025-11-28 08:20:52.836535547 +0000 UTC m=+0.070332643 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:20:52 np0005538513.localdomain podman[72643]: 2025-11-28 08:20:52.857454325 +0000 UTC m=+0.091251471 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:20:52 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:20:52 np0005538513.localdomain podman[72642]: 2025-11-28 08:20:52.949714346 +0000 UTC m=+0.186222507 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:20:53 np0005538513.localdomain podman[72642]: 2025-11-28 08:20:53.01856391 +0000 UTC m=+0.255072081 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4)
Nov 28 08:20:53 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:21:08 np0005538513.localdomain sudo[72690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:21:08 np0005538513.localdomain sudo[72690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:08 np0005538513.localdomain sudo[72690]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:08 np0005538513.localdomain sudo[72705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:21:08 np0005538513.localdomain sudo[72705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:09 np0005538513.localdomain podman[72795]: 2025-11-28 08:21:09.782650558 +0000 UTC m=+0.085330824 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, release=553, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git)
Nov 28 08:21:09 np0005538513.localdomain podman[72795]: 2025-11-28 08:21:09.912540513 +0000 UTC m=+0.215220769 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, name=rhceph, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., release=553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph)
Nov 28 08:21:10 np0005538513.localdomain sudo[72705]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:10 np0005538513.localdomain sudo[72863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:21:10 np0005538513.localdomain sudo[72863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:10 np0005538513.localdomain sudo[72863]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:10 np0005538513.localdomain sudo[72878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:21:10 np0005538513.localdomain sudo[72878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:10 np0005538513.localdomain sudo[72878]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:11 np0005538513.localdomain sudo[72926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:21:11 np0005538513.localdomain sudo[72926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:11 np0005538513.localdomain sudo[72926]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:21:13 np0005538513.localdomain podman[72941]: 2025-11-28 08:21:13.848763343 +0000 UTC m=+0.085637224 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:21:13 np0005538513.localdomain podman[72941]: 2025-11-28 08:21:13.858622953 +0000 UTC m=+0.095496824 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, maintainer=OpenStack TripleO Team)
Nov 28 08:21:13 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:21:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:21:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:21:14 np0005538513.localdomain podman[72964]: 2025-11-28 08:21:14.831906607 +0000 UTC m=+0.069309620 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:21:14 np0005538513.localdomain podman[72963]: 2025-11-28 08:21:14.89211277 +0000 UTC m=+0.130126202 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z)
Nov 28 08:21:14 np0005538513.localdomain podman[72963]: 2025-11-28 08:21:14.92134945 +0000 UTC m=+0.159362912 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:21:14 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:21:15 np0005538513.localdomain podman[72964]: 2025-11-28 08:21:15.008358236 +0000 UTC m=+0.245761269 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, batch=17.1_20251118.1)
Nov 28 08:21:15 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:21:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:21:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:21:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:21:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:21:19 np0005538513.localdomain podman[73012]: 2025-11-28 08:21:19.820083904 +0000 UTC m=+0.059325506 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Nov 28 08:21:19 np0005538513.localdomain systemd[1]: tmp-crun.jRgCjR.mount: Deactivated successfully.
Nov 28 08:21:19 np0005538513.localdomain podman[73013]: 2025-11-28 08:21:19.87020253 +0000 UTC m=+0.100567333 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z)
Nov 28 08:21:19 np0005538513.localdomain podman[73013]: 2025-11-28 08:21:19.905317915 +0000 UTC m=+0.135682688 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:21:19 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:21:19 np0005538513.localdomain podman[73014]: 2025-11-28 08:21:19.91881356 +0000 UTC m=+0.146028664 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:21:19 np0005538513.localdomain podman[73011]: 2025-11-28 08:21:19.965138326 +0000 UTC m=+0.203003564 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, container_name=ceilometer_agent_compute, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 08:21:19 np0005538513.localdomain podman[73014]: 2025-11-28 08:21:19.970479554 +0000 UTC m=+0.197694678 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:21:19 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:21:19 np0005538513.localdomain podman[73011]: 2025-11-28 08:21:19.995405358 +0000 UTC m=+0.233270616 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:21:20 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:21:20 np0005538513.localdomain podman[73012]: 2025-11-28 08:21:20.176350928 +0000 UTC m=+0.415592550 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:21:20 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:21:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:21:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:21:23 np0005538513.localdomain podman[73108]: 2025-11-28 08:21:23.84013082 +0000 UTC m=+0.080166361 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent)
Nov 28 08:21:23 np0005538513.localdomain podman[73108]: 2025-11-28 08:21:23.88401165 +0000 UTC m=+0.124047151 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:21:23 np0005538513.localdomain systemd[1]: tmp-crun.ip9a0P.mount: Deactivated successfully.
Nov 28 08:21:23 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:21:23 np0005538513.localdomain podman[73109]: 2025-11-28 08:21:23.910140582 +0000 UTC m=+0.146372234 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:34:05Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:21:23 np0005538513.localdomain podman[73109]: 2025-11-28 08:21:23.931873195 +0000 UTC m=+0.168104857 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Nov 28 08:21:23 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:21:36 np0005538513.localdomain sshd[73153]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:21:37 np0005538513.localdomain sshd[73153]: Invalid user sol from 193.32.162.146 port 37016
Nov 28 08:21:37 np0005538513.localdomain sshd[73153]: Connection closed by invalid user sol 193.32.162.146 port 37016 [preauth]
Nov 28 08:21:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:21:44 np0005538513.localdomain podman[73155]: 2025-11-28 08:21:44.854192353 +0000 UTC m=+0.089520366 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, container_name=collectd, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:21:44 np0005538513.localdomain podman[73155]: 2025-11-28 08:21:44.890538906 +0000 UTC m=+0.125866979 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:21:44 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:21:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:21:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:21:45 np0005538513.localdomain systemd[1]: tmp-crun.zFaGIv.mount: Deactivated successfully.
Nov 28 08:21:45 np0005538513.localdomain podman[73176]: 2025-11-28 08:21:45.843849522 +0000 UTC m=+0.084112106 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T23:44:13Z, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid)
Nov 28 08:21:45 np0005538513.localdomain podman[73177]: 2025-11-28 08:21:45.868819527 +0000 UTC m=+0.099743458 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public)
Nov 28 08:21:45 np0005538513.localdomain podman[73176]: 2025-11-28 08:21:45.877707006 +0000 UTC m=+0.117969560 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, 
managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:21:45 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:21:46 np0005538513.localdomain podman[73177]: 2025-11-28 08:21:46.048479586 +0000 UTC m=+0.279403477 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Nov 28 08:21:46 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:21:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:21:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:21:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:21:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:21:50 np0005538513.localdomain podman[73223]: 2025-11-28 08:21:50.849101367 +0000 UTC m=+0.085671045 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:21:50 np0005538513.localdomain podman[73222]: 2025-11-28 08:21:50.894439743 +0000 UTC m=+0.133861831 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:21:50 np0005538513.localdomain podman[73224]: 2025-11-28 08:21:50.956799273 +0000 UTC m=+0.188626502 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:21:50 np0005538513.localdomain podman[73228]: 2025-11-28 08:21:50.926710617 +0000 UTC m=+0.155354226 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:21:50 np0005538513.localdomain podman[73224]: 2025-11-28 08:21:50.988653904 +0000 UTC m=+0.220481163 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, version=17.1.12)
Nov 28 08:21:50 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:21:51 np0005538513.localdomain podman[73228]: 2025-11-28 08:21:51.009512481 +0000 UTC m=+0.238156120 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=)
Nov 28 08:21:51 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:21:51 np0005538513.localdomain podman[73222]: 2025-11-28 08:21:51.030662395 +0000 UTC m=+0.270084513 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:21:51 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:21:51 np0005538513.localdomain podman[73223]: 2025-11-28 08:21:51.203826541 +0000 UTC m=+0.440396169 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git)
Nov 28 08:21:51 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:21:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:21:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:21:54 np0005538513.localdomain podman[73317]: 2025-11-28 08:21:54.849090733 +0000 UTC m=+0.086313275 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:21:54 np0005538513.localdomain podman[73317]: 2025-11-28 08:21:54.898351052 +0000 UTC m=+0.135573574 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 28 08:21:54 np0005538513.localdomain systemd[1]: tmp-crun.XhO9qt.mount: Deactivated successfully.
Nov 28 08:21:54 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:21:54 np0005538513.localdomain podman[73316]: 2025-11-28 08:21:54.914448078 +0000 UTC m=+0.153002512 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:21:54 np0005538513.localdomain podman[73316]: 2025-11-28 08:21:54.946338441 +0000 UTC m=+0.184892865 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:21:54 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:22:05 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:22:05 np0005538513.localdomain recover_tripleo_nova_virtqemud[73363]: 61397
Nov 28 08:22:05 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:22:05 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:22:11 np0005538513.localdomain sudo[73364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:22:11 np0005538513.localdomain sudo[73364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:22:11 np0005538513.localdomain sudo[73364]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:11 np0005538513.localdomain sudo[73379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:22:11 np0005538513.localdomain sudo[73379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:22:12 np0005538513.localdomain sudo[73379]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:13 np0005538513.localdomain sudo[73427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:22:13 np0005538513.localdomain sudo[73427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:22:13 np0005538513.localdomain sudo[73427]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:22:15 np0005538513.localdomain podman[73442]: 2025-11-28 08:22:15.836090597 +0000 UTC m=+0.068046280 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:22:15 np0005538513.localdomain podman[73442]: 2025-11-28 08:22:15.84599985 +0000 UTC m=+0.077955533 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, release=1761123044, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:22:15 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:22:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:22:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:22:16 np0005538513.localdomain systemd[1]: tmp-crun.bZPjZB.mount: Deactivated successfully.
Nov 28 08:22:16 np0005538513.localdomain podman[73462]: 2025-11-28 08:22:16.856677789 +0000 UTC m=+0.092243112 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible)
Nov 28 08:22:16 np0005538513.localdomain podman[73462]: 2025-11-28 08:22:16.867688225 +0000 UTC m=+0.103253538 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git)
Nov 28 08:22:16 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:22:16 np0005538513.localdomain podman[73463]: 2025-11-28 08:22:16.831480347 +0000 UTC m=+0.067893556 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:22:17 np0005538513.localdomain podman[73463]: 2025-11-28 08:22:17.04624372 +0000 UTC m=+0.282656859 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:17 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:22:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:22:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:22:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:22:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:22:21 np0005538513.localdomain systemd[1]: tmp-crun.MvLlFc.mount: Deactivated successfully.
Nov 28 08:22:21 np0005538513.localdomain podman[73510]: 2025-11-28 08:22:21.851850578 +0000 UTC m=+0.083756065 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, 
summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:21 np0005538513.localdomain podman[73509]: 2025-11-28 08:22:21.903419649 +0000 UTC m=+0.134273963 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:22:21 np0005538513.localdomain podman[73508]: 2025-11-28 08:22:21.963473187 +0000 UTC m=+0.196970454 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1)
Nov 28 08:22:21 np0005538513.localdomain podman[73511]: 2025-11-28 08:22:21.877244806 +0000 UTC m=+0.100507891 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:22:21 np0005538513.localdomain podman[73510]: 2025-11-28 08:22:21.983523358 +0000 UTC m=+0.215428815 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:32Z, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1)
Nov 28 08:22:21 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:22:22 np0005538513.localdomain podman[73511]: 2025-11-28 08:22:22.00998449 +0000 UTC m=+0.233247585 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=)
Nov 28 08:22:22 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:22:22 np0005538513.localdomain podman[73508]: 2025-11-28 08:22:22.024845317 +0000 UTC m=+0.258342514 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:22 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:22:22 np0005538513.localdomain podman[73509]: 2025-11-28 08:22:22.270708208 +0000 UTC m=+0.501562602 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4)
Nov 28 08:22:22 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:22:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:22:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:22:25 np0005538513.localdomain podman[73604]: 2025-11-28 08:22:25.857810241 +0000 UTC m=+0.086489021 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ovn_controller, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:22:25 np0005538513.localdomain systemd[1]: tmp-crun.lFYal7.mount: Deactivated successfully.
Nov 28 08:22:25 np0005538513.localdomain podman[73603]: 2025-11-28 08:22:25.915985279 +0000 UTC m=+0.145785995 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 28 08:22:25 np0005538513.localdomain podman[73604]: 2025-11-28 08:22:25.936543726 +0000 UTC m=+0.165222546 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 28 08:22:25 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:22:25 np0005538513.localdomain podman[73603]: 2025-11-28 08:22:25.991463212 +0000 UTC m=+0.221263938 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Nov 28 08:22:26 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:22:26 np0005538513.localdomain sudo[73696]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzlopszrvnwycxxcdysiagpcnrvionym ; /usr/bin/python3
Nov 28 08:22:26 np0005538513.localdomain sudo[73696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:26 np0005538513.localdomain python3[73698]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:26 np0005538513.localdomain sudo[73696]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:27 np0005538513.localdomain sudo[73741]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpknsdkaihkhwhdijrvlxybuxraafwvw ; /usr/bin/python3
Nov 28 08:22:27 np0005538513.localdomain sudo[73741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:27 np0005538513.localdomain python3[73743]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318146.518492-114295-47244957008075/source _original_basename=tmpr3j16rg5 follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:27 np0005538513.localdomain sudo[73741]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:28 np0005538513.localdomain sudo[73771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmkwziptjvvgfvkolhnmbnutfxznelxi ; /usr/bin/python3
Nov 28 08:22:28 np0005538513.localdomain sudo[73771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:28 np0005538513.localdomain python3[73773]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:22:28 np0005538513.localdomain sudo[73771]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:28 np0005538513.localdomain sudo[73821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnmowhbrrgwppxjmdrsnvmgqbnthwymg ; /usr/bin/python3
Nov 28 08:22:28 np0005538513.localdomain sudo[73821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:28 np0005538513.localdomain sudo[73821]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:28 np0005538513.localdomain sudo[73839]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jopvfjlfkcljfnixlekbqnyvijjschxa ; /usr/bin/python3
Nov 28 08:22:28 np0005538513.localdomain sudo[73839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:29 np0005538513.localdomain sudo[73839]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:29 np0005538513.localdomain sudo[73943]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svnawvotqebijjwxwtvhugibarpypytp ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318149.3570664-114686-51827400948075/async_wrapper.py 298073833886 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318149.3570664-114686-51827400948075/AnsiballZ_command.py _
Nov 28 08:22:29 np0005538513.localdomain sudo[73943]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:22:29 np0005538513.localdomain ansible-async_wrapper.py[73945]: Invoked with 298073833886 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318149.3570664-114686-51827400948075/AnsiballZ_command.py _
Nov 28 08:22:29 np0005538513.localdomain ansible-async_wrapper.py[73948]: Starting module and watcher
Nov 28 08:22:29 np0005538513.localdomain ansible-async_wrapper.py[73948]: Start watching 73949 (3600)
Nov 28 08:22:29 np0005538513.localdomain ansible-async_wrapper.py[73949]: Start module (73949)
Nov 28 08:22:29 np0005538513.localdomain ansible-async_wrapper.py[73945]: Return async_wrapper task started.
Nov 28 08:22:29 np0005538513.localdomain sudo[73943]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:30 np0005538513.localdomain sudo[73967]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzrrwpddqafkelazzgycchlwhjrkvxov ; /usr/bin/python3
Nov 28 08:22:30 np0005538513.localdomain sudo[73967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:30 np0005538513.localdomain python3[73969]: ansible-ansible.legacy.async_status Invoked with jid=298073833886.73945 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:22:30 np0005538513.localdomain sudo[73967]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:33 np0005538513.localdomain puppet-user[73964]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:22:33 np0005538513.localdomain puppet-user[73964]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:22:33 np0005538513.localdomain puppet-user[73964]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:22:33 np0005538513.localdomain puppet-user[73964]:    (file & line not available)
Nov 28 08:22:33 np0005538513.localdomain puppet-user[73964]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:22:33 np0005538513.localdomain puppet-user[73964]:    (file & line not available)
Nov 28 08:22:33 np0005538513.localdomain puppet-user[73964]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.22 seconds
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Notice: Applied catalog in 0.43 seconds
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Application:
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    Initial environment: production
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    Converged environment: production
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:          Run mode: user
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Changes:
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Events:
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Resources:
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:             Total: 19
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Time:
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:          Schedule: 0.00
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:           Package: 0.00
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:              Exec: 0.01
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:            Augeas: 0.01
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:              File: 0.02
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:           Service: 0.06
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    Config retrieval: 0.28
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    Transaction evaluation: 0.28
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:    Catalog application: 0.43
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:          Last run: 1764318154
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:        Filebucket: 0.00
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:             Total: 0.43
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]: Version:
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:            Config: 1764318153
Nov 28 08:22:34 np0005538513.localdomain puppet-user[73964]:            Puppet: 7.10.0
Nov 28 08:22:34 np0005538513.localdomain ansible-async_wrapper.py[73949]: Module complete (73949)
Nov 28 08:22:34 np0005538513.localdomain ansible-async_wrapper.py[73948]: Done in kid B.
Nov 28 08:22:40 np0005538513.localdomain sudo[74105]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esjmxtgvnzggrsuippeiieocdfidsfad ; /usr/bin/python3
Nov 28 08:22:40 np0005538513.localdomain sudo[74105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:40 np0005538513.localdomain python3[74107]: ansible-ansible.legacy.async_status Invoked with jid=298073833886.73945 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:22:40 np0005538513.localdomain sudo[74105]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:41 np0005538513.localdomain sudo[74121]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqxlxghlwxkjruhnxhavqnhicviqkbfq ; /usr/bin/python3
Nov 28 08:22:41 np0005538513.localdomain sudo[74121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:41 np0005538513.localdomain python3[74123]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:22:41 np0005538513.localdomain sudo[74121]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:41 np0005538513.localdomain sudo[74137]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujdncwnmtlcdxnhpjqibwsszxuhggdqy ; /usr/bin/python3
Nov 28 08:22:41 np0005538513.localdomain sudo[74137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:41 np0005538513.localdomain python3[74139]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:22:41 np0005538513.localdomain sudo[74137]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:42 np0005538513.localdomain sudo[74187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izbgkivhdrrfjqgaolxbbvcfdjteiylt ; /usr/bin/python3
Nov 28 08:22:42 np0005538513.localdomain sudo[74187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:42 np0005538513.localdomain python3[74189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:42 np0005538513.localdomain sudo[74187]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:42 np0005538513.localdomain sudo[74205]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwcunnlqewuwnboivrgczazemlcubopb ; /usr/bin/python3
Nov 28 08:22:42 np0005538513.localdomain sudo[74205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:42 np0005538513.localdomain python3[74207]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpy6ueq7k2 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:22:42 np0005538513.localdomain sudo[74205]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:42 np0005538513.localdomain sudo[74235]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlzbnvukzcyoprtjaabpbmetrjwpkqqn ; /usr/bin/python3
Nov 28 08:22:42 np0005538513.localdomain sudo[74235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:42 np0005538513.localdomain python3[74237]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:42 np0005538513.localdomain sudo[74235]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:43 np0005538513.localdomain sudo[74251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byhhcsrkgvpjlbnwrtjextgaaxdjzxne ; /usr/bin/python3
Nov 28 08:22:43 np0005538513.localdomain sudo[74251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:43 np0005538513.localdomain sudo[74251]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:43 np0005538513.localdomain sudo[74340]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emnjwmcxefrhzjqneggocmqjjwfnnndb ; /usr/bin/python3
Nov 28 08:22:43 np0005538513.localdomain sudo[74340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:43 np0005538513.localdomain python3[74342]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:22:44 np0005538513.localdomain sudo[74340]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:44 np0005538513.localdomain sudo[74359]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emdhfzezcchdejlgdfhpobfgbnmxikfw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:44 np0005538513.localdomain sudo[74359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:44 np0005538513.localdomain python3[74361]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:44 np0005538513.localdomain sudo[74359]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:44 np0005538513.localdomain sudo[74375]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfwwedxaquvhocjzbojyarqfhghzplog ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:44 np0005538513.localdomain sudo[74375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:44 np0005538513.localdomain sudo[74375]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:45 np0005538513.localdomain sudo[74391]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okctpugeoxicoqkggmphsdsshvdnbpws ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:45 np0005538513.localdomain sudo[74391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:45 np0005538513.localdomain python3[74393]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:22:45 np0005538513.localdomain sudo[74391]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:45 np0005538513.localdomain sudo[74441]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxbhhiiuzlyqtqhmnefalpbixiwvoglm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:45 np0005538513.localdomain sudo[74441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:22:45 np0005538513.localdomain podman[74444]: 2025-11-28 08:22:45.988062484 +0000 UTC m=+0.086119268 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Nov 28 08:22:45 np0005538513.localdomain podman[74444]: 2025-11-28 08:22:45.999361939 +0000 UTC m=+0.097418733 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:22:46 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:22:46 np0005538513.localdomain python3[74443]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:46 np0005538513.localdomain sudo[74441]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:46 np0005538513.localdomain sudo[74479]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynuaxxjqoigtjzurogoaxhqtvmzxpemj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:46 np0005538513.localdomain sudo[74479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:46 np0005538513.localdomain python3[74481]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:46 np0005538513.localdomain sudo[74479]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:46 np0005538513.localdomain sudo[74541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umkejeqcogmhmcxnlujntqapozbzhyxm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:46 np0005538513.localdomain sudo[74541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:46 np0005538513.localdomain python3[74543]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:46 np0005538513.localdomain sudo[74541]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:46 np0005538513.localdomain sudo[74559]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmfddghxkakqolznnkyghkdzeztumwhj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:46 np0005538513.localdomain sudo[74559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:46 np0005538513.localdomain python3[74561]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:47 np0005538513.localdomain sudo[74559]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:47 np0005538513.localdomain sudo[74621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fejfhxvziiecaflsjahuskjklpuptskg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:47 np0005538513.localdomain sudo[74621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:22:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:22:47 np0005538513.localdomain podman[74624]: 2025-11-28 08:22:47.47912343 +0000 UTC m=+0.072594114 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=)
Nov 28 08:22:47 np0005538513.localdomain podman[74624]: 2025-11-28 08:22:47.487265415 +0000 UTC m=+0.080736129 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-iscsid, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:22:47 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:22:47 np0005538513.localdomain podman[74625]: 2025-11-28 08:22:47.548776379 +0000 UTC m=+0.138499255 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:22:47 np0005538513.localdomain python3[74623]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:47 np0005538513.localdomain sudo[74621]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:47 np0005538513.localdomain sudo[74685]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwgzzxxzssyuffkbtwqnfffgzgqvjxbc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:47 np0005538513.localdomain sudo[74685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:47 np0005538513.localdomain podman[74625]: 2025-11-28 08:22:47.784328386 +0000 UTC m=+0.374051202 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Nov 28 08:22:47 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:22:47 np0005538513.localdomain python3[74687]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:47 np0005538513.localdomain sudo[74685]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:48 np0005538513.localdomain sudo[74747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvifkvrfkbckljuvgwnfffpebboclmkn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:48 np0005538513.localdomain sudo[74747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:48 np0005538513.localdomain python3[74749]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:48 np0005538513.localdomain sudo[74747]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:48 np0005538513.localdomain sudo[74765]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmeygirmeaxnkskyxyfgxydnsjmykcvs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:48 np0005538513.localdomain sudo[74765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:48 np0005538513.localdomain python3[74767]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:48 np0005538513.localdomain sudo[74765]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:48 np0005538513.localdomain sudo[74795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykcjabqyoxwmhjxducpgyniunbvcpjsi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:48 np0005538513.localdomain sudo[74795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:49 np0005538513.localdomain python3[74797]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:22:49 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:22:49 np0005538513.localdomain systemd-rc-local-generator[74820]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:22:49 np0005538513.localdomain systemd-sysv-generator[74825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:22:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:22:49 np0005538513.localdomain sudo[74795]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:49 np0005538513.localdomain sudo[74881]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-darewjbuplksqrvchbxnoyhrjccrzxtk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:49 np0005538513.localdomain sudo[74881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:49 np0005538513.localdomain python3[74883]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:49 np0005538513.localdomain sudo[74881]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:50 np0005538513.localdomain sudo[74899]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvrrrjqwiqyqfvmildqxvmcbesafnpab ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:50 np0005538513.localdomain sudo[74899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:50 np0005538513.localdomain python3[74901]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:50 np0005538513.localdomain sudo[74899]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:50 np0005538513.localdomain sudo[74961]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypngjyyqozqnhiyztjjmsonbmzqyurhs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:50 np0005538513.localdomain sudo[74961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:50 np0005538513.localdomain python3[74963]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:50 np0005538513.localdomain sudo[74961]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:50 np0005538513.localdomain sudo[74979]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxzshujzdsrzciqqaufvdhefdjrtotip ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:50 np0005538513.localdomain sudo[74979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:50 np0005538513.localdomain python3[74981]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:51 np0005538513.localdomain sudo[74979]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:51 np0005538513.localdomain sudo[75009]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zumijumbnmwubnukwspahijgfsrdjmac ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:51 np0005538513.localdomain sudo[75009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:51 np0005538513.localdomain python3[75011]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:22:51 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:22:51 np0005538513.localdomain systemd-rc-local-generator[75032]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:22:51 np0005538513.localdomain systemd-sysv-generator[75036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:22:51 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:22:51 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:22:51 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:22:51 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:22:51 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:22:51 np0005538513.localdomain sudo[75009]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:52 np0005538513.localdomain sudo[75065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtzpwttmzbskenwllebfvqpfptlbnwzo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:52 np0005538513.localdomain sudo[75065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:22:52 np0005538513.localdomain podman[75067]: 2025-11-28 08:22:52.250207202 +0000 UTC m=+0.084157357 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:52 np0005538513.localdomain podman[75067]: 2025-11-28 08:22:52.303446346 +0000 UTC m=+0.137396501 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:22:52 np0005538513.localdomain podman[75070]: 2025-11-28 08:22:52.315004509 +0000 UTC m=+0.148811330 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1)
Nov 28 08:22:52 np0005538513.localdomain python3[75068]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:22:52 np0005538513.localdomain sudo[75065]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:52 np0005538513.localdomain podman[75069]: 2025-11-28 08:22:52.3532198 +0000 UTC m=+0.187225547 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4)
Nov 28 08:22:52 np0005538513.localdomain podman[75118]: 2025-11-28 08:22:52.404938957 +0000 UTC m=+0.079441839 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 28 08:22:52 np0005538513.localdomain podman[75069]: 2025-11-28 08:22:52.418325708 +0000 UTC m=+0.252331455 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:22:52 np0005538513.localdomain podman[75070]: 2025-11-28 08:22:52.475130424 +0000 UTC m=+0.308937285 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:22:52 np0005538513.localdomain sudo[75170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aifkmkqamzazijjomhmxdryjmjyhbjej ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:52 np0005538513.localdomain sudo[75170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:52 np0005538513.localdomain podman[75118]: 2025-11-28 08:22:52.743275165 +0000 UTC m=+0.417778047 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_migration_target, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:52 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:22:53 np0005538513.localdomain sudo[75170]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:53 np0005538513.localdomain sudo[75213]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywpqklpwyzhgoiqrswyhmlmrtazyoiud ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:53 np0005538513.localdomain sudo[75213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:53 np0005538513.localdomain python3[75215]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:22:54 np0005538513.localdomain podman[75253]: 2025-11-28 08:22:54.095537955 +0000 UTC m=+0.072706877 container create c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step5)
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started libpod-conmon-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope.
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:22:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:54 np0005538513.localdomain podman[75253]: 2025-11-28 08:22:54.055784266 +0000 UTC m=+0.032953248 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:22:54 np0005538513.localdomain podman[75253]: 2025-11-28 08:22:54.180937961 +0000 UTC m=+0.158106993 container init c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:22:54 np0005538513.localdomain sudo[75274]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:22:54 np0005538513.localdomain podman[75253]: 2025-11-28 08:22:54.214190927 +0000 UTC m=+0.191359939 container start c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_compute)
Nov 28 08:22:54 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:22:54 np0005538513.localdomain python3[75215]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Queued start job for default target Main User Target.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Created slice User Application Slice.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Reached target Paths.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Reached target Timers.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Starting D-Bus User Message Bus Socket...
Nov 28 08:22:54 np0005538513.localdomain podman[75275]: 2025-11-28 08:22:54.448573836 +0000 UTC m=+0.223567860 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, container_name=nova_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Starting Create User's Volatile Files and Directories...
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Finished Create User's Volatile Files and Directories.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Listening on D-Bus User Message Bus Socket.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Reached target Sockets.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Reached target Basic System.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Reached target Main User Target.
Nov 28 08:22:54 np0005538513.localdomain systemd[75288]: Startup finished in 131ms.
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started Session c10 of User root.
Nov 28 08:22:54 np0005538513.localdomain sudo[75274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 28 08:22:54 np0005538513.localdomain podman[75275]: 2025-11-28 08:22:54.521508459 +0000 UTC m=+0.296502513 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:22:54 np0005538513.localdomain podman[75275]: unhealthy
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 08:22:54 np0005538513.localdomain sudo[75274]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Nov 28 08:22:54 np0005538513.localdomain podman[75375]: 2025-11-28 08:22:54.676853925 +0000 UTC m=+0.099086627 container create b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', 
'/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5)
Nov 28 08:22:54 np0005538513.localdomain podman[75375]: 2025-11-28 08:22:54.620633787 +0000 UTC m=+0.042866509 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started libpod-conmon-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589.scope.
Nov 28 08:22:54 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:22:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:54 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:54 np0005538513.localdomain podman[75375]: 2025-11-28 08:22:54.7511433 +0000 UTC m=+0.173376002 container init b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_wait_for_compute_service, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Nov 28 08:22:54 np0005538513.localdomain podman[75375]: 2025-11-28 08:22:54.760909608 +0000 UTC m=+0.183142310 container start b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_wait_for_compute_service, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5)
Nov 28 08:22:54 np0005538513.localdomain podman[75375]: 2025-11-28 08:22:54.761152265 +0000 UTC m=+0.183384977 container attach b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, container_name=nova_wait_for_compute_service, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:22:54 np0005538513.localdomain sudo[75394]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:22:54 np0005538513.localdomain sudo[75394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 28 08:22:54 np0005538513.localdomain sudo[75394]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:22:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:22:56 np0005538513.localdomain systemd[1]: tmp-crun.0ulbkM.mount: Deactivated successfully.
Nov 28 08:22:56 np0005538513.localdomain podman[75399]: 2025-11-28 08:22:56.869455379 +0000 UTC m=+0.099636964 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Nov 28 08:22:56 np0005538513.localdomain podman[75398]: 2025-11-28 08:22:56.907271798 +0000 UTC m=+0.136374970 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:22:56 np0005538513.localdomain podman[75399]: 2025-11-28 08:22:56.924375996 +0000 UTC m=+0.154557551 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:22:56 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:22:56 np0005538513.localdomain podman[75398]: 2025-11-28 08:22:56.979583362 +0000 UTC m=+0.208686474 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:56 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Activating special unit Exit the Session...
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Stopped target Main User Target.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Stopped target Basic System.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Stopped target Paths.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Stopped target Sockets.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Stopped target Timers.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Closed D-Bus User Message Bus Socket.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Removed slice User Application Slice.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Reached target Shutdown.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Finished Exit the Session.
Nov 28 08:23:04 np0005538513.localdomain systemd[75288]: Reached target Exit the Session.
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 08:23:04 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 08:23:13 np0005538513.localdomain sudo[75447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:23:13 np0005538513.localdomain sudo[75447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:23:13 np0005538513.localdomain sudo[75447]: pam_unix(sudo:session): session closed for user root
Nov 28 08:23:13 np0005538513.localdomain sudo[75462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:23:13 np0005538513.localdomain sudo[75462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:23:13 np0005538513.localdomain sudo[75462]: pam_unix(sudo:session): session closed for user root
Nov 28 08:23:14 np0005538513.localdomain sudo[75509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:23:14 np0005538513.localdomain sudo[75509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:23:14 np0005538513.localdomain sudo[75509]: pam_unix(sudo:session): session closed for user root
Nov 28 08:23:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:23:16 np0005538513.localdomain podman[75524]: 2025-11-28 08:23:16.847336741 +0000 UTC m=+0.082003429 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Nov 28 08:23:16 np0005538513.localdomain podman[75524]: 2025-11-28 08:23:16.886510543 +0000 UTC m=+0.121177301 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
release=1761123044, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:23:16 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:23:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:23:17 np0005538513.localdomain systemd[1]: tmp-crun.0yiD7S.mount: Deactivated successfully.
Nov 28 08:23:17 np0005538513.localdomain podman[75545]: 2025-11-28 08:23:17.844010631 +0000 UTC m=+0.078701976 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3)
Nov 28 08:23:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:23:17 np0005538513.localdomain podman[75545]: 2025-11-28 08:23:17.859445316 +0000 UTC m=+0.094136651 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true)
Nov 28 08:23:17 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:23:17 np0005538513.localdomain systemd[1]: tmp-crun.JLrCmr.mount: Deactivated successfully.
Nov 28 08:23:17 np0005538513.localdomain podman[75564]: 2025-11-28 08:23:17.948247998 +0000 UTC m=+0.086949905 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step1, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:23:18 np0005538513.localdomain podman[75564]: 2025-11-28 08:23:18.133692439 +0000 UTC m=+0.272394346 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Nov 28 08:23:18 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:23:22 np0005538513.localdomain recover_tripleo_nova_virtqemud[75613]: 61397
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:23:22 np0005538513.localdomain podman[75593]: 2025-11-28 08:23:22.841169703 +0000 UTC m=+0.076842448 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:23:22 np0005538513.localdomain podman[75593]: 2025-11-28 08:23:22.86432691 +0000 UTC m=+0.099999675 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:11:48Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:23:22 np0005538513.localdomain podman[75595]: 2025-11-28 08:23:22.906409344 +0000 UTC m=+0.135726320 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible)
Nov 28 08:23:22 np0005538513.localdomain podman[75594]: 2025-11-28 08:23:22.946570366 +0000 UTC m=+0.177463411 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-cron-container, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, 
url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Nov 28 08:23:22 np0005538513.localdomain podman[75595]: 2025-11-28 08:23:22.959374869 +0000 UTC m=+0.188691815 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:23:22 np0005538513.localdomain podman[75594]: 2025-11-28 08:23:22.979368578 +0000 UTC m=+0.210261713 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.expose-services=, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:23:22 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:23:23 np0005538513.localdomain podman[75612]: 2025-11-28 08:23:23.0615092 +0000 UTC m=+0.278042873 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:23:23 np0005538513.localdomain podman[75612]: 2025-11-28 08:23:23.460695713 +0000 UTC m=+0.677229376 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 08:23:23 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:23:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:23:24 np0005538513.localdomain podman[75687]: 2025-11-28 08:23:24.844453584 +0000 UTC m=+0.077808868 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:23:24 np0005538513.localdomain podman[75687]: 2025-11-28 08:23:24.928406294 +0000 UTC m=+0.161761548 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:23:24 np0005538513.localdomain podman[75687]: unhealthy
Nov 28 08:23:24 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:23:24 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 08:23:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:23:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:23:27 np0005538513.localdomain podman[75710]: 2025-11-28 08:23:27.837099834 +0000 UTC m=+0.072673426 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:23:27 np0005538513.localdomain podman[75710]: 2025-11-28 08:23:27.884825115 +0000 UTC m=+0.120398677 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, batch=17.1_20251118.1)
Nov 28 08:23:27 np0005538513.localdomain systemd[1]: tmp-crun.A0f2h0.mount: Deactivated successfully.
Nov 28 08:23:27 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:23:27 np0005538513.localdomain podman[75709]: 2025-11-28 08:23:27.909821411 +0000 UTC m=+0.147415787 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=)
Nov 28 08:23:27 np0005538513.localdomain podman[75709]: 2025-11-28 08:23:27.955007951 +0000 UTC m=+0.192602327 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:23:27 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:23:37 np0005538513.localdomain sshd[75759]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:23:37 np0005538513.localdomain sshd[75759]: Invalid user sol from 193.32.162.146 port 52002
Nov 28 08:23:37 np0005538513.localdomain sshd[75759]: Connection closed by invalid user sol 193.32.162.146 port 52002 [preauth]
Nov 28 08:23:46 np0005538513.localdomain sshd[35388]: Received disconnect from 192.168.122.100 port 53432:11: disconnected by user
Nov 28 08:23:46 np0005538513.localdomain sshd[35388]: Disconnected from user zuul 192.168.122.100 port 53432
Nov 28 08:23:46 np0005538513.localdomain sshd[35385]: pam_unix(sshd:session): session closed for user zuul
Nov 28 08:23:46 np0005538513.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Nov 28 08:23:46 np0005538513.localdomain systemd[1]: session-27.scope: Consumed 2.979s CPU time.
Nov 28 08:23:46 np0005538513.localdomain systemd-logind[764]: Session 27 logged out. Waiting for processes to exit.
Nov 28 08:23:46 np0005538513.localdomain systemd-logind[764]: Removed session 27.
Nov 28 08:23:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:23:47 np0005538513.localdomain systemd[1]: tmp-crun.wdczUu.mount: Deactivated successfully.
Nov 28 08:23:47 np0005538513.localdomain podman[75761]: 2025-11-28 08:23:47.860050277 +0000 UTC m=+0.093271209 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:23:47 np0005538513.localdomain podman[75761]: 2025-11-28 08:23:47.877074658 +0000 UTC m=+0.110295580 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Nov 28 08:23:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:23:47 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:23:47 np0005538513.localdomain podman[75781]: 2025-11-28 08:23:47.960731689 +0000 UTC m=+0.058683137 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:23:47 np0005538513.localdomain podman[75781]: 2025-11-28 08:23:47.999053449 +0000 UTC m=+0.097004867 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, version=17.1.12)
Nov 28 08:23:48 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:23:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:23:48 np0005538513.localdomain podman[75800]: 2025-11-28 08:23:48.8419099 +0000 UTC m=+0.078250460 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:23:49 np0005538513.localdomain podman[75800]: 2025-11-28 08:23:49.058249312 +0000 UTC m=+0.294589792 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:23:49 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: tmp-crun.CxWm2h.mount: Deactivated successfully.
Nov 28 08:23:53 np0005538513.localdomain podman[75831]: 2025-11-28 08:23:53.872713587 +0000 UTC m=+0.109751042 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:23:53 np0005538513.localdomain podman[75830]: 2025-11-28 08:23:53.830486864 +0000 UTC m=+0.073153978 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:23:53 np0005538513.localdomain podman[75837]: 2025-11-28 08:23:53.858599438 +0000 UTC m=+0.089964192 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-cron-container)
Nov 28 08:23:53 np0005538513.localdomain podman[75830]: 2025-11-28 08:23:53.913559696 +0000 UTC m=+0.156226810 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:23:53 np0005538513.localdomain podman[75837]: 2025-11-28 08:23:53.938103477 +0000 UTC m=+0.169468251 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron)
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:23:53 np0005538513.localdomain podman[75838]: 2025-11-28 08:23:53.946666999 +0000 UTC m=+0.173736497 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4)
Nov 28 08:23:53 np0005538513.localdomain podman[75838]: 2025-11-28 08:23:53.9642978 +0000 UTC m=+0.191367288 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, container_name=ceilometer_agent_ipmi, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Nov 28 08:23:53 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:23:54 np0005538513.localdomain podman[75831]: 2025-11-28 08:23:54.216646527 +0000 UTC m=+0.453683992 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:23:54 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:23:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:23:55 np0005538513.localdomain podman[75923]: 2025-11-28 08:23:55.836709951 +0000 UTC m=+0.072588530 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:23:55 np0005538513.localdomain podman[75923]: 2025-11-28 08:23:55.895698818 +0000 UTC m=+0.131577397 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:23:55 np0005538513.localdomain podman[75923]: unhealthy
Nov 28 08:23:55 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:23:55 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 08:23:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:23:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:23:58 np0005538513.localdomain systemd[1]: tmp-crun.eIJfbb.mount: Deactivated successfully.
Nov 28 08:23:58 np0005538513.localdomain podman[75945]: 2025-11-28 08:23:58.839386135 +0000 UTC m=+0.076079171 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:23:58 np0005538513.localdomain podman[75946]: 2025-11-28 08:23:58.893640941 +0000 UTC m=+0.126717162 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller)
Nov 28 08:23:58 np0005538513.localdomain podman[75945]: 2025-11-28 08:23:58.911563041 +0000 UTC m=+0.148256067 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Nov 28 08:23:58 np0005538513.localdomain podman[75946]: 2025-11-28 08:23:58.92032278 +0000 UTC m=+0.153398971 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team)
Nov 28 08:23:58 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:23:58 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:24:14 np0005538513.localdomain sudo[75992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:24:14 np0005538513.localdomain sudo[75992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:24:14 np0005538513.localdomain sudo[75992]: pam_unix(sudo:session): session closed for user root
Nov 28 08:24:14 np0005538513.localdomain sudo[76007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:24:14 np0005538513.localdomain sudo[76007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:24:15 np0005538513.localdomain sudo[76007]: pam_unix(sudo:session): session closed for user root
Nov 28 08:24:16 np0005538513.localdomain sudo[76053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:24:16 np0005538513.localdomain sudo[76053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:24:16 np0005538513.localdomain sudo[76053]: pam_unix(sudo:session): session closed for user root
Nov 28 08:24:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:24:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:24:18 np0005538513.localdomain systemd[1]: tmp-crun.mkd0p4.mount: Deactivated successfully.
Nov 28 08:24:18 np0005538513.localdomain podman[76068]: 2025-11-28 08:24:18.864765736 +0000 UTC m=+0.097397970 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container)
Nov 28 08:24:18 np0005538513.localdomain podman[76068]: 2025-11-28 08:24:18.880321311 +0000 UTC m=+0.112953555 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid)
Nov 28 08:24:18 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:24:18 np0005538513.localdomain podman[76069]: 2025-11-28 08:24:18.959868781 +0000 UTC m=+0.192929408 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vcs-type=git)
Nov 28 08:24:18 np0005538513.localdomain podman[76069]: 2025-11-28 08:24:18.973368411 +0000 UTC m=+0.206429018 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z)
Nov 28 08:24:18 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:24:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:24:19 np0005538513.localdomain podman[76106]: 2025-11-28 08:24:19.843078585 +0000 UTC m=+0.086331587 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd)
Nov 28 08:24:20 np0005538513.localdomain podman[76106]: 2025-11-28 08:24:20.020246681 +0000 UTC m=+0.263499673 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64)
Nov 28 08:24:20 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:24:24 np0005538513.localdomain recover_tripleo_nova_virtqemud[76156]: 61397
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:24:24 np0005538513.localdomain podman[76135]: 2025-11-28 08:24:24.842696581 +0000 UTC m=+0.075258955 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:24:24 np0005538513.localdomain podman[76135]: 2025-11-28 08:24:24.874317087 +0000 UTC m=+0.106879501 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 28 08:24:24 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:24:25 np0005538513.localdomain podman[76136]: 2025-11-28 08:24:24.999538 +0000 UTC m=+0.229902794 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true)
Nov 28 08:24:25 np0005538513.localdomain podman[76138]: 2025-11-28 08:24:25.030599818 +0000 UTC m=+0.253257796 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:24:25 np0005538513.localdomain podman[76137]: 2025-11-28 08:24:25.069595589 +0000 UTC m=+0.296649678 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, 
config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-cron, vcs-type=git)
Nov 28 08:24:25 np0005538513.localdomain podman[76137]: 2025-11-28 08:24:25.099972745 +0000 UTC m=+0.327026864 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, container_name=logrotate_crond, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:24:25 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:24:25 np0005538513.localdomain podman[76138]: 2025-11-28 08:24:25.108421274 +0000 UTC m=+0.331079252 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:24:25 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:24:25 np0005538513.localdomain podman[76136]: 2025-11-28 08:24:25.362603189 +0000 UTC m=+0.592967983 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team)
Nov 28 08:24:25 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:24:25 np0005538513.localdomain systemd[1]: tmp-crun.IRGTvo.mount: Deactivated successfully.
Nov 28 08:24:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:24:26 np0005538513.localdomain podman[76232]: 2025-11-28 08:24:26.840973785 +0000 UTC m=+0.079004314 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, url=https://www.redhat.com, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:24:26 np0005538513.localdomain podman[76232]: 2025-11-28 08:24:26.900429736 +0000 UTC m=+0.138460255 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:24:26 np0005538513.localdomain podman[76232]: unhealthy
Nov 28 08:24:26 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:24:26 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 08:24:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:24:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:24:29 np0005538513.localdomain podman[76254]: 2025-11-28 08:24:29.83140032 +0000 UTC m=+0.072948731 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:24:29 np0005538513.localdomain podman[76254]: 2025-11-28 08:24:29.875476521 +0000 UTC m=+0.117024912 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12)
Nov 28 08:24:29 np0005538513.localdomain systemd[1]: tmp-crun.0ycmAw.mount: Deactivated successfully.
Nov 28 08:24:29 np0005538513.localdomain podman[76255]: 2025-11-28 08:24:29.893587808 +0000 UTC m=+0.130020097 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Nov 28 08:24:29 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:24:29 np0005538513.localdomain podman[76255]: 2025-11-28 08:24:29.914589766 +0000 UTC m=+0.151022035 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller)
Nov 28 08:24:29 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:24:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:24:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:24:49 np0005538513.localdomain podman[76302]: 2025-11-28 08:24:49.839043793 +0000 UTC m=+0.079407327 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:24:49 np0005538513.localdomain podman[76303]: 2025-11-28 08:24:49.888892999 +0000 UTC m=+0.127039872 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:24:49 np0005538513.localdomain podman[76302]: 2025-11-28 08:24:49.924233283 +0000 UTC m=+0.164596847 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public)
Nov 28 08:24:49 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:24:49 np0005538513.localdomain podman[76303]: 2025-11-28 08:24:49.976269479 +0000 UTC m=+0.214416312 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:24:49 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:24:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:24:50 np0005538513.localdomain podman[76339]: 2025-11-28 08:24:50.848723311 +0000 UTC m=+0.087418842 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:24:51 np0005538513.localdomain podman[76339]: 2025-11-28 08:24:51.028107967 +0000 UTC m=+0.266803568 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:24:51 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:24:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:24:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:24:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:24:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:24:55 np0005538513.localdomain podman[76368]: 2025-11-28 08:24:55.815190832 +0000 UTC m=+0.061556080 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Nov 28 08:24:55 np0005538513.localdomain podman[76368]: 2025-11-28 08:24:55.836301804 +0000 UTC m=+0.082667012 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:24:55 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:24:55 np0005538513.localdomain podman[76375]: 2025-11-28 08:24:55.875123618 +0000 UTC m=+0.111891241 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, architecture=x86_64, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.openshift.expose-services=, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4)
Nov 28 08:24:55 np0005538513.localdomain podman[76375]: 2025-11-28 08:24:55.88651221 +0000 UTC m=+0.123279813 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:24:55 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:24:55 np0005538513.localdomain podman[76369]: 2025-11-28 08:24:55.926164322 +0000 UTC m=+0.167402106 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 28 08:24:55 np0005538513.localdomain podman[76379]: 2025-11-28 08:24:55.989148845 +0000 UTC m=+0.223640585 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:24:56 np0005538513.localdomain podman[76379]: 2025-11-28 08:24:56.043445452 +0000 UTC m=+0.277937122 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:24:56 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:24:56 np0005538513.localdomain podman[76369]: 2025-11-28 08:24:56.26123694 +0000 UTC m=+0.502474694 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:24:56 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:24:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:24:57 np0005538513.localdomain systemd[1]: tmp-crun.kQbXOK.mount: Deactivated successfully.
Nov 28 08:24:57 np0005538513.localdomain podman[76462]: 2025-11-28 08:24:57.849672527 +0000 UTC m=+0.087832025 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute)
Nov 28 08:24:57 np0005538513.localdomain podman[76462]: 2025-11-28 08:24:57.910351757 +0000 UTC m=+0.148511245 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, release=1761123044, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:24:57 np0005538513.localdomain podman[76462]: unhealthy
Nov 28 08:24:57 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:24:57 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 08:25:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:25:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:25:00 np0005538513.localdomain podman[76485]: 2025-11-28 08:25:00.843098936 +0000 UTC m=+0.077124224 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:25:00 np0005538513.localdomain podman[76484]: 2025-11-28 08:25:00.894442369 +0000 UTC m=+0.131592927 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_id=tripleo_step4)
Nov 28 08:25:00 np0005538513.localdomain podman[76485]: 2025-11-28 08:25:00.918375781 +0000 UTC m=+0.152401079 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO 
Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044)
Nov 28 08:25:00 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:25:00 np0005538513.localdomain podman[76484]: 2025-11-28 08:25:00.97400098 +0000 UTC m=+0.211151568 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Nov 28 08:25:00 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:25:16 np0005538513.localdomain sudo[76533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:25:16 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:25:16 np0005538513.localdomain sudo[76533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:25:16 np0005538513.localdomain sudo[76533]: pam_unix(sudo:session): session closed for user root
Nov 28 08:25:16 np0005538513.localdomain recover_tripleo_nova_virtqemud[76549]: 61397
Nov 28 08:25:16 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:25:16 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:25:16 np0005538513.localdomain sudo[76550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:25:16 np0005538513.localdomain sudo[76550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:25:17 np0005538513.localdomain sudo[76550]: pam_unix(sudo:session): session closed for user root
Nov 28 08:25:17 np0005538513.localdomain sudo[76598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:25:17 np0005538513.localdomain sudo[76598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:25:17 np0005538513.localdomain sudo[76598]: pam_unix(sudo:session): session closed for user root
Nov 28 08:25:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:25:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:25:20 np0005538513.localdomain systemd[1]: tmp-crun.1s5MRS.mount: Deactivated successfully.
Nov 28 08:25:20 np0005538513.localdomain podman[76614]: 2025-11-28 08:25:20.917556778 +0000 UTC m=+0.142609967 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible)
Nov 28 08:25:20 np0005538513.localdomain podman[76613]: 2025-11-28 08:25:20.887733519 +0000 UTC m=+0.112680545 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, 
summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:25:20 np0005538513.localdomain podman[76614]: 2025-11-28 08:25:20.953300255 +0000 UTC m=+0.178353434 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Nov 28 08:25:20 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:25:20 np0005538513.localdomain podman[76613]: 2025-11-28 08:25:20.972305279 +0000 UTC m=+0.197252225 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:25:20 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:25:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:25:21 np0005538513.localdomain podman[76650]: 2025-11-28 08:25:21.849451431 +0000 UTC m=+0.086579985 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:25:21 np0005538513.localdomain systemd[1]: tmp-crun.DqiLNp.mount: Deactivated successfully.
Nov 28 08:25:22 np0005538513.localdomain podman[76650]: 2025-11-28 08:25:22.046930763 +0000 UTC m=+0.284059267 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 28 08:25:22 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:25:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:25:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:25:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:25:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:25:26 np0005538513.localdomain systemd[1]: tmp-crun.MU2j7h.mount: Deactivated successfully.
Nov 28 08:25:26 np0005538513.localdomain podman[76678]: 2025-11-28 08:25:26.864067424 +0000 UTC m=+0.099887478 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public)
Nov 28 08:25:26 np0005538513.localdomain podman[76679]: 2025-11-28 08:25:26.895172383 +0000 UTC m=+0.126252307 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:25:26 np0005538513.localdomain podman[76678]: 2025-11-28 08:25:26.900384649 +0000 UTC m=+0.136204683 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 08:25:26 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:25:26 np0005538513.localdomain podman[76680]: 2025-11-28 08:25:26.914427006 +0000 UTC m=+0.143511116 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team)
Nov 28 08:25:26 np0005538513.localdomain podman[76686]: 2025-11-28 08:25:26.962578289 +0000 UTC m=+0.186316239 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:25:26 np0005538513.localdomain podman[76680]: 2025-11-28 08:25:26.977895225 +0000 UTC m=+0.206979365 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, name=rhosp17/openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:25:26 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:25:26 np0005538513.localdomain podman[76686]: 2025-11-28 08:25:26.996823908 +0000 UTC m=+0.220561908 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64)
Nov 28 08:25:27 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:25:27 np0005538513.localdomain podman[76679]: 2025-11-28 08:25:27.264589485 +0000 UTC m=+0.495669459 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git)
Nov 28 08:25:27 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:25:27 np0005538513.localdomain systemd[1]: tmp-crun.CVdkAQ.mount: Deactivated successfully.
Nov 28 08:25:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:25:28 np0005538513.localdomain podman[76768]: 2025-11-28 08:25:28.855416758 +0000 UTC m=+0.089081645 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, release=1761123044, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:25:28 np0005538513.localdomain podman[76768]: 2025-11-28 08:25:28.926450158 +0000 UTC m=+0.160115035 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step5, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:25:28 np0005538513.localdomain podman[76768]: unhealthy
Nov 28 08:25:28 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:25:28 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 08:25:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:25:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:25:31 np0005538513.localdomain podman[76791]: 2025-11-28 08:25:31.830062481 +0000 UTC m=+0.064558884 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ovn_controller, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:25:31 np0005538513.localdomain podman[76791]: 2025-11-28 08:25:31.877247652 +0000 UTC m=+0.111744105 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Nov 28 08:25:31 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:25:31 np0005538513.localdomain podman[76790]: 2025-11-28 08:25:31.897591749 +0000 UTC m=+0.133413324 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent)
Nov 28 08:25:31 np0005538513.localdomain podman[76790]: 2025-11-28 08:25:31.937100026 +0000 UTC m=+0.172921651 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:14:25Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:25:31 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:25:45 np0005538513.localdomain sshd[76838]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:25:45 np0005538513.localdomain sshd[76838]: Invalid user sol from 193.32.162.146 port 38556
Nov 28 08:25:45 np0005538513.localdomain sshd[76838]: Connection closed by invalid user sol 193.32.162.146 port 38556 [preauth]
Nov 28 08:25:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:25:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:25:51 np0005538513.localdomain podman[76840]: 2025-11-28 08:25:51.851190594 +0000 UTC m=+0.087141022 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container)
Nov 28 08:25:51 np0005538513.localdomain podman[76840]: 2025-11-28 08:25:51.88531191 +0000 UTC m=+0.121262308 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:44:13Z)
Nov 28 08:25:51 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:25:51 np0005538513.localdomain podman[76841]: 2025-11-28 08:25:51.90512924 +0000 UTC m=+0.138994213 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vcs-type=git, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container)
Nov 28 08:25:51 np0005538513.localdomain podman[76841]: 2025-11-28 08:25:51.918426383 +0000 UTC m=+0.152291356 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, release=1761123044)
Nov 28 08:25:51 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:25:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:25:52 np0005538513.localdomain podman[76881]: 2025-11-28 08:25:52.828920535 +0000 UTC m=+0.073215449 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 
qdrouterd, config_id=tripleo_step1, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:25:53 np0005538513.localdomain podman[76881]: 2025-11-28 08:25:53.020257152 +0000 UTC m=+0.264551996 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, 
architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Nov 28 08:25:53 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: tmp-crun.Pz9Wsq.mount: Deactivated successfully.
Nov 28 08:25:57 np0005538513.localdomain podman[76937]: 2025-11-28 08:25:57.362462736 +0000 UTC m=+0.087872586 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Nov 28 08:25:57 np0005538513.localdomain podman[76936]: 2025-11-28 08:25:57.336218011 +0000 UTC m=+0.069644346 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, distribution-scope=public, name=rhosp17/openstack-cron, vcs-type=git, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:25:57 np0005538513.localdomain podman[76937]: 2025-11-28 08:25:57.443462152 +0000 UTC m=+0.168872012 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc.)
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:25:57 np0005538513.localdomain podman[76972]: 2025-11-28 08:25:57.446100606 +0000 UTC m=+0.138791636 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4)
Nov 28 08:25:57 np0005538513.localdomain podman[76935]: 2025-11-28 08:25:57.5015481 +0000 UTC m=+0.233230129 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Nov 28 08:25:57 np0005538513.localdomain podman[76936]: 2025-11-28 08:25:57.523309043 +0000 UTC m=+0.256735328 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, tcib_managed=true, container_name=logrotate_crond, release=1761123044, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:25:57 np0005538513.localdomain podman[76935]: 2025-11-28 08:25:57.556325572 +0000 UTC m=+0.288007551 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:25:57 np0005538513.localdomain podman[76972]: 2025-11-28 08:25:57.840399989 +0000 UTC m=+0.533091009 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, 
io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute)
Nov 28 08:25:57 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:25:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:25:59 np0005538513.localdomain podman[77095]: 2025-11-28 08:25:59.841330077 +0000 UTC m=+0.074922124 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:25:59 np0005538513.localdomain podman[77095]: 2025-11-28 08:25:59.873532792 +0000 UTC m=+0.107124799 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:25:59 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:26:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:26:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:26:02 np0005538513.localdomain podman[77124]: 2025-11-28 08:26:02.836324877 +0000 UTC m=+0.072816937 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:26:02 np0005538513.localdomain podman[77124]: 2025-11-28 08:26:02.891893565 +0000 UTC m=+0.128385625 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, 
tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team)
Nov 28 08:26:02 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:26:02 np0005538513.localdomain podman[77125]: 2025-11-28 08:26:02.893262868 +0000 UTC m=+0.127688882 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, 
config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:26:02 np0005538513.localdomain podman[77125]: 2025-11-28 08:26:02.976505476 +0000 UTC m=+0.210931530 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, 
name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller)
Nov 28 08:26:02 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:26:07 np0005538513.localdomain systemd[1]: libpod-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589.scope: Deactivated successfully.
Nov 28 08:26:07 np0005538513.localdomain podman[75375]: 2025-11-28 08:26:07.133983394 +0000 UTC m=+192.556216126 container died b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, summary=Red Hat OpenStack Platform 
17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_wait_for_compute_service, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Nov 28 08:26:07 np0005538513.localdomain systemd[1]: tmp-crun.PhvEsT.mount: Deactivated successfully.
Nov 28 08:26:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589-userdata-shm.mount: Deactivated successfully.
Nov 28 08:26:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3-merged.mount: Deactivated successfully.
Nov 28 08:26:07 np0005538513.localdomain podman[77172]: 2025-11-28 08:26:07.227750637 +0000 UTC m=+0.082610899 container cleanup b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 08:26:07 np0005538513.localdomain systemd[1]: libpod-conmon-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589.scope: Deactivated successfully.
Nov 28 08:26:07 np0005538513.localdomain python3[75215]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:26:07 np0005538513.localdomain sudo[75213]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:07 np0005538513.localdomain sudo[77226]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcgeqypctnfzohrzdxdocutfyieegmbo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:07 np0005538513.localdomain sudo[77226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:07 np0005538513.localdomain python3[77228]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:07 np0005538513.localdomain sudo[77226]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:07 np0005538513.localdomain sudo[77242]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmpzshguwkzareswrbxzqmwwyzpqqpad ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:07 np0005538513.localdomain sudo[77242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:08 np0005538513.localdomain python3[77244]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:26:08 np0005538513.localdomain sudo[77242]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:08 np0005538513.localdomain sudo[77303]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btubwenmwbfsqmxvgsadhlomwrswmloq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:08 np0005538513.localdomain sudo[77303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:08 np0005538513.localdomain python3[77305]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318368.165464-119283-166483390457805/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:08 np0005538513.localdomain sudo[77303]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:08 np0005538513.localdomain sudo[77319]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdxfddmsidckxakyqshkvwdffybjoovk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:08 np0005538513.localdomain sudo[77319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:09 np0005538513.localdomain python3[77321]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:26:09 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:26:09 np0005538513.localdomain systemd-rc-local-generator[77342]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:26:09 np0005538513.localdomain systemd-sysv-generator[77346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:26:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:26:09 np0005538513.localdomain sudo[77319]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:09 np0005538513.localdomain sudo[77371]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcklwoccianlsgeertcllqnyebynsold ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:09 np0005538513.localdomain sudo[77371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:10 np0005538513.localdomain python3[77373]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:26:10 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:26:10 np0005538513.localdomain systemd-rc-local-generator[77400]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:26:10 np0005538513.localdomain systemd-sysv-generator[77403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:26:10 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:26:10 np0005538513.localdomain systemd[1]: Starting nova_compute container...
Nov 28 08:26:10 np0005538513.localdomain tripleo-start-podman-container[77413]: Creating additional drop-in dependency for "nova_compute" (c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6)
Nov 28 08:26:10 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 08:26:10 np0005538513.localdomain systemd-rc-local-generator[77473]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:26:10 np0005538513.localdomain systemd-sysv-generator[77478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:26:10 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:26:11 np0005538513.localdomain systemd[1]: Started nova_compute container.
Nov 28 08:26:11 np0005538513.localdomain sudo[77371]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:11 np0005538513.localdomain sudo[77510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nytsmmaoaeswhqmrtzddditgviavxcmx ; /usr/bin/python3
Nov 28 08:26:11 np0005538513.localdomain sudo[77510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:11 np0005538513.localdomain python3[77512]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:11 np0005538513.localdomain sudo[77510]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:11 np0005538513.localdomain sudo[77558]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjfyetlvyemktbjclmjmpbhoalshtvyr ; /usr/bin/python3
Nov 28 08:26:11 np0005538513.localdomain sudo[77558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:12 np0005538513.localdomain sudo[77558]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:12 np0005538513.localdomain sudo[77601]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xosypwnlyremfhycttubwvihcmodjmew ; /usr/bin/python3
Nov 28 08:26:12 np0005538513.localdomain sudo[77601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:12 np0005538513.localdomain sudo[77601]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:12 np0005538513.localdomain sudo[77631]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdlhkktnihqcamivlfzjpcvxbsmeawdv ; /usr/bin/python3
Nov 28 08:26:12 np0005538513.localdomain sudo[77631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:12 np0005538513.localdomain python3[77633]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005538513 step=5 update_config_hash_only=False
Nov 28 08:26:12 np0005538513.localdomain sudo[77631]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:13 np0005538513.localdomain sudo[77647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbnmkrzmuwhdqsplcegtgfxgpiqubqpi ; /usr/bin/python3
Nov 28 08:26:13 np0005538513.localdomain sudo[77647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:13 np0005538513.localdomain python3[77649]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:13 np0005538513.localdomain sudo[77647]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:13 np0005538513.localdomain sudo[77663]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upejdncpvhbjaaaznmgaesgyykgiatdd ; /usr/bin/python3
Nov 28 08:26:13 np0005538513.localdomain sudo[77663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:13 np0005538513.localdomain python3[77665]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:26:13 np0005538513.localdomain sudo[77663]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:17 np0005538513.localdomain sudo[77666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:26:17 np0005538513.localdomain sudo[77666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:26:17 np0005538513.localdomain sudo[77666]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:18 np0005538513.localdomain sudo[77681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:26:18 np0005538513.localdomain sudo[77681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:26:18 np0005538513.localdomain sudo[77681]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:19 np0005538513.localdomain sudo[77729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:26:19 np0005538513.localdomain sudo[77729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:26:19 np0005538513.localdomain sudo[77729]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:26:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:26:22 np0005538513.localdomain podman[77745]: 2025-11-28 08:26:22.859149265 +0000 UTC m=+0.090881233 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, 
architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:26:22 np0005538513.localdomain podman[77745]: 2025-11-28 08:26:22.873267065 +0000 UTC m=+0.104999022 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:26:22 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:26:22 np0005538513.localdomain systemd[1]: tmp-crun.o4X1h3.mount: Deactivated successfully.
Nov 28 08:26:22 np0005538513.localdomain podman[77744]: 2025-11-28 08:26:22.965821239 +0000 UTC m=+0.198180006 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:26:22 np0005538513.localdomain podman[77744]: 2025-11-28 08:26:22.978601175 +0000 UTC m=+0.210959902 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team)
Nov 28 08:26:22 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:26:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:26:23 np0005538513.localdomain podman[77782]: 2025-11-28 08:26:23.843043543 +0000 UTC m=+0.077766835 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, release=1761123044, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd)
Nov 28 08:26:24 np0005538513.localdomain podman[77782]: 2025-11-28 08:26:24.068446022 +0000 UTC m=+0.303169244 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:26:24 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:26:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:26:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:26:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:26:27 np0005538513.localdomain systemd[1]: tmp-crun.Xej9LQ.mount: Deactivated successfully.
Nov 28 08:26:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:26:27 np0005538513.localdomain systemd[1]: tmp-crun.K0Sh2l.mount: Deactivated successfully.
Nov 28 08:26:27 np0005538513.localdomain podman[77815]: 2025-11-28 08:26:27.905950542 +0000 UTC m=+0.139718575 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:26:27 np0005538513.localdomain podman[77815]: 2025-11-28 08:26:27.94047328 +0000 UTC m=+0.174241273 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, 
container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:26:27 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:26:27 np0005538513.localdomain podman[77813]: 2025-11-28 08:26:27.958325899 +0000 UTC m=+0.193497156 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO 
Team, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:26:27 np0005538513.localdomain podman[77814]: 2025-11-28 08:26:27.877342172 +0000 UTC m=+0.111918581 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:26:28 np0005538513.localdomain podman[77814]: 2025-11-28 08:26:28.011645925 +0000 UTC m=+0.246222334 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron)
Nov 28 08:26:28 np0005538513.localdomain podman[77813]: 2025-11-28 08:26:28.018443881 +0000 UTC m=+0.253615148 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Nov 28 08:26:28 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:26:28 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:26:28 np0005538513.localdomain podman[77854]: 2025-11-28 08:26:28.023347977 +0000 UTC m=+0.135786080 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:26:28 np0005538513.localdomain podman[77854]: 2025-11-28 08:26:28.397435897 +0000 UTC m=+0.509874020 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, container_name=nova_migration_target, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:26:28 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:26:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:26:30 np0005538513.localdomain systemd[1]: tmp-crun.Ez8SSj.mount: Deactivated successfully.
Nov 28 08:26:30 np0005538513.localdomain podman[77904]: 2025-11-28 08:26:30.853630157 +0000 UTC m=+0.091458231 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:26:30 np0005538513.localdomain podman[77904]: 2025-11-28 08:26:30.880219103 +0000 UTC m=+0.118047197 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, batch=17.1_20251118.1)
Nov 28 08:26:30 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:26:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:26:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:26:33 np0005538513.localdomain podman[77929]: 2025-11-28 08:26:33.844931509 +0000 UTC m=+0.082953439 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:26:33 np0005538513.localdomain podman[77929]: 2025-11-28 08:26:33.893807074 +0000 UTC m=+0.131829044 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_controller)
Nov 28 08:26:33 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:26:33 np0005538513.localdomain podman[77928]: 2025-11-28 08:26:33.894642771 +0000 UTC m=+0.135414979 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:26:33 np0005538513.localdomain podman[77928]: 2025-11-28 08:26:33.979536131 +0000 UTC m=+0.220308369 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:26:33 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:26:41 np0005538513.localdomain sshd[77976]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:26:41 np0005538513.localdomain sshd[77976]: Accepted publickey for zuul from 192.168.122.100 port 55934 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 08:26:41 np0005538513.localdomain systemd-logind[764]: New session 33 of user zuul.
Nov 28 08:26:41 np0005538513.localdomain systemd[1]: Started Session 33 of User zuul.
Nov 28 08:26:41 np0005538513.localdomain sshd[77976]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 08:26:41 np0005538513.localdomain sudo[78083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mflnuiafjnzppoekolznkborhtuneifz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764318401.4056191-41201-66752644032899/AnsiballZ_setup.py
Nov 28 08:26:41 np0005538513.localdomain sudo[78083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 08:26:42 np0005538513.localdomain python3[78085]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 08:26:44 np0005538513.localdomain sudo[78083]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:49 np0005538513.localdomain sudo[78346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psywissynyconolznncjmexmsftcduij ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764318409.08217-41291-263343051363899/AnsiballZ_dnf.py
Nov 28 08:26:49 np0005538513.localdomain sudo[78346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 08:26:49 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:26:49 np0005538513.localdomain recover_tripleo_nova_virtqemud[78350]: 61397
Nov 28 08:26:49 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:26:49 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:26:49 np0005538513.localdomain python3[78348]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Nov 28 08:26:52 np0005538513.localdomain sudo[78346]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:26:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:26:53 np0005538513.localdomain podman[78368]: 2025-11-28 08:26:53.858312386 +0000 UTC m=+0.094098783 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z)
Nov 28 08:26:53 np0005538513.localdomain podman[78368]: 2025-11-28 08:26:53.895476648 +0000 UTC m=+0.131263105 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3)
Nov 28 08:26:53 np0005538513.localdomain systemd[1]: tmp-crun.5gbyYP.mount: Deactivated successfully.
Nov 28 08:26:53 np0005538513.localdomain podman[78367]: 2025-11-28 08:26:53.919304637 +0000 UTC m=+0.157166801 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team)
Nov 28 08:26:53 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:26:53 np0005538513.localdomain podman[78367]: 2025-11-28 08:26:53.958360529 +0000 UTC m=+0.196222713 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:26:53 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:26:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:26:54 np0005538513.localdomain podman[78407]: 2025-11-28 08:26:54.840002974 +0000 UTC m=+0.079126179 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc.)
Nov 28 08:26:55 np0005538513.localdomain podman[78407]: 2025-11-28 08:26:55.05648047 +0000 UTC m=+0.295603645 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z)
Nov 28 08:26:55 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:26:56 np0005538513.localdomain sudo[78510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujqgqjcokmceqegtrgcugarimoqgfcqs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764318416.126637-41344-60698342007452/AnsiballZ_iptables.py
Nov 28 08:26:56 np0005538513.localdomain sudo[78510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 08:26:56 np0005538513.localdomain python3[78512]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Nov 28 08:26:56 np0005538513.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 28 08:26:56 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Nov 28 08:26:56 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 08:26:56 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 08:26:56 np0005538513.localdomain sudo[78510]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:56 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 08:26:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:26:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:26:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:26:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:26:58 np0005538513.localdomain podman[78583]: 2025-11-28 08:26:58.858661595 +0000 UTC m=+0.085492960 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, description=Red Hat 
OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:26:58 np0005538513.localdomain podman[78583]: 2025-11-28 08:26:58.898515333 +0000 UTC m=+0.125346698 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.12)
Nov 28 08:26:58 np0005538513.localdomain systemd[1]: tmp-crun.CjOJH2.mount: Deactivated successfully.
Nov 28 08:26:58 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:26:58 np0005538513.localdomain podman[78581]: 2025-11-28 08:26:58.927478675 +0000 UTC m=+0.156615823 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 
17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com)
Nov 28 08:26:58 np0005538513.localdomain podman[78581]: 2025-11-28 08:26:58.961394013 +0000 UTC m=+0.190531171 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 28 08:26:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:26:58 np0005538513.localdomain podman[78584]: 2025-11-28 08:26:58.977654761 +0000 UTC m=+0.199071964 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:26:59 np0005538513.localdomain podman[78582]: 2025-11-28 08:26:59.062900422 +0000 UTC m=+0.291699400 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git)
Nov 28 08:26:59 np0005538513.localdomain podman[78584]: 2025-11-28 08:26:59.087385011 +0000 UTC m=+0.308802214 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:26:59 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:26:59 np0005538513.localdomain podman[78582]: 2025-11-28 08:26:59.467540483 +0000 UTC m=+0.696339521 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, container_name=nova_migration_target, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:26:59 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:27:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:27:01 np0005538513.localdomain systemd[1]: tmp-crun.OB6UTm.mount: Deactivated successfully.
Nov 28 08:27:01 np0005538513.localdomain podman[78678]: 2025-11-28 08:27:01.862176705 +0000 UTC m=+0.093908079 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, tcib_managed=true, config_id=tripleo_step5, io.openshift.expose-services=, container_name=nova_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:27:01 np0005538513.localdomain podman[78678]: 2025-11-28 08:27:01.889476044 +0000 UTC m=+0.121207478 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:27:01 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:27:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:27:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:27:04 np0005538513.localdomain systemd[1]: tmp-crun.9LEEbZ.mount: Deactivated successfully.
Nov 28 08:27:04 np0005538513.localdomain podman[78704]: 2025-11-28 08:27:04.845594736 +0000 UTC m=+0.086501624 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:27:04 np0005538513.localdomain podman[78704]: 2025-11-28 08:27:04.891975632 +0000 UTC m=+0.132882590 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:27:04 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:27:04 np0005538513.localdomain podman[78705]: 2025-11-28 08:27:04.89975031 +0000 UTC m=+0.137550898 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:27:04 np0005538513.localdomain podman[78705]: 2025-11-28 08:27:04.982363637 +0000 UTC m=+0.220164215 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ovn_controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:27:04 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:27:05 np0005538513.localdomain systemd[1]: tmp-crun.geDpMV.mount: Deactivated successfully.
Nov 28 08:27:19 np0005538513.localdomain sudo[78752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:27:19 np0005538513.localdomain sudo[78752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:27:19 np0005538513.localdomain sudo[78752]: pam_unix(sudo:session): session closed for user root
Nov 28 08:27:19 np0005538513.localdomain sudo[78767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:27:19 np0005538513.localdomain sudo[78767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:27:20 np0005538513.localdomain sudo[78767]: pam_unix(sudo:session): session closed for user root
Nov 28 08:27:20 np0005538513.localdomain sudo[78814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:27:20 np0005538513.localdomain sudo[78814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:27:20 np0005538513.localdomain sudo[78814]: pam_unix(sudo:session): session closed for user root
Nov 28 08:27:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:27:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:27:24 np0005538513.localdomain systemd[1]: tmp-crun.ALBu8f.mount: Deactivated successfully.
Nov 28 08:27:24 np0005538513.localdomain podman[78829]: 2025-11-28 08:27:24.8937444 +0000 UTC m=+0.128890001 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 08:27:24 np0005538513.localdomain podman[78829]: 2025-11-28 08:27:24.906883027 +0000 UTC m=+0.142028598 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, release=1761123044, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:27:24 np0005538513.localdomain podman[78830]: 2025-11-28 08:27:24.863165207 +0000 UTC m=+0.096902994 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Nov 28 08:27:24 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:27:24 np0005538513.localdomain podman[78830]: 2025-11-28 08:27:24.948405638 +0000 UTC m=+0.182143435 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:27:24 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:27:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:27:25 np0005538513.localdomain podman[78870]: 2025-11-28 08:27:25.825158737 +0000 UTC m=+0.067579901 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, release=1761123044, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:27:26 np0005538513.localdomain podman[78870]: 2025-11-28 08:27:26.034750564 +0000 UTC m=+0.277171698 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:27:26 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:27:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:27:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:27:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:27:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:27:29 np0005538513.localdomain systemd[1]: tmp-crun.G7ee6b.mount: Deactivated successfully.
Nov 28 08:27:29 np0005538513.localdomain podman[78899]: 2025-11-28 08:27:29.863408423 +0000 UTC m=+0.096972376 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 28 08:27:29 np0005538513.localdomain podman[78900]: 2025-11-28 08:27:29.91240115 +0000 UTC m=+0.140146208 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 08:27:29 np0005538513.localdomain podman[78899]: 2025-11-28 08:27:29.921605844 +0000 UTC m=+0.155169777 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:27:29 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:27:30 np0005538513.localdomain podman[78901]: 2025-11-28 08:27:30.014396856 +0000 UTC m=+0.237954541 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron)
Nov 28 08:27:30 np0005538513.localdomain podman[78901]: 2025-11-28 08:27:30.025345784 +0000 UTC m=+0.248903469 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:27:30 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:27:30 np0005538513.localdomain podman[78907]: 2025-11-28 08:27:30.109141489 +0000 UTC m=+0.328948605 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible)
Nov 28 08:27:30 np0005538513.localdomain podman[78907]: 2025-11-28 08:27:30.161328659 +0000 UTC m=+0.381135755 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:27:30 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:27:30 np0005538513.localdomain podman[78900]: 2025-11-28 08:27:30.31224257 +0000 UTC m=+0.539987608 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:27:30 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:27:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:27:32 np0005538513.localdomain podman[78995]: 2025-11-28 08:27:32.846312197 +0000 UTC m=+0.079982756 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:27:32 np0005538513.localdomain podman[78995]: 2025-11-28 08:27:32.87281448 +0000 UTC m=+0.106484979 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:27:32 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:27:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:27:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:27:35 np0005538513.localdomain systemd[1]: tmp-crun.8uCkYX.mount: Deactivated successfully.
Nov 28 08:27:35 np0005538513.localdomain systemd[1]: tmp-crun.Onfx5b.mount: Deactivated successfully.
Nov 28 08:27:35 np0005538513.localdomain podman[79023]: 2025-11-28 08:27:35.863868425 +0000 UTC m=+0.091233224 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, architecture=x86_64)
Nov 28 08:27:35 np0005538513.localdomain podman[79022]: 2025-11-28 08:27:35.830315538 +0000 UTC m=+0.067096726 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 28 08:27:35 np0005538513.localdomain podman[79023]: 2025-11-28 08:27:35.910478197 +0000 UTC m=+0.137843006 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, 
managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com)
Nov 28 08:27:35 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:27:35 np0005538513.localdomain podman[79022]: 2025-11-28 08:27:35.966132357 +0000 UTC m=+0.202913535 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, release=1761123044)
Nov 28 08:27:35 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:27:50 np0005538513.localdomain sshd[79069]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:27:50 np0005538513.localdomain sshd[79069]: Invalid user solana from 193.32.162.146 port 53332
Nov 28 08:27:50 np0005538513.localdomain sshd[79069]: Connection closed by invalid user solana 193.32.162.146 port 53332 [preauth]
Nov 28 08:27:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:27:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:27:55 np0005538513.localdomain systemd[1]: tmp-crun.l3Zpt7.mount: Deactivated successfully.
Nov 28 08:27:55 np0005538513.localdomain podman[79071]: 2025-11-28 08:27:55.865652023 +0000 UTC m=+0.101571313 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Nov 28 08:27:55 np0005538513.localdomain podman[79071]: 2025-11-28 08:27:55.902622679 +0000 UTC m=+0.138542019 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, architecture=x86_64)
Nov 28 08:27:55 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:27:55 np0005538513.localdomain podman[79072]: 2025-11-28 08:27:55.956332307 +0000 UTC m=+0.190363226 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public)
Nov 28 08:27:55 np0005538513.localdomain podman[79072]: 2025-11-28 08:27:55.994386098 +0000 UTC m=+0.228417027 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:27:56 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:27:56 np0005538513.localdomain sshd[77979]: Received disconnect from 192.168.122.100 port 55934:11: disconnected by user
Nov 28 08:27:56 np0005538513.localdomain sshd[77979]: Disconnected from user zuul 192.168.122.100 port 55934
Nov 28 08:27:56 np0005538513.localdomain sshd[77976]: pam_unix(sshd:session): session closed for user zuul
Nov 28 08:27:56 np0005538513.localdomain systemd[1]: session-33.scope: Deactivated successfully.
Nov 28 08:27:56 np0005538513.localdomain systemd[1]: session-33.scope: Consumed 5.529s CPU time.
Nov 28 08:27:56 np0005538513.localdomain systemd-logind[764]: Session 33 logged out. Waiting for processes to exit.
Nov 28 08:27:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:27:56 np0005538513.localdomain systemd-logind[764]: Removed session 33.
Nov 28 08:27:56 np0005538513.localdomain podman[79111]: 2025-11-28 08:27:56.435461478 +0000 UTC m=+0.077990483 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible)
Nov 28 08:27:56 np0005538513.localdomain podman[79111]: 2025-11-28 08:27:56.622409395 +0000 UTC m=+0.264938410 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:27:56 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:28:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:28:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:28:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:28:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:28:00 np0005538513.localdomain podman[79187]: 2025-11-28 08:28:00.837783764 +0000 UTC m=+0.066049513 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, url=https://www.redhat.com, release=1761123044, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:32Z, distribution-scope=public, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 28 08:28:00 np0005538513.localdomain podman[79193]: 2025-11-28 08:28:00.857162849 +0000 UTC m=+0.076572726 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:28:00 np0005538513.localdomain podman[79186]: 2025-11-28 08:28:00.90241553 +0000 UTC m=+0.129722609 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute)
Nov 28 08:28:00 np0005538513.localdomain podman[79193]: 2025-11-28 08:28:00.912443708 +0000 UTC m=+0.131853585 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:28:00 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:28:00 np0005538513.localdomain podman[79185]: 2025-11-28 08:28:00.964856705 +0000 UTC m=+0.196472660 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, 
name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:28:00 np0005538513.localdomain podman[79187]: 2025-11-28 08:28:00.978866431 +0000 UTC m=+0.207132210 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git)
Nov 28 08:28:00 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:28:00 np0005538513.localdomain podman[79185]: 2025-11-28 08:28:00.996374888 +0000 UTC m=+0.227990843 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:11:48Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:28:01 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:28:01 np0005538513.localdomain podman[79186]: 2025-11-28 08:28:01.317517394 +0000 UTC m=+0.544824503 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git)
Nov 28 08:28:01 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:28:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:28:03 np0005538513.localdomain podman[79277]: 2025-11-28 08:28:03.84112123 +0000 UTC m=+0.078278508 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Nov 28 08:28:03 np0005538513.localdomain podman[79277]: 2025-11-28 08:28:03.871247408 +0000 UTC m=+0.108404616 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, version=17.1.12, container_name=nova_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:28:03 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:28:05 np0005538513.localdomain sshd[79303]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:28:05 np0005538513.localdomain sshd[79303]: Accepted publickey for zuul from 38.102.83.114 port 46086 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 08:28:05 np0005538513.localdomain systemd-logind[764]: New session 34 of user zuul.
Nov 28 08:28:05 np0005538513.localdomain systemd[1]: Started Session 34 of User zuul.
Nov 28 08:28:05 np0005538513.localdomain sshd[79303]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 08:28:05 np0005538513.localdomain sudo[79320]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsemedaposgmqpreupcmyepiffmynndp ; /usr/bin/python3
Nov 28 08:28:05 np0005538513.localdomain sudo[79320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:28:05 np0005538513.localdomain python3[79322]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 08:28:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:28:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:28:06 np0005538513.localdomain podman[79325]: 2025-11-28 08:28:06.841735906 +0000 UTC m=+0.082819697 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, release=1761123044, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ovn_controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:28:06 np0005538513.localdomain podman[79325]: 2025-11-28 08:28:06.867392178 +0000 UTC m=+0.108475969 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z)
Nov 28 08:28:06 np0005538513.localdomain podman[79324]: 2025-11-28 08:28:06.890388149 +0000 UTC m=+0.132105619 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:28:06 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:28:06 np0005538513.localdomain podman[79324]: 2025-11-28 08:28:06.941377884 +0000 UTC m=+0.183095354 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git)
Nov 28 08:28:06 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:28:08 np0005538513.localdomain sudo[79320]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:21 np0005538513.localdomain sudo[79369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:28:21 np0005538513.localdomain sudo[79369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:28:21 np0005538513.localdomain sudo[79369]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:21 np0005538513.localdomain sudo[79384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:28:21 np0005538513.localdomain sudo[79384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:28:21 np0005538513.localdomain sudo[79384]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:24 np0005538513.localdomain sudo[79432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:28:24 np0005538513.localdomain sudo[79432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:28:24 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:28:24 np0005538513.localdomain sudo[79432]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:24 np0005538513.localdomain recover_tripleo_nova_virtqemud[79448]: 61397
Nov 28 08:28:24 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:28:24 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:28:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:28:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:28:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:28:26 np0005538513.localdomain podman[79450]: 2025-11-28 08:28:26.879077362 +0000 UTC m=+0.075097630 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, 
container_name=collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:28:26 np0005538513.localdomain systemd[1]: tmp-crun.bubrRw.mount: Deactivated successfully.
Nov 28 08:28:26 np0005538513.localdomain podman[79451]: 2025-11-28 08:28:26.947876331 +0000 UTC m=+0.142122516 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:28:26 np0005538513.localdomain podman[79449]: 2025-11-28 08:28:26.911341086 +0000 UTC m=+0.112835571 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Nov 28 08:28:26 np0005538513.localdomain podman[79450]: 2025-11-28 08:28:26.964688433 +0000 UTC m=+0.160708701 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 28 08:28:26 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:28:26 np0005538513.localdomain podman[79449]: 2025-11-28 08:28:26.99539004 +0000 UTC m=+0.196884455 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:28:27 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:28:27 np0005538513.localdomain podman[79451]: 2025-11-28 08:28:27.16527811 +0000 UTC m=+0.359524355 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4)
Nov 28 08:28:27 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:28:30 np0005538513.localdomain sudo[79529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhnatxfjpnqkwiopcwrxupvnrcixgemy ; /usr/bin/python3
Nov 28 08:28:30 np0005538513.localdomain sudo[79529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:28:30 np0005538513.localdomain python3[79531]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 08:28:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:28:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:28:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:28:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:28:31 np0005538513.localdomain systemd[1]: tmp-crun.x95Rv2.mount: Deactivated successfully.
Nov 28 08:28:31 np0005538513.localdomain podman[79543]: 2025-11-28 08:28:31.888222849 +0000 UTC m=+0.108715346 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Nov 28 08:28:31 np0005538513.localdomain podman[79543]: 2025-11-28 08:28:31.934263723 +0000 UTC m=+0.154756240 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:28:31 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:28:31 np0005538513.localdomain podman[79534]: 2025-11-28 08:28:31.935662235 +0000 UTC m=+0.163032742 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 08:28:32 np0005538513.localdomain podman[79535]: 2025-11-28 08:28:32.035874802 +0000 UTC m=+0.258206355 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, name=rhosp17/openstack-cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:28:32 np0005538513.localdomain podman[79533]: 2025-11-28 08:28:31.990484128 +0000 UTC m=+0.218862546 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public)
Nov 28 08:28:32 np0005538513.localdomain podman[79533]: 2025-11-28 08:28:32.069373683 +0000 UTC m=+0.297752101 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, release=1761123044, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:28:32 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:28:32 np0005538513.localdomain podman[79535]: 2025-11-28 08:28:32.094745147 +0000 UTC m=+0.317076680 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:28:32 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:28:32 np0005538513.localdomain podman[79534]: 2025-11-28 08:28:32.277355686 +0000 UTC m=+0.504726283 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z)
Nov 28 08:28:32 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: tmp-crun.clpOO2.mount: Deactivated successfully.
Nov 28 08:28:34 np0005538513.localdomain podman[79637]: 2025-11-28 08:28:34.155206932 +0000 UTC m=+0.118773954 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_compute, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 08:28:34 np0005538513.localdomain podman[79637]: 2025-11-28 08:28:34.181596926 +0000 UTC m=+0.145163898 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, version=17.1.12, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: run-r232545fe677e499cba963d77f85a7852.service: Deactivated successfully.
Nov 28 08:28:34 np0005538513.localdomain systemd[1]: run-r8502e199661141c28b05d34d678c42bb.service: Deactivated successfully.
Nov 28 08:28:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:28:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4541 writes, 20K keys, 4541 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4541 writes, 459 syncs, 9.89 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:28:34 np0005538513.localdomain sudo[79529]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:28:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:28:37 np0005538513.localdomain podman[79807]: 2025-11-28 08:28:37.846925902 +0000 UTC m=+0.083037943 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:28:37 np0005538513.localdomain podman[79808]: 2025-11-28 08:28:37.901053563 +0000 UTC m=+0.135972768 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, architecture=x86_64)
Nov 28 08:28:37 np0005538513.localdomain podman[79807]: 2025-11-28 08:28:37.920439434 +0000 UTC m=+0.156551495 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:28:37 np0005538513.localdomain podman[79808]: 2025-11-28 08:28:37.929330845 +0000 UTC m=+0.164250090 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:28:37 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:28:37 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:28:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:28:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.3 total, 600.0 interval
                                                          Cumulative writes: 5030 writes, 22K keys, 5030 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5030 writes, 563 syncs, 8.93 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:28:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:28:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:28:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:28:57 np0005538513.localdomain podman[79860]: 2025-11-28 08:28:57.846258922 +0000 UTC m=+0.074788273 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:28:57 np0005538513.localdomain systemd[1]: tmp-crun.A7H2oI.mount: Deactivated successfully.
Nov 28 08:28:57 np0005538513.localdomain podman[79858]: 2025-11-28 08:28:57.945588091 +0000 UTC m=+0.180167056 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, container_name=iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:28:57 np0005538513.localdomain podman[79859]: 2025-11-28 08:28:57.924684254 +0000 UTC m=+0.156877506 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:28:57 np0005538513.localdomain podman[79858]: 2025-11-28 08:28:57.981403793 +0000 UTC m=+0.215982758 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, 
managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, container_name=iscsid)
Nov 28 08:28:57 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:28:58 np0005538513.localdomain podman[79859]: 2025-11-28 08:28:58.007318593 +0000 UTC m=+0.239511825 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container)
Nov 28 08:28:58 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:28:58 np0005538513.localdomain podman[79860]: 2025-11-28 08:28:58.092453729 +0000 UTC m=+0.320983070 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc.)
Nov 28 08:28:58 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:28:58 np0005538513.localdomain systemd[1]: tmp-crun.F2wSEb.mount: Deactivated successfully.
Nov 28 08:29:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:29:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:29:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:29:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:29:02 np0005538513.localdomain podman[79971]: 2025-11-28 08:29:02.866146765 +0000 UTC m=+0.098071432 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:29:02 np0005538513.localdomain podman[79972]: 2025-11-28 08:29:02.910767145 +0000 UTC m=+0.140676470 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, release=1761123044, tcib_managed=true, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:29:02 np0005538513.localdomain podman[79972]: 2025-11-28 08:29:02.919142501 +0000 UTC m=+0.149051826 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron, container_name=logrotate_crond, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container)
Nov 28 08:29:02 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:29:02 np0005538513.localdomain podman[79970]: 2025-11-28 08:29:02.971162028 +0000 UTC m=+0.203831808 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com)
Nov 28 08:29:03 np0005538513.localdomain systemd[1]: tmp-crun.9vgNBG.mount: Deactivated successfully.
Nov 28 08:29:03 np0005538513.localdomain podman[79973]: 2025-11-28 08:29:03.019571564 +0000 UTC m=+0.244832237 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:29:03 np0005538513.localdomain podman[79970]: 2025-11-28 08:29:03.033506129 +0000 UTC m=+0.266175929 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:29:03 np0005538513.localdomain podman[79973]: 2025-11-28 08:29:03.050360112 +0000 UTC m=+0.275620835 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 28 08:29:03 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:29:03 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:29:03 np0005538513.localdomain podman[79971]: 2025-11-28 08:29:03.218333455 +0000 UTC m=+0.450258072 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:29:03 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:29:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:29:04 np0005538513.localdomain podman[80063]: 2025-11-28 08:29:04.868372053 +0000 UTC m=+0.102970611 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, distribution-scope=public, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, container_name=nova_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:29:04 np0005538513.localdomain podman[80063]: 2025-11-28 08:29:04.900447132 +0000 UTC m=+0.135045670 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:29:04 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:29:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:29:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:29:08 np0005538513.localdomain systemd[1]: tmp-crun.PImKxF.mount: Deactivated successfully.
Nov 28 08:29:08 np0005538513.localdomain podman[80090]: 2025-11-28 08:29:08.8474857 +0000 UTC m=+0.085356205 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64)
Nov 28 08:29:08 np0005538513.localdomain podman[80091]: 2025-11-28 08:29:08.897098442 +0000 UTC m=+0.131036857 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., 
architecture=x86_64, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:29:08 np0005538513.localdomain podman[80090]: 2025-11-28 08:29:08.900596849 +0000 UTC m=+0.138467364 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:29:08 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:29:08 np0005538513.localdomain podman[80091]: 2025-11-28 08:29:08.920392463 +0000 UTC m=+0.154330898 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO 
Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_id=tripleo_step4)
Nov 28 08:29:08 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:29:09 np0005538513.localdomain sudo[80149]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhxizjmwuxdjueqfgemwctlvpknrnzbt ; /usr/bin/python3
Nov 28 08:29:09 np0005538513.localdomain sudo[80149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:29:09 np0005538513.localdomain python3[80151]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 08:29:12 np0005538513.localdomain rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 08:29:16 np0005538513.localdomain sudo[80149]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:25 np0005538513.localdomain sudo[80340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:29:25 np0005538513.localdomain sudo[80340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:25 np0005538513.localdomain sudo[80340]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:25 np0005538513.localdomain sudo[80355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:29:25 np0005538513.localdomain sudo[80355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:25 np0005538513.localdomain sudo[80355]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:25 np0005538513.localdomain sudo[80391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:29:25 np0005538513.localdomain sudo[80391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:25 np0005538513.localdomain sudo[80391]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:25 np0005538513.localdomain sudo[80406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:29:25 np0005538513.localdomain sudo[80406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:26 np0005538513.localdomain sudo[80406]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:27 np0005538513.localdomain sudo[80452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:29:27 np0005538513.localdomain sudo[80452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:27 np0005538513.localdomain sudo[80452]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:29:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:29:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:29:28 np0005538513.localdomain podman[80467]: 2025-11-28 08:29:28.855666106 +0000 UTC m=+0.091662957 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, version=17.1.12, container_name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:29:28 np0005538513.localdomain systemd[1]: tmp-crun.9Nzz1p.mount: Deactivated successfully.
Nov 28 08:29:28 np0005538513.localdomain podman[80468]: 2025-11-28 08:29:28.912323374 +0000 UTC m=+0.147987175 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 28 08:29:28 np0005538513.localdomain podman[80469]: 2025-11-28 08:29:28.955574632 +0000 UTC m=+0.188355935 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:29:28 np0005538513.localdomain podman[80467]: 2025-11-28 08:29:28.970718224 +0000 UTC m=+0.206715045 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_id=tripleo_step3, release=1761123044, distribution-scope=public, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:29:28 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:29:29 np0005538513.localdomain podman[80468]: 2025-11-28 08:29:29.024321099 +0000 UTC m=+0.259984900 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:29:29 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:29:29 np0005538513.localdomain podman[80469]: 2025-11-28 08:29:29.14437247 +0000 UTC m=+0.377153763 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1)
Nov 28 08:29:29 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:29:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:29:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:29:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:29:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:29:33 np0005538513.localdomain systemd[1]: tmp-crun.JKFMo5.mount: Deactivated successfully.
Nov 28 08:29:33 np0005538513.localdomain podman[80539]: 2025-11-28 08:29:33.860696676 +0000 UTC m=+0.091264423 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, release=1761123044, container_name=logrotate_crond, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 28 08:29:33 np0005538513.localdomain podman[80539]: 2025-11-28 08:29:33.903359978 +0000 UTC m=+0.133927665 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Nov 28 08:29:33 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:29:33 np0005538513.localdomain podman[80537]: 2025-11-28 08:29:33.903157792 +0000 UTC m=+0.140725043 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, release=1761123044, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Nov 28 08:29:34 np0005538513.localdomain podman[80538]: 2025-11-28 08:29:33.956192919 +0000 UTC m=+0.189337255 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1)
Nov 28 08:29:34 np0005538513.localdomain podman[80545]: 2025-11-28 08:29:34.106697098 +0000 UTC m=+0.333949064 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:12:45Z, vcs-type=git, distribution-scope=public)
Nov 28 08:29:34 np0005538513.localdomain podman[80537]: 2025-11-28 08:29:34.124710538 +0000 UTC m=+0.362277799 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible)
Nov 28 08:29:34 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:29:34 np0005538513.localdomain podman[80545]: 2025-11-28 08:29:34.160423047 +0000 UTC m=+0.387674973 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:29:34 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:29:34 np0005538513.localdomain podman[80538]: 2025-11-28 08:29:34.323468519 +0000 UTC m=+0.556612835 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, 
maintainer=OpenStack TripleO Team, release=1761123044, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4)
Nov 28 08:29:34 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:29:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:29:35 np0005538513.localdomain podman[80628]: 2025-11-28 08:29:35.841354028 +0000 UTC m=+0.082634331 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, release=1761123044, tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:29:35 np0005538513.localdomain podman[80628]: 2025-11-28 08:29:35.872448017 +0000 UTC m=+0.113728260 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5)
Nov 28 08:29:35 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:29:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:29:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:29:39 np0005538513.localdomain systemd[1]: tmp-crun.UjkoC3.mount: Deactivated successfully.
Nov 28 08:29:39 np0005538513.localdomain podman[80654]: 2025-11-28 08:29:39.844892499 +0000 UTC m=+0.081752955 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, 
build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, distribution-scope=public)
Nov 28 08:29:39 np0005538513.localdomain podman[80655]: 2025-11-28 08:29:39.894071068 +0000 UTC m=+0.127944453 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:29:39 np0005538513.localdomain podman[80654]: 2025-11-28 08:29:39.945958041 +0000 UTC m=+0.182818467 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, architecture=x86_64, 
io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:29:39 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:29:39 np0005538513.localdomain podman[80655]: 2025-11-28 08:29:39.99777852 +0000 UTC m=+0.231651955 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z)
Nov 28 08:29:40 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:29:41 np0005538513.localdomain sshd[80704]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:29:45 np0005538513.localdomain sshd[80704]: Invalid user admin from 85.133.250.80 port 50016
Nov 28 08:29:46 np0005538513.localdomain sshd[80704]: Connection closed by invalid user admin 85.133.250.80 port 50016 [preauth]
Nov 28 08:29:46 np0005538513.localdomain sshd[80706]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:29:52 np0005538513.localdomain sshd[80708]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:29:52 np0005538513.localdomain sshd[80708]: Invalid user solana from 193.32.162.146 port 39900
Nov 28 08:29:52 np0005538513.localdomain sshd[80708]: Connection closed by invalid user solana 193.32.162.146 port 39900 [preauth]
Nov 28 08:29:55 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:29:55 np0005538513.localdomain recover_tripleo_nova_virtqemud[80711]: 61397
Nov 28 08:29:55 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:29:55 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:29:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:29:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:29:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:29:59 np0005538513.localdomain systemd[1]: tmp-crun.2cK36M.mount: Deactivated successfully.
Nov 28 08:29:59 np0005538513.localdomain podman[80732]: 2025-11-28 08:29:59.907576748 +0000 UTC m=+0.143603270 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:44:13Z)
Nov 28 08:29:59 np0005538513.localdomain podman[80733]: 2025-11-28 08:29:59.92405594 +0000 UTC m=+0.157285387 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, 
io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:29:59 np0005538513.localdomain podman[80733]: 2025-11-28 08:29:59.930500797 +0000 UTC m=+0.163730264 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, 
architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:29:59 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:29:59 np0005538513.localdomain podman[80732]: 2025-11-28 08:29:59.944339449 +0000 UTC m=+0.180365931 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044, config_id=tripleo_step3, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:29:59 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:29:59 np0005538513.localdomain podman[80734]: 2025-11-28 08:29:59.909497797 +0000 UTC m=+0.139190386 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:30:00 np0005538513.localdomain podman[80734]: 2025-11-28 08:30:00.085277337 +0000 UTC m=+0.314969956 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step1, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible)
Nov 28 08:30:00 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:30:00 np0005538513.localdomain systemd[1]: tmp-crun.FbXjve.mount: Deactivated successfully.
Nov 28 08:30:04 np0005538513.localdomain sudo[80834]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klxltuoubxnvuiuziklfuaaolzrgwaay ; /usr/bin/python3
Nov 28 08:30:04 np0005538513.localdomain sudo[80834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:30:04 np0005538513.localdomain podman[80838]: 2025-11-28 08:30:04.428223737 +0000 UTC m=+0.087064506 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, 
name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:30:04 np0005538513.localdomain podman[80838]: 2025-11-28 08:30:04.440184402 +0000 UTC m=+0.099025131 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:30:04 np0005538513.localdomain python3[80836]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: tmp-crun.gMFxF5.mount: Deactivated successfully.
Nov 28 08:30:04 np0005538513.localdomain podman[80837]: 2025-11-28 08:30:04.53589756 +0000 UTC m=+0.197066240 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:30:04 np0005538513.localdomain podman[80877]: 2025-11-28 08:30:04.547078931 +0000 UTC m=+0.096683129 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z)
Nov 28 08:30:04 np0005538513.localdomain podman[80839]: 2025-11-28 08:30:04.505126132 +0000 UTC m=+0.158057091 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:30:04 np0005538513.localdomain podman[80837]: 2025-11-28 08:30:04.58735108 +0000 UTC m=+0.248519790 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, release=1761123044)
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:30:04 np0005538513.localdomain podman[80839]: 2025-11-28 08:30:04.638364715 +0000 UTC m=+0.291295765 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:30:04 np0005538513.localdomain podman[80877]: 2025-11-28 08:30:04.943388357 +0000 UTC m=+0.492992615 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 08:30:04 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:30:05 np0005538513.localdomain systemd[1]: tmp-crun.JIdA4d.mount: Deactivated successfully.
Nov 28 08:30:05 np0005538513.localdomain sshd[80706]: Invalid user orangepi from 85.133.250.80 port 50028
Nov 28 08:30:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:30:06 np0005538513.localdomain podman[80932]: 2025-11-28 08:30:06.048212669 +0000 UTC m=+0.075638207 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 08:30:06 np0005538513.localdomain podman[80932]: 2025-11-28 08:30:06.079408671 +0000 UTC m=+0.106834279 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:30:06 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:30:06 np0005538513.localdomain sshd[80706]: Connection closed by invalid user orangepi 85.133.250.80 port 50028 [preauth]
Nov 28 08:30:07 np0005538513.localdomain sshd[80959]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:30:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:30:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:30:10 np0005538513.localdomain podman[81080]: 2025-11-28 08:30:10.843554446 +0000 UTC m=+0.079938299 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:30:10 np0005538513.localdomain podman[81080]: 2025-11-28 08:30:10.892494408 +0000 UTC m=+0.128878261 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:30:10 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:30:10 np0005538513.localdomain podman[81081]: 2025-11-28 08:30:10.897412298 +0000 UTC m=+0.131052488 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
maintainer=OpenStack TripleO Team, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Nov 28 08:30:10 np0005538513.localdomain podman[81081]: 2025-11-28 08:30:10.979338046 +0000 UTC m=+0.212978196 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:30:10 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:30:11 np0005538513.localdomain rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 08:30:12 np0005538513.localdomain rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 08:30:13 np0005538513.localdomain sshd[80959]: Connection closed by authenticating user root 85.133.250.80 port 40642 [preauth]
Nov 28 08:30:15 np0005538513.localdomain sshd[81134]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:30:18 np0005538513.localdomain sudo[80834]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:18 np0005538513.localdomain sshd[81134]: Connection closed by authenticating user root 85.133.250.80 port 33396 [preauth]
Nov 28 08:30:19 np0005538513.localdomain sshd[81194]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:30:27 np0005538513.localdomain sudo[81196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:30:27 np0005538513.localdomain sudo[81196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:30:27 np0005538513.localdomain sudo[81196]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:27 np0005538513.localdomain sudo[81211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:30:27 np0005538513.localdomain sudo[81211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:30:27 np0005538513.localdomain sshd[81194]: Connection closed by authenticating user root 85.133.250.80 port 33408 [preauth]
Nov 28 08:30:28 np0005538513.localdomain sudo[81211]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:28 np0005538513.localdomain sudo[81257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:30:28 np0005538513.localdomain sudo[81257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:30:28 np0005538513.localdomain sudo[81257]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:29 np0005538513.localdomain sshd[81272]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:30:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:30:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:30:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:30:30 np0005538513.localdomain podman[81274]: 2025-11-28 08:30:30.861929006 +0000 UTC m=+0.098669830 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 28 08:30:30 np0005538513.localdomain systemd[1]: tmp-crun.5ETw2l.mount: Deactivated successfully.
Nov 28 08:30:30 np0005538513.localdomain podman[81274]: 2025-11-28 08:30:30.911365164 +0000 UTC m=+0.148105968 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:30:30 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:30:30 np0005538513.localdomain podman[81276]: 2025-11-28 08:30:30.964381659 +0000 UTC m=+0.196068690 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, distribution-scope=public, build-date=2025-11-18T22:49:46Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 28 08:30:30 np0005538513.localdomain podman[81275]: 2025-11-28 08:30:30.917116169 +0000 UTC m=+0.151610185 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, tcib_managed=true, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd)
Nov 28 08:30:30 np0005538513.localdomain podman[81275]: 2025-11-28 08:30:30.997349725 +0000 UTC m=+0.231843741 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:30:31 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:30:31 np0005538513.localdomain podman[81276]: 2025-11-28 08:30:31.170558037 +0000 UTC m=+0.402245138 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12)
Nov 28 08:30:31 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:30:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:30:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:30:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:30:34 np0005538513.localdomain systemd[1]: tmp-crun.vA2V32.mount: Deactivated successfully.
Nov 28 08:30:34 np0005538513.localdomain podman[81344]: 2025-11-28 08:30:34.913697535 +0000 UTC m=+0.137944048 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:30:34 np0005538513.localdomain podman[81342]: 2025-11-28 08:30:34.878664118 +0000 UTC m=+0.108560612 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, 
vcs-type=git, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:30:34 np0005538513.localdomain podman[81344]: 2025-11-28 08:30:34.948861728 +0000 UTC m=+0.173108260 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:30:34 np0005538513.localdomain podman[81342]: 2025-11-28 08:30:34.96105285 +0000 UTC m=+0.190949384 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4)
Nov 28 08:30:34 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:30:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:30:34 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:30:35 np0005538513.localdomain podman[81343]: 2025-11-28 08:30:35.026367061 +0000 UTC m=+0.251990815 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:30:35 np0005538513.localdomain podman[81343]: 2025-11-28 08:30:35.03352006 +0000 UTC m=+0.259143784 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, 
com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:30:35 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:30:35 np0005538513.localdomain podman[81405]: 2025-11-28 08:30:35.089299411 +0000 UTC m=+0.092688478 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:30:35 np0005538513.localdomain podman[81405]: 2025-11-28 08:30:35.455419526 +0000 UTC m=+0.458808523 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_migration_target, 
io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:30:35 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:30:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:30:36 np0005538513.localdomain podman[81436]: 2025-11-28 08:30:36.856363208 +0000 UTC m=+0.086453607 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Nov 28 08:30:36 np0005538513.localdomain podman[81436]: 2025-11-28 08:30:36.879242726 +0000 UTC m=+0.109333115 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 08:30:36 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:30:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:30:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:30:41 np0005538513.localdomain podman[81463]: 2025-11-28 08:30:41.837157519 +0000 UTC m=+0.078623108 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20251118.1)
Nov 28 08:30:41 np0005538513.localdomain podman[81463]: 2025-11-28 08:30:41.889376971 +0000 UTC m=+0.130842520 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, release=1761123044)
Nov 28 08:30:41 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:30:41 np0005538513.localdomain podman[81464]: 2025-11-28 08:30:41.890349452 +0000 UTC m=+0.128740987 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12)
Nov 28 08:30:41 np0005538513.localdomain podman[81464]: 2025-11-28 08:30:41.969803824 +0000 UTC m=+0.208195349 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12)
Nov 28 08:30:41 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:30:50 np0005538513.localdomain sshd[81272]: Connection closed by authenticating user root 85.133.250.80 port 36106 [preauth]
Nov 28 08:30:51 np0005538513.localdomain sshd[81510]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:30:52 np0005538513.localdomain python3[81525]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 28 08:31:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:31:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:31:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:31:01 np0005538513.localdomain systemd[1]: tmp-crun.oNtBr0.mount: Deactivated successfully.
Nov 28 08:31:01 np0005538513.localdomain podman[81547]: 2025-11-28 08:31:01.851352081 +0000 UTC m=+0.088584623 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, architecture=x86_64, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-type=git)
Nov 28 08:31:01 np0005538513.localdomain podman[81547]: 2025-11-28 08:31:01.861749729 +0000 UTC m=+0.098982331 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:01 np0005538513.localdomain podman[81546]: 2025-11-28 08:31:01.887855514 +0000 UTC m=+0.129889341 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:01 np0005538513.localdomain podman[81546]: 2025-11-28 08:31:01.89624094 +0000 UTC m=+0.138274747 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z)
Nov 28 08:31:01 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:31:01 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:31:02 np0005538513.localdomain podman[81548]: 2025-11-28 08:31:02.002640575 +0000 UTC m=+0.234482252 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, distribution-scope=public, build-date=2025-11-18T22:49:46Z)
Nov 28 08:31:02 np0005538513.localdomain podman[81548]: 2025-11-28 08:31:02.221367095 +0000 UTC m=+0.453208722 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:31:02 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:31:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:31:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:31:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:31:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:31:05 np0005538513.localdomain systemd[1]: tmp-crun.SydMrP.mount: Deactivated successfully.
Nov 28 08:31:05 np0005538513.localdomain podman[81639]: 2025-11-28 08:31:05.836410578 +0000 UTC m=+0.064351654 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron)
Nov 28 08:31:05 np0005538513.localdomain podman[81638]: 2025-11-28 08:31:05.899103739 +0000 UTC m=+0.126946122 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:31:05 np0005538513.localdomain podman[81640]: 2025-11-28 08:31:05.868076343 +0000 UTC m=+0.091117210 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public)
Nov 28 08:31:05 np0005538513.localdomain podman[81640]: 2025-11-28 08:31:05.954380025 +0000 UTC m=+0.177420862 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true)
Nov 28 08:31:05 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:31:05 np0005538513.localdomain podman[81639]: 2025-11-28 08:31:05.969756213 +0000 UTC m=+0.197697269 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.buildah.version=1.41.4, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:31:05 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:31:06 np0005538513.localdomain podman[81637]: 2025-11-28 08:31:06.05686529 +0000 UTC m=+0.287452197 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 28 08:31:06 np0005538513.localdomain podman[81637]: 2025-11-28 08:31:06.109526666 +0000 UTC m=+0.340113603 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1)
Nov 28 08:31:06 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:31:06 np0005538513.localdomain podman[81638]: 2025-11-28 08:31:06.257281422 +0000 UTC m=+0.485123745 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Nov 28 08:31:06 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:31:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:31:07 np0005538513.localdomain podman[81732]: 2025-11-28 08:31:07.877257083 +0000 UTC m=+0.114351179 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:31:07 np0005538513.localdomain podman[81732]: 2025-11-28 08:31:07.92960456 +0000 UTC m=+0.166698656 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc.)
Nov 28 08:31:07 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:31:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:31:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:31:12 np0005538513.localdomain systemd[1]: tmp-crun.XLpkcs.mount: Deactivated successfully.
Nov 28 08:31:12 np0005538513.localdomain podman[81759]: 2025-11-28 08:31:12.858872849 +0000 UTC m=+0.098041911 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, 
com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64)
Nov 28 08:31:12 np0005538513.localdomain podman[81759]: 2025-11-28 08:31:12.908601026 +0000 UTC m=+0.147770098 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 08:31:12 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:31:12 np0005538513.localdomain podman[81758]: 2025-11-28 08:31:12.995927679 +0000 UTC m=+0.236318358 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:31:13 np0005538513.localdomain podman[81758]: 2025-11-28 08:31:13.041324503 +0000 UTC m=+0.281715192 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent)
Nov 28 08:31:13 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:31:25 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:31:25 np0005538513.localdomain recover_tripleo_nova_virtqemud[81806]: 61397
Nov 28 08:31:25 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:31:25 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:31:29 np0005538513.localdomain sudo[81807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:31:29 np0005538513.localdomain sudo[81807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:29 np0005538513.localdomain sudo[81807]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:29 np0005538513.localdomain sudo[81822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:31:29 np0005538513.localdomain sudo[81822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:29 np0005538513.localdomain podman[81909]: 2025-11-28 08:31:29.913769857 +0000 UTC m=+0.093391678 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, version=7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, release=553, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git)
Nov 28 08:31:30 np0005538513.localdomain podman[81909]: 2025-11-28 08:31:30.042350169 +0000 UTC m=+0.221971970 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 08:31:30 np0005538513.localdomain sudo[81822]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:30 np0005538513.localdomain sudo[81976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:31:30 np0005538513.localdomain sudo[81976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:30 np0005538513.localdomain sudo[81976]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:30 np0005538513.localdomain sudo[81991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:31:30 np0005538513.localdomain sudo[81991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:31 np0005538513.localdomain sudo[81991]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:31 np0005538513.localdomain sudo[82039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:31:31 np0005538513.localdomain sudo[82039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:31 np0005538513.localdomain sudo[82039]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:31:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:31:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:31:32 np0005538513.localdomain systemd[1]: tmp-crun.PbDM9a.mount: Deactivated successfully.
Nov 28 08:31:32 np0005538513.localdomain podman[82054]: 2025-11-28 08:31:32.863363577 +0000 UTC m=+0.096395661 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, tcib_managed=true)
Nov 28 08:31:32 np0005538513.localdomain podman[82054]: 2025-11-28 08:31:32.875529048 +0000 UTC m=+0.108561152 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, 
build-date=2025-11-18T23:44:13Z, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64)
Nov 28 08:31:32 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:31:32 np0005538513.localdomain podman[82056]: 2025-11-28 08:31:32.923181951 +0000 UTC m=+0.151136110 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:32 np0005538513.localdomain podman[82055]: 2025-11-28 08:31:32.964197471 +0000 UTC m=+0.194997196 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, version=17.1.12, tcib_managed=true, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:31:32 np0005538513.localdomain podman[82055]: 2025-11-28 08:31:32.978458746 +0000 UTC m=+0.209258501 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.12, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:51:28Z)
Nov 28 08:31:32 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:31:33 np0005538513.localdomain podman[82056]: 2025-11-28 08:31:33.100914371 +0000 UTC m=+0.328868460 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com)
Nov 28 08:31:33 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:31:33 np0005538513.localdomain systemd[1]: tmp-crun.TciVpf.mount: Deactivated successfully.
Nov 28 08:31:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:31:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:31:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:31:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:31:36 np0005538513.localdomain systemd[1]: tmp-crun.kLofw2.mount: Deactivated successfully.
Nov 28 08:31:36 np0005538513.localdomain podman[82121]: 2025-11-28 08:31:36.860676537 +0000 UTC m=+0.094096240 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target)
Nov 28 08:31:36 np0005538513.localdomain systemd[1]: tmp-crun.V5LNCJ.mount: Deactivated successfully.
Nov 28 08:31:36 np0005538513.localdomain podman[82123]: 2025-11-28 08:31:36.902118111 +0000 UTC m=+0.130407959 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:31:36 np0005538513.localdomain podman[82123]: 2025-11-28 08:31:36.938772318 +0000 UTC m=+0.167062576 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:31:36 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:31:36 np0005538513.localdomain podman[82120]: 2025-11-28 08:31:36.954113166 +0000 UTC m=+0.189621603 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:31:37 np0005538513.localdomain podman[82122]: 2025-11-28 08:31:37.001954926 +0000 UTC m=+0.232021427 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, 
distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:31:37 np0005538513.localdomain podman[82122]: 2025-11-28 08:31:37.007878806 +0000 UTC m=+0.237945317 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:31:37 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:31:37 np0005538513.localdomain podman[82120]: 2025-11-28 08:31:37.053830887 +0000 UTC m=+0.289339274 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:31:37 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:31:37 np0005538513.localdomain podman[82121]: 2025-11-28 08:31:37.245383688 +0000 UTC m=+0.478803351 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:31:37 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:31:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:31:38 np0005538513.localdomain podman[82216]: 2025-11-28 08:31:38.838436689 +0000 UTC m=+0.075715840 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 28 08:31:38 np0005538513.localdomain podman[82216]: 2025-11-28 08:31:38.891359503 +0000 UTC m=+0.128638654 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, container_name=nova_compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:31:38 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:31:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:31:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:31:43 np0005538513.localdomain podman[82243]: 2025-11-28 08:31:43.838111877 +0000 UTC m=+0.076451252 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 28 08:31:43 np0005538513.localdomain podman[82242]: 2025-11-28 08:31:43.894101125 +0000 UTC m=+0.134599636 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_id=tripleo_step4)
Nov 28 08:31:43 np0005538513.localdomain podman[82243]: 2025-11-28 08:31:43.919892361 +0000 UTC m=+0.158231756 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, release=1761123044)
Nov 28 08:31:43 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:31:43 np0005538513.localdomain podman[82242]: 2025-11-28 08:31:43.935950381 +0000 UTC m=+0.176448902 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:31:43 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:31:52 np0005538513.localdomain sshd[79306]: Received disconnect from 38.102.83.114 port 46086:11: disconnected by user
Nov 28 08:31:52 np0005538513.localdomain sshd[79306]: Disconnected from user zuul 38.102.83.114 port 46086
Nov 28 08:31:52 np0005538513.localdomain sshd[79303]: pam_unix(sshd:session): session closed for user zuul
Nov 28 08:31:52 np0005538513.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Nov 28 08:31:52 np0005538513.localdomain systemd[1]: session-34.scope: Consumed 18.385s CPU time.
Nov 28 08:31:52 np0005538513.localdomain systemd-logind[764]: Session 34 logged out. Waiting for processes to exit.
Nov 28 08:31:52 np0005538513.localdomain systemd-logind[764]: Removed session 34.
Nov 28 08:31:52 np0005538513.localdomain sshd[81510]: Connection closed by 85.133.250.80 port 60654 [preauth]
Nov 28 08:32:01 np0005538513.localdomain sshd[82289]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:32:01 np0005538513.localdomain sshd[82289]: Invalid user solana from 193.32.162.146 port 54702
Nov 28 08:32:01 np0005538513.localdomain sshd[82289]: Connection closed by invalid user solana 193.32.162.146 port 54702 [preauth]
Nov 28 08:32:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:32:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:32:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:32:03 np0005538513.localdomain systemd[1]: tmp-crun.rO5FiW.mount: Deactivated successfully.
Nov 28 08:32:03 np0005538513.localdomain podman[82336]: 2025-11-28 08:32:03.862654033 +0000 UTC m=+0.096981559 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:32:03 np0005538513.localdomain podman[82337]: 2025-11-28 08:32:03.907443029 +0000 UTC m=+0.139459004 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd)
Nov 28 08:32:03 np0005538513.localdomain podman[82337]: 2025-11-28 08:32:03.917617689 +0000 UTC m=+0.149633704 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git)
Nov 28 08:32:03 np0005538513.localdomain podman[82336]: 2025-11-28 08:32:03.925230991 +0000 UTC m=+0.159558567 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:32:03 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:32:03 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:32:04 np0005538513.localdomain podman[82338]: 2025-11-28 08:32:04.008009926 +0000 UTC m=+0.237787403 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:32:04 np0005538513.localdomain podman[82338]: 2025-11-28 08:32:04.206325293 +0000 UTC m=+0.436102710 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:32:04 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:32:04 np0005538513.localdomain systemd[1]: tmp-crun.uzs0tc.mount: Deactivated successfully.
Nov 28 08:32:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:32:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:32:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:32:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:32:07 np0005538513.localdomain podman[82405]: 2025-11-28 08:32:07.857661803 +0000 UTC m=+0.090051797 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, tcib_managed=true, 
batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:32:07 np0005538513.localdomain podman[82406]: 2025-11-28 08:32:07.916687003 +0000 UTC m=+0.144349343 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, 
managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:32:07 np0005538513.localdomain podman[82405]: 2025-11-28 08:32:07.917424605 +0000 UTC m=+0.149814609 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:32:07 np0005538513.localdomain podman[82407]: 2025-11-28 08:32:07.964927474 +0000 UTC m=+0.189315804 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-cron)
Nov 28 08:32:07 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:32:08 np0005538513.localdomain podman[82413]: 2025-11-28 08:32:08.025945175 +0000 UTC m=+0.245932821 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:32:08 np0005538513.localdomain podman[82413]: 2025-11-28 08:32:08.094615989 +0000 UTC m=+0.314603625 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com)
Nov 28 08:32:08 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:32:08 np0005538513.localdomain podman[82407]: 2025-11-28 08:32:08.11041428 +0000 UTC m=+0.334802580 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:32:08 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:32:08 np0005538513.localdomain podman[82406]: 2025-11-28 08:32:08.29728962 +0000 UTC m=+0.524951910 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:32:08 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:32:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:32:09 np0005538513.localdomain systemd[1]: tmp-crun.biG08W.mount: Deactivated successfully.
Nov 28 08:32:09 np0005538513.localdomain podman[82500]: 2025-11-28 08:32:09.846663578 +0000 UTC m=+0.085058974 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:32:09 np0005538513.localdomain podman[82500]: 2025-11-28 08:32:09.904516983 +0000 UTC m=+0.142912349 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:32:09 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:32:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:32:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:32:14 np0005538513.localdomain podman[82525]: 2025-11-28 08:32:14.851121521 +0000 UTC m=+0.083501067 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:32:14 np0005538513.localdomain podman[82525]: 2025-11-28 08:32:14.906402667 +0000 UTC m=+0.138782193 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:32:14 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:32:14 np0005538513.localdomain podman[82524]: 2025-11-28 08:32:14.909535182 +0000 UTC m=+0.142893218 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:32:14 np0005538513.localdomain podman[82524]: 2025-11-28 08:32:14.994441002 +0000 UTC m=+0.227799028 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Nov 28 08:32:15 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:32:31 np0005538513.localdomain sudo[82571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:32:31 np0005538513.localdomain sudo[82571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:32:31 np0005538513.localdomain sudo[82571]: pam_unix(sudo:session): session closed for user root
Nov 28 08:32:32 np0005538513.localdomain sudo[82586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:32:32 np0005538513.localdomain sudo[82586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:32:32 np0005538513.localdomain sudo[82586]: pam_unix(sudo:session): session closed for user root
Nov 28 08:32:33 np0005538513.localdomain sudo[82633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:32:33 np0005538513.localdomain sudo[82633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:32:33 np0005538513.localdomain sudo[82633]: pam_unix(sudo:session): session closed for user root
Nov 28 08:32:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:32:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:32:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:32:34 np0005538513.localdomain podman[82648]: 2025-11-28 08:32:34.851765001 +0000 UTC m=+0.088763539 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1)
Nov 28 08:32:34 np0005538513.localdomain podman[82648]: 2025-11-28 08:32:34.888786119 +0000 UTC m=+0.125784637 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:32:34 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:32:34 np0005538513.localdomain podman[82649]: 2025-11-28 08:32:34.912012637 +0000 UTC m=+0.145500268 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:32:34 np0005538513.localdomain podman[82649]: 2025-11-28 08:32:34.920852798 +0000 UTC m=+0.154340489 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, config_id=tripleo_step3, release=1761123044, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z)
Nov 28 08:32:34 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:32:35 np0005538513.localdomain podman[82650]: 2025-11-28 08:32:35.017310619 +0000 UTC m=+0.247332293 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd)
Nov 28 08:32:35 np0005538513.localdomain podman[82650]: 2025-11-28 08:32:35.215419861 +0000 UTC m=+0.445441555 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:32:35 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:32:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:32:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:32:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:32:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:32:38 np0005538513.localdomain podman[82719]: 2025-11-28 08:32:38.860167349 +0000 UTC m=+0.088775229 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi)
Nov 28 08:32:38 np0005538513.localdomain podman[82716]: 2025-11-28 08:32:38.906898253 +0000 UTC m=+0.140841866 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, version=17.1.12, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:32:38 np0005538513.localdomain podman[82718]: 2025-11-28 08:32:38.956571788 +0000 UTC m=+0.188869650 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container)
Nov 28 08:32:38 np0005538513.localdomain podman[82719]: 2025-11-28 08:32:38.960269112 +0000 UTC m=+0.188877062 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:32:38 np0005538513.localdomain podman[82718]: 2025-11-28 08:32:38.97040527 +0000 UTC m=+0.202703152 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, container_name=logrotate_crond, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1)
Nov 28 08:32:38 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:32:38 np0005538513.localdomain podman[82716]: 2025-11-28 08:32:38.983121858 +0000 UTC m=+0.217065431 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute)
Nov 28 08:32:38 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:32:38 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:32:39 np0005538513.localdomain podman[82717]: 2025-11-28 08:32:39.06974023 +0000 UTC m=+0.302864268 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4)
Nov 28 08:32:39 np0005538513.localdomain podman[82717]: 2025-11-28 08:32:39.443217639 +0000 UTC m=+0.676341697 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 28 08:32:39 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:32:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:32:40 np0005538513.localdomain systemd[1]: tmp-crun.yBKgeH.mount: Deactivated successfully.
Nov 28 08:32:40 np0005538513.localdomain podman[82815]: 2025-11-28 08:32:40.854088014 +0000 UTC m=+0.093220774 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.buildah.version=1.41.4, config_id=tripleo_step5)
Nov 28 08:32:40 np0005538513.localdomain podman[82815]: 2025-11-28 08:32:40.884452079 +0000 UTC m=+0.123584829 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git)
Nov 28 08:32:40 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:32:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:32:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:32:45 np0005538513.localdomain systemd[1]: tmp-crun.9W834D.mount: Deactivated successfully.
Nov 28 08:32:45 np0005538513.localdomain podman[82841]: 2025-11-28 08:32:45.866998805 +0000 UTC m=+0.095464453 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, batch=17.1_20251118.1)
Nov 28 08:32:45 np0005538513.localdomain podman[82842]: 2025-11-28 08:32:45.909733458 +0000 UTC m=+0.135596856 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack 
TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:32:45 np0005538513.localdomain podman[82841]: 2025-11-28 08:32:45.918401032 +0000 UTC m=+0.146866690 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, 
url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 28 08:32:45 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:32:45 np0005538513.localdomain podman[82842]: 2025-11-28 08:32:45.963487737 +0000 UTC m=+0.189351155 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, tcib_managed=true)
Nov 28 08:32:45 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:33:04 np0005538513.localdomain sudo[83309]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpj7yx_939/privsep.sock
Nov 28 08:33:04 np0005538513.localdomain systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 08:33:04 np0005538513.localdomain recover_tripleo_nova_virtqemud[83311]: 61397
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Queued start job for default target Main User Target.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Created slice User Application Slice.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Reached target Paths.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Reached target Timers.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Starting D-Bus User Message Bus Socket...
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Starting Create User's Volatile Files and Directories...
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Finished Create User's Volatile Files and Directories.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Listening on D-Bus User Message Bus Socket.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Reached target Sockets.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Reached target Basic System.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Reached target Main User Target.
Nov 28 08:33:04 np0005538513.localdomain systemd[83313]: Startup finished in 151ms.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Started Session c11 of User root.
Nov 28 08:33:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:33:04 np0005538513.localdomain sudo[83309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 28 08:33:05 np0005538513.localdomain systemd[1]: tmp-crun.LRZehy.mount: Deactivated successfully.
Nov 28 08:33:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:33:05 np0005538513.localdomain podman[83328]: 2025-11-28 08:33:05.031207357 +0000 UTC m=+0.107135617 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:33:05 np0005538513.localdomain podman[83328]: 2025-11-28 08:33:05.073550459 +0000 UTC m=+0.149478769 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, release=1761123044, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:33:05 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:33:05 np0005538513.localdomain podman[83345]: 2025-11-28 08:33:05.131506706 +0000 UTC m=+0.091044448 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Nov 28 08:33:05 np0005538513.localdomain podman[83345]: 2025-11-28 08:33:05.143301766 +0000 UTC m=+0.102839498 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc.)
Nov 28 08:33:05 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:33:05 np0005538513.localdomain sudo[83309]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:33:05 np0005538513.localdomain podman[83370]: 2025-11-28 08:33:05.692623978 +0000 UTC m=+0.092027188 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step1, 
container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:33:05 np0005538513.localdomain podman[83370]: 2025-11-28 08:33:05.909386607 +0000 UTC m=+0.308789747 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd)
Nov 28 08:33:05 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:33:06 np0005538513.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 28 08:33:06 np0005538513.localdomain kernel: device tap09612b07-51 entered promiscuous mode
Nov 28 08:33:06 np0005538513.localdomain NetworkManager[5967]: <info>  [1764318786.0180] manager: (tap09612b07-51): new Tun device (/org/freedesktop/NetworkManager/Devices/13)
Nov 28 08:33:06 np0005538513.localdomain systemd-udevd[83414]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 08:33:06 np0005538513.localdomain NetworkManager[5967]: <info>  [1764318786.0377] device (tap09612b07-51): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 08:33:06 np0005538513.localdomain NetworkManager[5967]: <info>  [1764318786.0382] device (tap09612b07-51): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 28 08:33:06 np0005538513.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 28 08:33:06 np0005538513.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 28 08:33:06 np0005538513.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 28 08:33:06 np0005538513.localdomain systemd-machined[83422]: New machine qemu-1-instance-00000002.
Nov 28 08:33:06 np0005538513.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Nov 28 08:33:06 np0005538513.localdomain NetworkManager[5967]: <info>  [1764318786.3216] manager: (tap40d5da59-60): new Veth device (/org/freedesktop/NetworkManager/Devices/14)
Nov 28 08:33:06 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-61: link becomes ready
Nov 28 08:33:06 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-60: link becomes ready
Nov 28 08:33:06 np0005538513.localdomain NetworkManager[5967]: <info>  [1764318786.3841] device (tap40d5da59-60): carrier: link connected
Nov 28 08:33:06 np0005538513.localdomain kernel: device tap40d5da59-60 entered promiscuous mode
Nov 28 08:33:07 np0005538513.localdomain sudo[83521]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf ip netns exec ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e haproxy -f /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf
Nov 28 08:33:07 np0005538513.localdomain sudo[83521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 28 08:33:08 np0005538513.localdomain podman[83546]: 2025-11-28 08:33:08.216968179 +0000 UTC m=+0.096365960 container create 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:33:08 np0005538513.localdomain systemd[1]: Started libpod-conmon-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef.scope.
Nov 28 08:33:08 np0005538513.localdomain podman[83546]: 2025-11-28 08:33:08.172621116 +0000 UTC m=+0.052018947 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:33:08 np0005538513.localdomain systemd[1]: tmp-crun.jr6fta.mount: Deactivated successfully.
Nov 28 08:33:08 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:33:08 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b264a93705d5a28ba8f902d268499c1bea32890d992fb54a7c6890490d1eeb3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 08:33:08 np0005538513.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 08:33:08 np0005538513.localdomain podman[83546]: 2025-11-28 08:33:08.36620294 +0000 UTC m=+0.245600741 container init 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:33:08 np0005538513.localdomain podman[83546]: 2025-11-28 08:33:08.373167953 +0000 UTC m=+0.252565764 container start 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Nov 28 08:33:08 np0005538513.localdomain sudo[83521]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:08 np0005538513.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 08:33:08 np0005538513.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 28 08:33:08 np0005538513.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:33:09 np0005538513.localdomain podman[83578]: 2025-11-28 08:33:09.083006789 +0000 UTC m=+0.071796840 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:49:32Z)
Nov 28 08:33:09 np0005538513.localdomain podman[83580]: 2025-11-28 08:33:09.154906062 +0000 UTC m=+0.137292488 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_compute)
Nov 28 08:33:09 np0005538513.localdomain podman[83578]: 2025-11-28 08:33:09.173385876 +0000 UTC m=+0.162175957 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc.)
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:33:09 np0005538513.localdomain podman[83579]: 2025-11-28 08:33:09.207182587 +0000 UTC m=+0.191695158 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible)
Nov 28 08:33:09 np0005538513.localdomain podman[83580]: 2025-11-28 08:33:09.228124255 +0000 UTC m=+0.210510631 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:33:09 np0005538513.localdomain podman[83579]: 2025-11-28 08:33:09.23487334 +0000 UTC m=+0.219385971 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:33:09 np0005538513.localdomain setroubleshoot[83564]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l 96d97920-1546-4f45-b9c9-d0d51c7a6a1d
Nov 28 08:33:09 np0005538513.localdomain setroubleshoot[83564]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.
                                                                
                                                                *****  Plugin qemu_file_image (98.8 confidence) suggests   *******************
                                                                
                                                                If max_map_count is a virtualization target
                                                                Then you need to change the label on max_map_count'
                                                                Do
                                                                # semanage fcontext -a -t virt_image_t 'max_map_count'
                                                                # restorecon -v 'max_map_count'
                                                                
                                                                *****  Plugin catchall (2.13 confidence) suggests   **************************
                                                                
                                                                If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
                                                                Then you should report this as a bug.
                                                                You can generate a local policy module to allow this access.
                                                                Do
                                                                allow this access for now by executing:
                                                                # ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
                                                                # semodule -X 300 -i my-qemukvm.pp
                                                                
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:33:09 np0005538513.localdomain systemd[1]: tmp-crun.YFp2kM.mount: Deactivated successfully.
Nov 28 08:33:09 np0005538513.localdomain podman[83650]: 2025-11-28 08:33:09.854560809 +0000 UTC m=+0.093271246 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git)
Nov 28 08:33:10 np0005538513.localdomain podman[83650]: 2025-11-28 08:33:10.284730256 +0000 UTC m=+0.523440713 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:33:10 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:33:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:33:11 np0005538513.localdomain systemd[1]: tmp-crun.8RvJjY.mount: Deactivated successfully.
Nov 28 08:33:11 np0005538513.localdomain podman[83673]: 2025-11-28 08:33:11.837390785 +0000 UTC m=+0.073524553 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:33:11 np0005538513.localdomain podman[83673]: 2025-11-28 08:33:11.866341819 +0000 UTC m=+0.102475607 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:33:11 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:33:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:33:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:33:16 np0005538513.localdomain systemd[1]: tmp-crun.Ayd8YV.mount: Deactivated successfully.
Nov 28 08:33:16 np0005538513.localdomain podman[83703]: 2025-11-28 08:33:16.860514968 +0000 UTC m=+0.094355728 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc.)
Nov 28 08:33:16 np0005538513.localdomain systemd[1]: tmp-crun.SWxAQw.mount: Deactivated successfully.
Nov 28 08:33:16 np0005538513.localdomain podman[83703]: 2025-11-28 08:33:16.909874333 +0000 UTC m=+0.143715093 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:33:16 np0005538513.localdomain podman[83704]: 2025-11-28 08:33:16.918569518 +0000 UTC m=+0.148002815 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller)
Nov 28 08:33:16 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:33:16 np0005538513.localdomain podman[83704]: 2025-11-28 08:33:16.969393548 +0000 UTC m=+0.198826845 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:33:16 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:33:19 np0005538513.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 28 08:33:19 np0005538513.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58086 [28/Nov/2025:08:33:24.020] listener listener/metadata 0/0/0/1251/1251 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58092 [28/Nov/2025:08:33:25.351] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58102 [28/Nov/2025:08:33:25.410] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58108 [28/Nov/2025:08:33:25.465] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58114 [28/Nov/2025:08:33:25.516] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58120 [28/Nov/2025:08:33:25.569] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58122 [28/Nov/2025:08:33:25.621] listener listener/metadata 0/0/0/11/11 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58138 [28/Nov/2025:08:33:25.683] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58148 [28/Nov/2025:08:33:25.735] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58156 [28/Nov/2025:08:33:25.791] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58166 [28/Nov/2025:08:33:25.845] listener listener/metadata 0/0/0/13/13 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58170 [28/Nov/2025:08:33:25.887] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Nov 28 08:33:25 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58180 [28/Nov/2025:08:33:25.929] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Nov 28 08:33:26 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58190 [28/Nov/2025:08:33:26.012] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Nov 28 08:33:26 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58206 [28/Nov/2025:08:33:26.077] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Nov 28 08:33:26 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58214 [28/Nov/2025:08:33:26.129] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Nov 28 08:33:27 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 28 08:33:31 np0005538513.localdomain sshd[83753]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:33:32 np0005538513.localdomain sshd[83753]: Received disconnect from 80.94.93.119 port 39772:11:  [preauth]
Nov 28 08:33:32 np0005538513.localdomain sshd[83753]: Disconnected from authenticating user root 80.94.93.119 port 39772 [preauth]
Nov 28 08:33:33 np0005538513.localdomain sudo[83755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:33:33 np0005538513.localdomain sudo[83755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:33:33 np0005538513.localdomain sudo[83755]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:33 np0005538513.localdomain sudo[83770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:33:33 np0005538513.localdomain sudo[83770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:33:35 np0005538513.localdomain sudo[83770]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:35 np0005538513.localdomain sudo[83817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:33:35 np0005538513.localdomain sudo[83817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:33:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:33:35 np0005538513.localdomain sudo[83817]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:33:35 np0005538513.localdomain podman[83833]: 2025-11-28 08:33:35.641587185 +0000 UTC m=+0.089967345 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1)
Nov 28 08:33:35 np0005538513.localdomain podman[83833]: 2025-11-28 08:33:35.653460937 +0000 UTC m=+0.101841087 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Nov 28 08:33:35 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:33:35 np0005538513.localdomain systemd[1]: tmp-crun.C9GXzB.mount: Deactivated successfully.
Nov 28 08:33:35 np0005538513.localdomain podman[83832]: 2025-11-28 08:33:35.707077751 +0000 UTC m=+0.155356268 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044)
Nov 28 08:33:35 np0005538513.localdomain podman[83832]: 2025-11-28 08:33:35.745532725 +0000 UTC m=+0.193811242 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-iscsid-container, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Nov 28 08:33:35 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:33:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:33:36 np0005538513.localdomain systemd[1]: tmp-crun.5sk5ce.mount: Deactivated successfully.
Nov 28 08:33:36 np0005538513.localdomain podman[83871]: 2025-11-28 08:33:36.856674669 +0000 UTC m=+0.090767179 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:33:37 np0005538513.localdomain podman[83871]: 2025-11-28 08:33:37.072592704 +0000 UTC m=+0.306685194 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1761123044)
Nov 28 08:33:37 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:33:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:33:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:33:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:33:39 np0005538513.localdomain systemd[1]: tmp-crun.qh4S22.mount: Deactivated successfully.
Nov 28 08:33:39 np0005538513.localdomain podman[83902]: 2025-11-28 08:33:39.857582534 +0000 UTC m=+0.092210043 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com)
Nov 28 08:33:39 np0005538513.localdomain podman[83902]: 2025-11-28 08:33:39.890558739 +0000 UTC m=+0.125186248 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 08:33:39 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:33:39 np0005538513.localdomain podman[83903]: 2025-11-28 08:33:39.908968691 +0000 UTC m=+0.140594038 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Nov 28 08:33:40 np0005538513.localdomain podman[83901]: 2025-11-28 08:33:39.999638266 +0000 UTC m=+0.236674948 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 28 08:33:40 np0005538513.localdomain podman[83903]: 2025-11-28 08:33:40.017663146 +0000 UTC m=+0.249288493 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, 
build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:33:40 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:33:40 np0005538513.localdomain podman[83901]: 2025-11-28 08:33:40.062490493 +0000 UTC m=+0.299527145 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, release=1761123044, distribution-scope=public, architecture=x86_64)
Nov 28 08:33:40 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:33:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:33:40 np0005538513.localdomain podman[83974]: 2025-11-28 08:33:40.845104658 +0000 UTC m=+0.078868036 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, architecture=x86_64, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:33:41 np0005538513.localdomain podman[83974]: 2025-11-28 08:33:41.243476097 +0000 UTC m=+0.477239485 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:33:41 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:33:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:33:42 np0005538513.localdomain systemd[1]: tmp-crun.d7AKF0.mount: Deactivated successfully.
Nov 28 08:33:42 np0005538513.localdomain podman[83998]: 2025-11-28 08:33:42.845096009 +0000 UTC m=+0.085291001 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute)
Nov 28 08:33:42 np0005538513.localdomain podman[83998]: 2025-11-28 08:33:42.895338762 +0000 UTC m=+0.135533754 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, container_name=nova_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Nov 28 08:33:42 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:33:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:33:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:33:47 np0005538513.localdomain podman[84024]: 2025-11-28 08:33:47.843595059 +0000 UTC m=+0.077582706 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:33:47 np0005538513.localdomain systemd[1]: tmp-crun.yinZW2.mount: Deactivated successfully.
Nov 28 08:33:47 np0005538513.localdomain podman[84025]: 2025-11-28 08:33:47.917277037 +0000 UTC m=+0.146622783 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:33:47 np0005538513.localdomain podman[84025]: 2025-11-28 08:33:47.945526238 +0000 UTC m=+0.174872004 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, 
release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:33:47 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:33:47 np0005538513.localdomain podman[84024]: 2025-11-28 08:33:47.971622014 +0000 UTC m=+0.205609631 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible)
Nov 28 08:33:47 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:33:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 08:33:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 08:34:05 np0005538513.localdomain sshd[84098]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:34:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:34:05 np0005538513.localdomain sshd[84098]: Invalid user sol from 193.32.162.146 port 41292
Nov 28 08:34:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:34:05 np0005538513.localdomain systemd[1]: tmp-crun.6leuK8.mount: Deactivated successfully.
Nov 28 08:34:05 np0005538513.localdomain podman[84119]: 2025-11-28 08:34:05.868418746 +0000 UTC m=+0.100436064 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:34:05 np0005538513.localdomain podman[84119]: 2025-11-28 08:34:05.88396729 +0000 UTC m=+0.115984658 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:34:05 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:34:05 np0005538513.localdomain podman[84132]: 2025-11-28 08:34:05.937604735 +0000 UTC m=+0.081824586 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
maintainer=OpenStack TripleO Team, container_name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:34:05 np0005538513.localdomain sshd[84098]: Connection closed by invalid user sol 193.32.162.146 port 41292 [preauth]
Nov 28 08:34:05 np0005538513.localdomain podman[84132]: 2025-11-28 08:34:05.97348307 +0000 UTC m=+0.117702901 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:34:05 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:34:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:34:07 np0005538513.localdomain podman[84160]: 2025-11-28 08:34:07.851645416 +0000 UTC m=+0.086654934 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:34:08 np0005538513.localdomain podman[84160]: 2025-11-28 08:34:08.046633802 +0000 UTC m=+0.281643280 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:34:08 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:34:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:34:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:34:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:34:10 np0005538513.localdomain podman[84189]: 2025-11-28 08:34:10.855592292 +0000 UTC m=+0.090232653 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 28 08:34:10 np0005538513.localdomain systemd[1]: tmp-crun.eASN3L.mount: Deactivated successfully.
Nov 28 08:34:10 np0005538513.localdomain podman[84188]: 2025-11-28 08:34:10.915825429 +0000 UTC m=+0.153628985 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, release=1761123044, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:34:10 np0005538513.localdomain podman[84189]: 2025-11-28 08:34:10.921355467 +0000 UTC m=+0.155995827 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:34:10 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:34:10 np0005538513.localdomain podman[84188]: 2025-11-28 08:34:10.946214676 +0000 UTC m=+0.184018252 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true)
Nov 28 08:34:10 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:34:11 np0005538513.localdomain podman[84190]: 2025-11-28 08:34:11.009164686 +0000 UTC m=+0.241056022 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:34:11 np0005538513.localdomain podman[84190]: 2025-11-28 08:34:11.067696361 +0000 UTC m=+0.299587647 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, 
container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:34:11 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:34:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:34:11 np0005538513.localdomain podman[84259]: 2025-11-28 08:34:11.85075059 +0000 UTC m=+0.085289401 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=)
Nov 28 08:34:12 np0005538513.localdomain podman[84259]: 2025-11-28 08:34:12.242460576 +0000 UTC m=+0.476999337 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible)
Nov 28 08:34:12 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:34:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:34:13 np0005538513.localdomain systemd[1]: tmp-crun.QusPHg.mount: Deactivated successfully.
Nov 28 08:34:13 np0005538513.localdomain podman[84283]: 2025-11-28 08:34:13.845009666 +0000 UTC m=+0.082331292 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, vcs-type=git, container_name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:34:13 np0005538513.localdomain podman[84283]: 2025-11-28 08:34:13.90252218 +0000 UTC m=+0.139843756 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Nov 28 08:34:13 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:34:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:34:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:34:18 np0005538513.localdomain podman[84313]: 2025-11-28 08:34:18.847989583 +0000 UTC m=+0.090124429 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com)
Nov 28 08:34:18 np0005538513.localdomain systemd[1]: tmp-crun.v7tumG.mount: Deactivated successfully.
Nov 28 08:34:18 np0005538513.localdomain podman[84313]: 2025-11-28 08:34:18.896696478 +0000 UTC m=+0.138831324 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, name=rhosp17/openstack-ovn-controller, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:34:18 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:34:18 np0005538513.localdomain podman[84312]: 2025-11-28 08:34:18.90493191 +0000 UTC m=+0.146746206 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:14:25Z, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:34:18 np0005538513.localdomain podman[84312]: 2025-11-28 08:34:18.98855951 +0000 UTC m=+0.230373826 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=)
Nov 28 08:34:19 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:34:35 np0005538513.localdomain sudo[84361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:34:35 np0005538513.localdomain sudo[84361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:34:35 np0005538513.localdomain sudo[84361]: pam_unix(sudo:session): session closed for user root
Nov 28 08:34:35 np0005538513.localdomain sudo[84376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:34:35 np0005538513.localdomain sudo[84376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:34:36 np0005538513.localdomain sudo[84376]: pam_unix(sudo:session): session closed for user root
Nov 28 08:34:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:34:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:34:36 np0005538513.localdomain podman[84424]: 2025-11-28 08:34:36.86695362 +0000 UTC m=+0.099447763 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:34:36 np0005538513.localdomain podman[84423]: 2025-11-28 08:34:36.909311811 +0000 UTC m=+0.141799314 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:34:36 np0005538513.localdomain podman[84423]: 2025-11-28 08:34:36.920437771 +0000 UTC m=+0.152925314 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:34:36 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:34:36 np0005538513.localdomain podman[84424]: 2025-11-28 08:34:36.960856844 +0000 UTC m=+0.193350937 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:34:36 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:34:37 np0005538513.localdomain sudo[84462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:34:37 np0005538513.localdomain sudo[84462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:34:37 np0005538513.localdomain sudo[84462]: pam_unix(sudo:session): session closed for user root
Nov 28 08:34:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:34:38 np0005538513.localdomain podman[84477]: 2025-11-28 08:34:38.843466795 +0000 UTC m=+0.081311571 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12)
Nov 28 08:34:39 np0005538513.localdomain podman[84477]: 2025-11-28 08:34:39.023371391 +0000 UTC m=+0.261216097 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:34:39 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:34:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:34:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:34:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:34:41 np0005538513.localdomain systemd[1]: tmp-crun.pwiLgB.mount: Deactivated successfully.
Nov 28 08:34:41 np0005538513.localdomain systemd[1]: tmp-crun.fje9kv.mount: Deactivated successfully.
Nov 28 08:34:41 np0005538513.localdomain podman[84507]: 2025-11-28 08:34:41.904471641 +0000 UTC m=+0.142160755 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Nov 28 08:34:41 np0005538513.localdomain podman[84507]: 2025-11-28 08:34:41.914069865 +0000 UTC m=+0.151758979 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:34:41 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:34:41 np0005538513.localdomain podman[84506]: 2025-11-28 08:34:41.869123214 +0000 UTC m=+0.108227262 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Nov 28 08:34:42 np0005538513.localdomain podman[84506]: 2025-11-28 08:34:42.005625756 +0000 UTC m=+0.244729754 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-type=git, container_name=ceilometer_agent_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Nov 28 08:34:42 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:34:42 np0005538513.localdomain podman[84508]: 2025-11-28 08:34:42.058086336 +0000 UTC m=+0.289201740 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:34:42 np0005538513.localdomain podman[84508]: 2025-11-28 08:34:42.113451605 +0000 UTC m=+0.344566929 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:34:42 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:34:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:34:42 np0005538513.localdomain podman[84578]: 2025-11-28 08:34:42.837841444 +0000 UTC m=+0.077950837 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z)
Nov 28 08:34:43 np0005538513.localdomain podman[84578]: 2025-11-28 08:34:43.193713358 +0000 UTC m=+0.433822721 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4)
Nov 28 08:34:43 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:34:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:34:44 np0005538513.localdomain systemd[1]: tmp-crun.8j3sVr.mount: Deactivated successfully.
Nov 28 08:34:44 np0005538513.localdomain podman[84602]: 2025-11-28 08:34:44.843438126 +0000 UTC m=+0.076675758 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, container_name=nova_compute, release=1761123044)
Nov 28 08:34:44 np0005538513.localdomain podman[84602]: 2025-11-28 08:34:44.897738903 +0000 UTC m=+0.130976545 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:34:44 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:34:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:34:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:34:49 np0005538513.localdomain podman[84630]: 2025-11-28 08:34:49.839087362 +0000 UTC m=+0.076276748 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:34:49 np0005538513.localdomain podman[84629]: 2025-11-28 08:34:49.897516244 +0000 UTC m=+0.137139834 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:34:49 np0005538513.localdomain podman[84630]: 2025-11-28 08:34:49.915796891 +0000 UTC m=+0.152986227 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Nov 28 08:34:49 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:34:49 np0005538513.localdomain podman[84629]: 2025-11-28 08:34:49.970461958 +0000 UTC m=+0.210085578 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:34:49 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:35:05 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:35:05 np0005538513.localdomain recover_tripleo_nova_virtqemud[84695]: 61397
Nov 28 08:35:05 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:35:05 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:35:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:35:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:35:07 np0005538513.localdomain podman[84721]: 2025-11-28 08:35:07.848292329 +0000 UTC m=+0.080566903 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:35:07 np0005538513.localdomain podman[84721]: 2025-11-28 08:35:07.862390811 +0000 UTC m=+0.094665385 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, container_name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:35:07 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:35:07 np0005538513.localdomain podman[84722]: 2025-11-28 08:35:07.953592357 +0000 UTC m=+0.186678697 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:35:07 np0005538513.localdomain podman[84722]: 2025-11-28 08:35:07.988547191 +0000 UTC m=+0.221633531 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step3)
Nov 28 08:35:08 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:35:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:35:09 np0005538513.localdomain systemd[1]: tmp-crun.S6rCgt.mount: Deactivated successfully.
Nov 28 08:35:09 np0005538513.localdomain podman[84760]: 2025-11-28 08:35:09.858462314 +0000 UTC m=+0.095017147 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:35:10 np0005538513.localdomain podman[84760]: 2025-11-28 08:35:10.043759186 +0000 UTC m=+0.280313989 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:35:10 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:35:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:35:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:35:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:35:12 np0005538513.localdomain podman[84791]: 2025-11-28 08:35:12.871856082 +0000 UTC m=+0.104930918 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:35:12 np0005538513.localdomain systemd[1]: tmp-crun.NgTO4n.mount: Deactivated successfully.
Nov 28 08:35:12 np0005538513.localdomain podman[84793]: 2025-11-28 08:35:12.910199631 +0000 UTC m=+0.139427377 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4)
Nov 28 08:35:12 np0005538513.localdomain podman[84791]: 2025-11-28 08:35:12.928562287 +0000 UTC m=+0.161637093 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Nov 28 08:35:12 np0005538513.localdomain systemd[1]: tmp-crun.kywFKN.mount: Deactivated successfully.
Nov 28 08:35:12 np0005538513.localdomain podman[84792]: 2025-11-28 08:35:12.966146683 +0000 UTC m=+0.197076941 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:35:12 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:35:12 np0005538513.localdomain podman[84793]: 2025-11-28 08:35:12.993262763 +0000 UTC m=+0.222490479 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:35:13 np0005538513.localdomain podman[84792]: 2025-11-28 08:35:12.99988686 +0000 UTC m=+0.230817098 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=)
Nov 28 08:35:13 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:35:13 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:35:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:35:13 np0005538513.localdomain podman[84863]: 2025-11-28 08:35:13.84014064 +0000 UTC m=+0.079011025 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com)
Nov 28 08:35:14 np0005538513.localdomain podman[84863]: 2025-11-28 08:35:14.177040509 +0000 UTC m=+0.415910924 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:35:14 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:35:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:35:15 np0005538513.localdomain podman[84887]: 2025-11-28 08:35:15.840110564 +0000 UTC m=+0.078506579 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git)
Nov 28 08:35:15 np0005538513.localdomain podman[84887]: 2025-11-28 08:35:15.89332831 +0000 UTC m=+0.131724315 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:35:15 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:35:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:35:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:35:20 np0005538513.localdomain systemd[1]: tmp-crun.JE1Vr0.mount: Deactivated successfully.
Nov 28 08:35:20 np0005538513.localdomain podman[84916]: 2025-11-28 08:35:20.843373668 +0000 UTC m=+0.079015045 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, 
url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:35:20 np0005538513.localdomain systemd[1]: tmp-crun.bfo6QS.mount: Deactivated successfully.
Nov 28 08:35:20 np0005538513.localdomain podman[84916]: 2025-11-28 08:35:20.899075743 +0000 UTC m=+0.134717070 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible)
Nov 28 08:35:20 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:35:20 np0005538513.localdomain podman[84915]: 2025-11-28 08:35:20.903359147 +0000 UTC m=+0.141733899 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public)
Nov 28 08:35:20 np0005538513.localdomain podman[84915]: 2025-11-28 08:35:20.987393439 +0000 UTC m=+0.225768201 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Nov 28 08:35:20 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:35:37 np0005538513.localdomain sudo[84965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:35:37 np0005538513.localdomain sudo[84965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:35:37 np0005538513.localdomain sudo[84965]: pam_unix(sudo:session): session closed for user root
Nov 28 08:35:37 np0005538513.localdomain sudo[84980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:35:37 np0005538513.localdomain sudo[84980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:35:38 np0005538513.localdomain sudo[84980]: pam_unix(sudo:session): session closed for user root
Nov 28 08:35:38 np0005538513.localdomain sudo[85027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:35:38 np0005538513.localdomain sudo[85027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:35:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:35:38 np0005538513.localdomain sudo[85027]: pam_unix(sudo:session): session closed for user root
Nov 28 08:35:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:35:38 np0005538513.localdomain systemd[1]: tmp-crun.e6oo8H.mount: Deactivated successfully.
Nov 28 08:35:38 np0005538513.localdomain podman[85042]: 2025-11-28 08:35:38.806614956 +0000 UTC m=+0.069898729 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:35:38 np0005538513.localdomain podman[85043]: 2025-11-28 08:35:38.862714163 +0000 UTC m=+0.123206399 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team)
Nov 28 08:35:38 np0005538513.localdomain podman[85043]: 2025-11-28 08:35:38.871877041 +0000 UTC m=+0.132369327 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:35:38 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:35:38 np0005538513.localdomain podman[85042]: 2025-11-28 08:35:38.888552123 +0000 UTC m=+0.151835896 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-iscsid-container, release=1761123044, io.openshift.expose-services=, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:35:38 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:35:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:35:40 np0005538513.localdomain podman[85081]: 2025-11-28 08:35:40.846383526 +0000 UTC m=+0.083700131 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, version=17.1.12)
Nov 28 08:35:41 np0005538513.localdomain podman[85081]: 2025-11-28 08:35:41.034283801 +0000 UTC m=+0.271600416 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Nov 28 08:35:41 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:35:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:35:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:35:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:35:43 np0005538513.localdomain podman[85110]: 2025-11-28 08:35:43.853212539 +0000 UTC m=+0.084767535 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:35:43 np0005538513.localdomain systemd[1]: tmp-crun.S2hPWm.mount: Deactivated successfully.
Nov 28 08:35:43 np0005538513.localdomain podman[85111]: 2025-11-28 08:35:43.916566972 +0000 UTC m=+0.145163205 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:35:43 np0005538513.localdomain podman[85110]: 2025-11-28 08:35:43.931253303 +0000 UTC m=+0.162808339 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:35:43 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:35:43 np0005538513.localdomain podman[85111]: 2025-11-28 08:35:43.957487594 +0000 UTC m=+0.186083797 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:35:43 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:35:44 np0005538513.localdomain podman[85112]: 2025-11-28 08:35:44.006288502 +0000 UTC m=+0.232871833 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4)
Nov 28 08:35:44 np0005538513.localdomain podman[85112]: 2025-11-28 08:35:44.037405337 +0000 UTC m=+0.263988658 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12)
Nov 28 08:35:44 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:35:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:35:44 np0005538513.localdomain podman[85180]: 2025-11-28 08:35:44.84773424 +0000 UTC m=+0.082456453 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:35:45 np0005538513.localdomain podman[85180]: 2025-11-28 08:35:45.250505802 +0000 UTC m=+0.485228005 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1)
Nov 28 08:35:45 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:35:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:35:46 np0005538513.localdomain podman[85205]: 2025-11-28 08:35:46.841950184 +0000 UTC m=+0.075884647 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, version=17.1.12, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:35:46 np0005538513.localdomain podman[85205]: 2025-11-28 08:35:46.875793693 +0000 UTC m=+0.109728106 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:35:46 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:35:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:35:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:35:51 np0005538513.localdomain podman[85231]: 2025-11-28 08:35:51.852606161 +0000 UTC m=+0.084503107 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Nov 28 08:35:51 np0005538513.localdomain systemd[1]: tmp-crun.uEwlXM.mount: Deactivated successfully.
Nov 28 08:35:51 np0005538513.localdomain podman[85232]: 2025-11-28 08:35:51.905720114 +0000 UTC m=+0.135091481 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=)
Nov 28 08:35:51 np0005538513.localdomain podman[85232]: 2025-11-28 08:35:51.928144886 +0000 UTC m=+0.157516293 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 28 08:35:51 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:35:51 np0005538513.localdomain podman[85231]: 2025-11-28 08:35:51.958452795 +0000 UTC m=+0.190349791 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:35:51 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:36:05 np0005538513.localdomain sshd[85280]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:36:06 np0005538513.localdomain sshd[85280]: Invalid user solv from 193.32.162.146 port 56082
Nov 28 08:36:06 np0005538513.localdomain sshd[85280]: Connection closed by invalid user solv 193.32.162.146 port 56082 [preauth]
Nov 28 08:36:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:36:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:36:09 np0005538513.localdomain podman[85327]: 2025-11-28 08:36:09.848081676 +0000 UTC m=+0.086694156 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, 
com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:36:09 np0005538513.localdomain podman[85327]: 2025-11-28 08:36:09.860413262 +0000 UTC m=+0.099025802 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, 
url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:36:09 np0005538513.localdomain systemd[1]: tmp-crun.PNkeJO.mount: Deactivated successfully.
Nov 28 08:36:09 np0005538513.localdomain podman[85328]: 2025-11-28 08:36:09.906541667 +0000 UTC m=+0.144075952 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_id=tripleo_step3, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:36:09 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:36:09 np0005538513.localdomain podman[85328]: 2025-11-28 08:36:09.967689452 +0000 UTC m=+0.205223747 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:36:09 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:36:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:36:11 np0005538513.localdomain podman[85365]: 2025-11-28 08:36:11.84868169 +0000 UTC m=+0.084631070 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd)
Nov 28 08:36:12 np0005538513.localdomain podman[85365]: 2025-11-28 08:36:12.060272596 +0000 UTC m=+0.296221946 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Nov 28 08:36:12 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: tmp-crun.I1u0EE.mount: Deactivated successfully.
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: tmp-crun.1NfuDn.mount: Deactivated successfully.
Nov 28 08:36:14 np0005538513.localdomain podman[85395]: 2025-11-28 08:36:14.857989249 +0000 UTC m=+0.095468381 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 28 08:36:14 np0005538513.localdomain podman[85395]: 2025-11-28 08:36:14.865104732 +0000 UTC m=+0.102583894 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container)
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:36:14 np0005538513.localdomain podman[85394]: 2025-11-28 08:36:14.830387725 +0000 UTC m=+0.073670748 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, 
io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_compute)
Nov 28 08:36:14 np0005538513.localdomain podman[85394]: 2025-11-28 08:36:14.910523474 +0000 UTC m=+0.153806537 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:36:14 np0005538513.localdomain podman[85397]: 2025-11-28 08:36:14.953331994 +0000 UTC m=+0.187140410 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:36:14 np0005538513.localdomain podman[85397]: 2025-11-28 08:36:14.979461683 +0000 UTC m=+0.213270069 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi)
Nov 28 08:36:14 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:36:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:36:15 np0005538513.localdomain podman[85469]: 2025-11-28 08:36:15.843082535 +0000 UTC m=+0.079335135 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack 
Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:36:16 np0005538513.localdomain podman[85469]: 2025-11-28 08:36:16.238963401 +0000 UTC m=+0.475216031 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, maintainer=OpenStack TripleO 
Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:36:16 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:36:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:36:17 np0005538513.localdomain podman[85492]: 2025-11-28 08:36:17.848246712 +0000 UTC m=+0.086746747 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, architecture=x86_64, distribution-scope=public, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:36:17 np0005538513.localdomain podman[85492]: 2025-11-28 08:36:17.881432982 +0000 UTC m=+0.119932967 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com)
Nov 28 08:36:17 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:36:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:36:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:36:22 np0005538513.localdomain systemd[1]: tmp-crun.VAYe1d.mount: Deactivated successfully.
Nov 28 08:36:22 np0005538513.localdomain podman[85518]: 2025-11-28 08:36:22.833155093 +0000 UTC m=+0.075348770 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:36:22 np0005538513.localdomain systemd[1]: tmp-crun.VWUuyc.mount: Deactivated successfully.
Nov 28 08:36:22 np0005538513.localdomain podman[85517]: 2025-11-28 08:36:22.874162527 +0000 UTC m=+0.118556943 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:36:22 np0005538513.localdomain podman[85518]: 2025-11-28 08:36:22.881337822 +0000 UTC m=+0.123531449 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com)
Nov 28 08:36:22 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:36:22 np0005538513.localdomain podman[85517]: 2025-11-28 08:36:22.916381529 +0000 UTC m=+0.160775935 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, container_name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:36:22 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:36:38 np0005538513.localdomain sudo[85564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:36:38 np0005538513.localdomain sudo[85564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:36:38 np0005538513.localdomain sudo[85564]: pam_unix(sudo:session): session closed for user root
Nov 28 08:36:39 np0005538513.localdomain sudo[85579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:36:39 np0005538513.localdomain sudo[85579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:36:39 np0005538513.localdomain sudo[85579]: pam_unix(sudo:session): session closed for user root
Nov 28 08:36:40 np0005538513.localdomain sudo[85625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:36:40 np0005538513.localdomain sudo[85625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:36:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:36:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:36:40 np0005538513.localdomain sudo[85625]: pam_unix(sudo:session): session closed for user root
Nov 28 08:36:40 np0005538513.localdomain systemd[1]: tmp-crun.WghwZx.mount: Deactivated successfully.
Nov 28 08:36:40 np0005538513.localdomain podman[85641]: 2025-11-28 08:36:40.404336041 +0000 UTC m=+0.078375525 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Nov 28 08:36:40 np0005538513.localdomain podman[85639]: 2025-11-28 08:36:40.465754748 +0000 UTC m=+0.140571337 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, architecture=x86_64, version=17.1.12, container_name=iscsid, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:36:40 np0005538513.localdomain podman[85641]: 2025-11-28 08:36:40.495167291 +0000 UTC m=+0.169206835 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true)
Nov 28 08:36:40 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:36:40 np0005538513.localdomain podman[85639]: 2025-11-28 08:36:40.55275363 +0000 UTC m=+0.227570259 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, 
Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:36:40 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:36:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:36:42 np0005538513.localdomain podman[85679]: 2025-11-28 08:36:42.848587622 +0000 UTC m=+0.087512849 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:36:43 np0005538513.localdomain podman[85679]: 2025-11-28 08:36:43.035112455 +0000 UTC m=+0.274037692 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12)
Nov 28 08:36:43 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:36:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:36:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:36:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:36:45 np0005538513.localdomain podman[85709]: 2025-11-28 08:36:45.840617387 +0000 UTC m=+0.080332517 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:36:45 np0005538513.localdomain podman[85709]: 2025-11-28 08:36:45.893318164 +0000 UTC m=+0.133033294 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:36:45 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:36:45 np0005538513.localdomain systemd[1]: tmp-crun.xBUOpH.mount: Deactivated successfully.
Nov 28 08:36:45 np0005538513.localdomain podman[85711]: 2025-11-28 08:36:45.919333481 +0000 UTC m=+0.152894339 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4)
Nov 28 08:36:45 np0005538513.localdomain podman[85711]: 2025-11-28 08:36:45.945463884 +0000 UTC m=+0.179024752 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.)
Nov 28 08:36:45 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:36:45 np0005538513.localdomain podman[85710]: 2025-11-28 08:36:45.962673898 +0000 UTC m=+0.198115034 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:36:45 np0005538513.localdomain podman[85710]: 2025-11-28 08:36:45.976301621 +0000 UTC m=+0.211742787 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:36:45 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:36:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:36:46 np0005538513.localdomain podman[85780]: 2025-11-28 08:36:46.842572865 +0000 UTC m=+0.080032117 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:36:47 np0005538513.localdomain podman[85780]: 2025-11-28 08:36:47.215351932 +0000 UTC m=+0.452811184 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1)
Nov 28 08:36:47 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:36:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:36:48 np0005538513.localdomain podman[85803]: 2025-11-28 08:36:48.83742764 +0000 UTC m=+0.074480095 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:36:48 np0005538513.localdomain podman[85803]: 2025-11-28 08:36:48.88957533 +0000 UTC m=+0.126627755 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:36:48 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:36:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:36:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:36:53 np0005538513.localdomain podman[85830]: 2025-11-28 08:36:53.853547925 +0000 UTC m=+0.091728249 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:36:53 np0005538513.localdomain podman[85829]: 2025-11-28 08:36:53.890224745 +0000 UTC m=+0.128240264 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1)
Nov 28 08:36:53 np0005538513.localdomain podman[85830]: 2025-11-28 08:36:53.903433324 +0000 UTC m=+0.141613648 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:36:53 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:36:53 np0005538513.localdomain podman[85829]: 2025-11-28 08:36:53.936406899 +0000 UTC m=+0.174422438 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:36:53 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:37:05 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:37:05 np0005538513.localdomain recover_tripleo_nova_virtqemud[85875]: 61397
Nov 28 08:37:05 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:37:05 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:37:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:37:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:37:10 np0005538513.localdomain systemd[1]: tmp-crun.aDubYi.mount: Deactivated successfully.
Nov 28 08:37:10 np0005538513.localdomain podman[85920]: 2025-11-28 08:37:10.85739055 +0000 UTC m=+0.090239124 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 
17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:37:10 np0005538513.localdomain systemd[1]: tmp-crun.oAg0GM.mount: Deactivated successfully.
Nov 28 08:37:10 np0005538513.localdomain podman[85919]: 2025-11-28 08:37:10.911775379 +0000 UTC m=+0.147325707 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, 
name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 28 08:37:10 np0005538513.localdomain podman[85920]: 2025-11-28 08:37:10.917394304 +0000 UTC m=+0.150242868 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, url=https://www.redhat.com)
Nov 28 08:37:10 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:37:10 np0005538513.localdomain podman[85919]: 2025-11-28 08:37:10.94722627 +0000 UTC m=+0.182776608 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:37:10 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:37:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:37:13 np0005538513.localdomain podman[85960]: 2025-11-28 08:37:13.855014198 +0000 UTC m=+0.080416388 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public)
Nov 28 08:37:14 np0005538513.localdomain podman[85960]: 2025-11-28 08:37:14.044811912 +0000 UTC m=+0.270214082 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, container_name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com)
Nov 28 08:37:14 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:37:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:37:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:37:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:37:16 np0005538513.localdomain systemd[1]: tmp-crun.Z9Mkgq.mount: Deactivated successfully.
Nov 28 08:37:16 np0005538513.localdomain systemd[1]: tmp-crun.QSqJxM.mount: Deactivated successfully.
Nov 28 08:37:16 np0005538513.localdomain podman[85990]: 2025-11-28 08:37:16.912389622 +0000 UTC m=+0.140747113 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron)
Nov 28 08:37:16 np0005538513.localdomain podman[85990]: 2025-11-28 08:37:16.919265396 +0000 UTC m=+0.147622847 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:37:16 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:37:16 np0005538513.localdomain podman[85989]: 2025-11-28 08:37:16.999089925 +0000 UTC m=+0.230442798 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public)
Nov 28 08:37:17 np0005538513.localdomain podman[85991]: 2025-11-28 08:37:16.876366774 +0000 UTC m=+0.103354221 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi)
Nov 28 08:37:17 np0005538513.localdomain podman[85989]: 2025-11-28 08:37:17.031234803 +0000 UTC m=+0.262587676 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:37:17 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:37:17 np0005538513.localdomain podman[85991]: 2025-11-28 08:37:17.060768531 +0000 UTC m=+0.287755938 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, release=1761123044, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:37:17 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:37:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:37:17 np0005538513.localdomain podman[86061]: 2025-11-28 08:37:17.87408345 +0000 UTC m=+0.069992996 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:37:18 np0005538513.localdomain podman[86061]: 2025-11-28 08:37:18.238308272 +0000 UTC m=+0.434217818 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:37:18 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:37:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:37:19 np0005538513.localdomain systemd[1]: tmp-crun.RCJsnB.mount: Deactivated successfully.
Nov 28 08:37:19 np0005538513.localdomain podman[86085]: 2025-11-28 08:37:19.853409193 +0000 UTC m=+0.092811734 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true)
Nov 28 08:37:19 np0005538513.localdomain podman[86085]: 2025-11-28 08:37:19.905562862 +0000 UTC m=+0.144965393 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:37:19 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:37:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:37:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:37:24 np0005538513.localdomain podman[86111]: 2025-11-28 08:37:24.842778572 +0000 UTC m=+0.080004216 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:37:24 np0005538513.localdomain podman[86112]: 2025-11-28 08:37:24.896599183 +0000 UTC m=+0.131017480 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, name=rhosp17/openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:37:24 np0005538513.localdomain podman[86111]: 2025-11-28 08:37:24.905414127 +0000 UTC m=+0.142639751 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Nov 28 08:37:24 np0005538513.localdomain podman[86112]: 2025-11-28 08:37:24.91356343 +0000 UTC m=+0.147981707 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_controller)
Nov 28 08:37:24 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:37:24 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:37:40 np0005538513.localdomain sudo[86156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:37:40 np0005538513.localdomain sudo[86156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:37:40 np0005538513.localdomain sudo[86156]: pam_unix(sudo:session): session closed for user root
Nov 28 08:37:40 np0005538513.localdomain sudo[86171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:37:40 np0005538513.localdomain sudo[86171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:37:41 np0005538513.localdomain sudo[86171]: pam_unix(sudo:session): session closed for user root
Nov 28 08:37:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:37:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:37:41 np0005538513.localdomain podman[86219]: 2025-11-28 08:37:41.86929636 +0000 UTC m=+0.099567213 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public)
Nov 28 08:37:41 np0005538513.localdomain sudo[86242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:37:41 np0005538513.localdomain systemd[1]: tmp-crun.kxMEaU.mount: Deactivated successfully.
Nov 28 08:37:41 np0005538513.localdomain sudo[86242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:37:41 np0005538513.localdomain sudo[86242]: pam_unix(sudo:session): session closed for user root
Nov 28 08:37:41 np0005538513.localdomain podman[86218]: 2025-11-28 08:37:41.927194758 +0000 UTC m=+0.156934005 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4)
Nov 28 08:37:41 np0005538513.localdomain podman[86219]: 2025-11-28 08:37:41.932228925 +0000 UTC m=+0.162499758 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:37:41 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:37:41 np0005538513.localdomain podman[86218]: 2025-11-28 08:37:41.965421356 +0000 UTC m=+0.195160553 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:37:41 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:37:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:37:44 np0005538513.localdomain systemd[1]: tmp-crun.0edv5f.mount: Deactivated successfully.
Nov 28 08:37:44 np0005538513.localdomain podman[86273]: 2025-11-28 08:37:44.832062555 +0000 UTC m=+0.073303287 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, version=17.1.12, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1761123044, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:37:45 np0005538513.localdomain podman[86273]: 2025-11-28 08:37:45.025311177 +0000 UTC m=+0.266551879 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd)
Nov 28 08:37:45 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:37:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:37:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:37:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:37:47 np0005538513.localdomain systemd[1]: tmp-crun.aUMGuR.mount: Deactivated successfully.
Nov 28 08:37:47 np0005538513.localdomain podman[86303]: 2025-11-28 08:37:47.846563407 +0000 UTC m=+0.076564288 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container)
Nov 28 08:37:47 np0005538513.localdomain podman[86303]: 2025-11-28 08:37:47.858334623 +0000 UTC m=+0.088335534 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com)
Nov 28 08:37:47 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:37:47 np0005538513.localdomain podman[86302]: 2025-11-28 08:37:47.910983668 +0000 UTC m=+0.141875028 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:37:47 np0005538513.localdomain podman[86302]: 2025-11-28 08:37:47.964697986 +0000 UTC m=+0.195589326 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, 
tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:37:47 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:37:48 np0005538513.localdomain podman[86304]: 2025-11-28 08:37:47.96898769 +0000 UTC m=+0.193719368 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1)
Nov 28 08:37:48 np0005538513.localdomain podman[86304]: 2025-11-28 08:37:48.053411712 +0000 UTC m=+0.278143410 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, release=1761123044, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 28 08:37:48 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:37:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:37:48 np0005538513.localdomain systemd[1]: tmp-crun.YB4Skz.mount: Deactivated successfully.
Nov 28 08:37:48 np0005538513.localdomain podman[86376]: 2025-11-28 08:37:48.843642564 +0000 UTC m=+0.082854155 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:37:49 np0005538513.localdomain podman[86376]: 2025-11-28 08:37:49.21734095 +0000 UTC m=+0.456552491 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, 
batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, release=1761123044)
Nov 28 08:37:49 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:37:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:37:50 np0005538513.localdomain podman[86400]: 2025-11-28 08:37:50.840945525 +0000 UTC m=+0.078779617 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute)
Nov 28 08:37:50 np0005538513.localdomain podman[86400]: 2025-11-28 08:37:50.892849067 +0000 UTC m=+0.130683129 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044)
Nov 28 08:37:50 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:37:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:37:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:37:55 np0005538513.localdomain podman[86426]: 2025-11-28 08:37:55.859642273 +0000 UTC m=+0.087333723 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:37:55 np0005538513.localdomain podman[86427]: 2025-11-28 08:37:55.91392213 +0000 UTC m=+0.138658768 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:37:55 np0005538513.localdomain podman[86426]: 2025-11-28 08:37:55.931618989 +0000 UTC m=+0.159310469 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, tcib_managed=true, distribution-scope=public, release=1761123044, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:37:55 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:37:55 np0005538513.localdomain podman[86427]: 2025-11-28 08:37:55.990868369 +0000 UTC m=+0.215605037 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vcs-type=git, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:37:56 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:38:05 np0005538513.localdomain systemd[83313]: Created slice User Background Tasks Slice.
Nov 28 08:38:05 np0005538513.localdomain systemd[83313]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 08:38:05 np0005538513.localdomain systemd[83313]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 08:38:12 np0005538513.localdomain sshd[86519]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:38:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:38:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:38:12 np0005538513.localdomain systemd[1]: tmp-crun.bRlNdJ.mount: Deactivated successfully.
Nov 28 08:38:12 np0005538513.localdomain podman[86522]: 2025-11-28 08:38:12.847355097 +0000 UTC m=+0.086409104 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:38:12 np0005538513.localdomain podman[86522]: 2025-11-28 08:38:12.861247878 +0000 UTC m=+0.100301865 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=collectd, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd)
Nov 28 08:38:12 np0005538513.localdomain podman[86521]: 2025-11-28 08:38:12.88577417 +0000 UTC m=+0.124314011 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid)
Nov 28 08:38:12 np0005538513.localdomain podman[86521]: 2025-11-28 08:38:12.896248356 +0000 UTC m=+0.134788187 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:38:12 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:38:12 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:38:13 np0005538513.localdomain sshd[86519]: Invalid user solv from 193.32.162.146 port 42618
Nov 28 08:38:13 np0005538513.localdomain sshd[86519]: Connection closed by invalid user solv 193.32.162.146 port 42618 [preauth]
Nov 28 08:38:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:38:15 np0005538513.localdomain podman[86562]: 2025-11-28 08:38:15.83870339 +0000 UTC m=+0.077818687 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, release=1761123044)
Nov 28 08:38:16 np0005538513.localdomain podman[86562]: 2025-11-28 08:38:16.028420102 +0000 UTC m=+0.267535399 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:38:16 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:38:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:38:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:38:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:38:18 np0005538513.localdomain systemd[1]: tmp-crun.lNQiGh.mount: Deactivated successfully.
Nov 28 08:38:18 np0005538513.localdomain podman[86591]: 2025-11-28 08:38:18.855472823 +0000 UTC m=+0.095987362 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_compute)
Nov 28 08:38:18 np0005538513.localdomain podman[86591]: 2025-11-28 08:38:18.886393423 +0000 UTC m=+0.126907972 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git)
Nov 28 08:38:18 np0005538513.localdomain systemd[1]: tmp-crun.YBFHk4.mount: Deactivated successfully.
Nov 28 08:38:18 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:38:18 np0005538513.localdomain podman[86593]: 2025-11-28 08:38:18.915496158 +0000 UTC m=+0.149315889 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-type=git)
Nov 28 08:38:18 np0005538513.localdomain podman[86593]: 2025-11-28 08:38:18.953484447 +0000 UTC m=+0.187304218 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:38:18 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:38:19 np0005538513.localdomain podman[86592]: 2025-11-28 08:38:18.957457181 +0000 UTC m=+0.194720879 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, 
config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 28 08:38:19 np0005538513.localdomain podman[86592]: 2025-11-28 08:38:19.038485838 +0000 UTC m=+0.275749536 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:38:19 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:38:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:38:19 np0005538513.localdomain podman[86662]: 2025-11-28 08:38:19.847228454 +0000 UTC m=+0.085904038 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Nov 28 08:38:20 np0005538513.localdomain podman[86662]: 2025-11-28 08:38:20.236512295 +0000 UTC m=+0.475187829 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com)
Nov 28 08:38:20 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:38:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:38:21 np0005538513.localdomain podman[86684]: 2025-11-28 08:38:21.837805678 +0000 UTC m=+0.076065203 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Nov 28 08:38:21 np0005538513.localdomain podman[86684]: 2025-11-28 08:38:21.87133465 +0000 UTC m=+0.109594205 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:38:21 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:38:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:38:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:38:26 np0005538513.localdomain systemd[1]: tmp-crun.loXtch.mount: Deactivated successfully.
Nov 28 08:38:26 np0005538513.localdomain podman[86710]: 2025-11-28 08:38:26.852992507 +0000 UTC m=+0.090771290 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1)
Nov 28 08:38:26 np0005538513.localdomain podman[86711]: 2025-11-28 08:38:26.903990941 +0000 UTC m=+0.137346977 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:38:26 np0005538513.localdomain podman[86710]: 2025-11-28 08:38:26.921387871 +0000 UTC m=+0.159166674 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
managed_by=tripleo_ansible, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:38:26 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:38:26 np0005538513.localdomain podman[86711]: 2025-11-28 08:38:26.976937206 +0000 UTC m=+0.210293242 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, 
url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:38:26 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:38:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:38:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 358 writes, 1243 keys, 358 commit groups, 1.0 writes per commit group, ingest: 1.48 MB, 0.00 MB/s
                                                          Interval WAL: 358 writes, 149 syncs, 2.40 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:38:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:38:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.3 total, 600.0 interval
                                                          Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 586 writes, 2486 keys, 586 commit groups, 1.0 writes per commit group, ingest: 3.16 MB, 0.01 MB/s
                                                          Interval WAL: 586 writes, 195 syncs, 3.01 writes per sync, written: 0.00 GB, 0.01 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:38:42 np0005538513.localdomain sudo[86757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:38:42 np0005538513.localdomain sudo[86757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:38:42 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:38:42 np0005538513.localdomain sudo[86757]: pam_unix(sudo:session): session closed for user root
Nov 28 08:38:42 np0005538513.localdomain recover_tripleo_nova_virtqemud[86773]: 61397
Nov 28 08:38:42 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:38:42 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:38:42 np0005538513.localdomain sudo[86774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:38:42 np0005538513.localdomain sudo[86774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:38:42 np0005538513.localdomain sudo[86774]: pam_unix(sudo:session): session closed for user root
Nov 28 08:38:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:38:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:38:43 np0005538513.localdomain podman[86821]: 2025-11-28 08:38:43.844486798 +0000 UTC m=+0.078057924 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., container_name=iscsid, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:38:43 np0005538513.localdomain podman[86821]: 2025-11-28 08:38:43.88638937 +0000 UTC m=+0.119960456 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044)
Nov 28 08:38:43 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:38:43 np0005538513.localdomain podman[86822]: 2025-11-28 08:38:43.895739251 +0000 UTC m=+0.128826252 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:38:43 np0005538513.localdomain podman[86822]: 2025-11-28 08:38:43.980431151 +0000 UTC m=+0.213518152 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=collectd, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Nov 28 08:38:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:38:46 np0005538513.localdomain sudo[86859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:38:46 np0005538513.localdomain sudo[86859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:38:46 np0005538513.localdomain sudo[86859]: pam_unix(sudo:session): session closed for user root
Nov 28 08:38:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:38:46 np0005538513.localdomain systemd[1]: tmp-crun.XfXIrz.mount: Deactivated successfully.
Nov 28 08:38:46 np0005538513.localdomain podman[86874]: 2025-11-28 08:38:46.859864778 +0000 UTC m=+0.097903801 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=metrics_qdr, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z)
Nov 28 08:38:47 np0005538513.localdomain podman[86874]: 2025-11-28 08:38:47.056422013 +0000 UTC m=+0.294461036 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, vcs-type=git)
Nov 28 08:38:47 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:38:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:38:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:38:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:38:49 np0005538513.localdomain podman[86905]: 2025-11-28 08:38:49.867926231 +0000 UTC m=+0.101825354 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4)
Nov 28 08:38:49 np0005538513.localdomain systemd[1]: tmp-crun.KrrWjT.mount: Deactivated successfully.
Nov 28 08:38:49 np0005538513.localdomain podman[86903]: 2025-11-28 08:38:49.931612868 +0000 UTC m=+0.170776195 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:38:49 np0005538513.localdomain podman[86903]: 2025-11-28 08:38:49.961063293 +0000 UTC m=+0.200226630 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com)
Nov 28 08:38:49 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:38:49 np0005538513.localdomain podman[86904]: 2025-11-28 08:38:49.983735977 +0000 UTC m=+0.221060267 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:38:49 np0005538513.localdomain podman[86904]: 2025-11-28 08:38:49.991203819 +0000 UTC m=+0.228528109 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 28 08:38:50 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:38:50 np0005538513.localdomain podman[86905]: 2025-11-28 08:38:50.048152918 +0000 UTC m=+0.282052041 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Nov 28 08:38:50 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:38:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:38:50 np0005538513.localdomain podman[86975]: 2025-11-28 08:38:50.838083021 +0000 UTC m=+0.076557569 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:36:58Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:38:50 np0005538513.localdomain systemd[1]: tmp-crun.X9nNub.mount: Deactivated successfully.
Nov 28 08:38:51 np0005538513.localdomain podman[86975]: 2025-11-28 08:38:51.208390251 +0000 UTC m=+0.446864779 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:38:51 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:38:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:38:52 np0005538513.localdomain systemd[1]: tmp-crun.J4XyRa.mount: Deactivated successfully.
Nov 28 08:38:52 np0005538513.localdomain podman[86998]: 2025-11-28 08:38:52.849388866 +0000 UTC m=+0.091149362 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:38:52 np0005538513.localdomain podman[86998]: 2025-11-28 08:38:52.880339068 +0000 UTC m=+0.122099594 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Nov 28 08:38:52 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:38:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:38:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:38:57 np0005538513.localdomain systemd[1]: tmp-crun.u263jz.mount: Deactivated successfully.
Nov 28 08:38:57 np0005538513.localdomain podman[87025]: 2025-11-28 08:38:57.85188362 +0000 UTC m=+0.085035003 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:38:57 np0005538513.localdomain podman[87024]: 2025-11-28 08:38:57.904318767 +0000 UTC m=+0.140703110 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:38:57 np0005538513.localdomain podman[87025]: 2025-11-28 08:38:57.92662619 +0000 UTC m=+0.159777583 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:34:05Z)
Nov 28 08:38:57 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:38:57 np0005538513.localdomain podman[87024]: 2025-11-28 08:38:57.979486772 +0000 UTC m=+0.215871155 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 28 08:38:57 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:38:58 np0005538513.localdomain systemd[1]: tmp-crun.uocFgP.mount: Deactivated successfully.
Nov 28 08:39:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:39:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:39:14 np0005538513.localdomain podman[87116]: 2025-11-28 08:39:14.857865596 +0000 UTC m=+0.087603301 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:39:14 np0005538513.localdomain podman[87116]: 2025-11-28 08:39:14.898783507 +0000 UTC m=+0.128521272 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Nov 28 08:39:14 np0005538513.localdomain systemd[1]: tmp-crun.9gG2sI.mount: Deactivated successfully.
Nov 28 08:39:14 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:39:14 np0005538513.localdomain podman[87117]: 2025-11-28 08:39:14.921139932 +0000 UTC m=+0.148551445 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public)
Nov 28 08:39:14 np0005538513.localdomain podman[87117]: 2025-11-28 08:39:14.935415455 +0000 UTC m=+0.162826988 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:39:14 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:39:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:39:17 np0005538513.localdomain systemd[1]: tmp-crun.QcOmEs.mount: Deactivated successfully.
Nov 28 08:39:17 np0005538513.localdomain podman[87159]: 2025-11-28 08:39:17.856360401 +0000 UTC m=+0.094486005 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 28 08:39:18 np0005538513.localdomain podman[87159]: 2025-11-28 08:39:18.043592146 +0000 UTC m=+0.281717820 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=metrics_qdr, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z)
Nov 28 08:39:18 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:39:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:39:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:39:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:39:20 np0005538513.localdomain systemd[1]: tmp-crun.qZnROW.mount: Deactivated successfully.
Nov 28 08:39:20 np0005538513.localdomain podman[87186]: 2025-11-28 08:39:20.865099474 +0000 UTC m=+0.101104331 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:39:20 np0005538513.localdomain podman[87187]: 2025-11-28 08:39:20.958807105 +0000 UTC m=+0.189180117 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, container_name=logrotate_crond, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:39:20 np0005538513.localdomain podman[87187]: 2025-11-28 08:39:20.967970219 +0000 UTC m=+0.198343261 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.12, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z)
Nov 28 08:39:20 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:39:21 np0005538513.localdomain podman[87186]: 2025-11-28 08:39:21.024398822 +0000 UTC m=+0.260403669 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:39:21 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:39:21 np0005538513.localdomain podman[87188]: 2025-11-28 08:39:21.113465528 +0000 UTC m=+0.342456976 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:39:21 np0005538513.localdomain podman[87188]: 2025-11-28 08:39:21.144185212 +0000 UTC m=+0.373176620 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:39:21 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:39:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:39:21 np0005538513.localdomain podman[87259]: 2025-11-28 08:39:21.860120998 +0000 UTC m=+0.091496763 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Nov 28 08:39:22 np0005538513.localdomain podman[87259]: 2025-11-28 08:39:22.257485208 +0000 UTC m=+0.488860943 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:39:22 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:39:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:39:23 np0005538513.localdomain systemd[1]: tmp-crun.VGAhPV.mount: Deactivated successfully.
Nov 28 08:39:23 np0005538513.localdomain podman[87283]: 2025-11-28 08:39:23.853483065 +0000 UTC m=+0.090846162 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 08:39:23 np0005538513.localdomain podman[87283]: 2025-11-28 08:39:23.88291539 +0000 UTC m=+0.120278487 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step5, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:39:23 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:39:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:39:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:39:28 np0005538513.localdomain systemd[1]: tmp-crun.sPJSFc.mount: Deactivated successfully.
Nov 28 08:39:28 np0005538513.localdomain podman[87310]: 2025-11-28 08:39:28.85921652 +0000 UTC m=+0.094745024 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:39:28 np0005538513.localdomain podman[87310]: 2025-11-28 08:39:28.887345553 +0000 UTC m=+0.122874097 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:39:28 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:39:28 np0005538513.localdomain systemd[1]: tmp-crun.sKhyOl.mount: Deactivated successfully.
Nov 28 08:39:28 np0005538513.localdomain podman[87309]: 2025-11-28 08:39:28.95578773 +0000 UTC m=+0.194436011 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com)
Nov 28 08:39:28 np0005538513.localdomain podman[87309]: 2025-11-28 08:39:28.996936667 +0000 UTC m=+0.235584948 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12)
Nov 28 08:39:29 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:39:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:39:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:39:45 np0005538513.localdomain systemd[1]: tmp-crun.iySGs3.mount: Deactivated successfully.
Nov 28 08:39:45 np0005538513.localdomain podman[87355]: 2025-11-28 08:39:45.847342215 +0000 UTC m=+0.084088102 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4)
Nov 28 08:39:45 np0005538513.localdomain podman[87355]: 2025-11-28 08:39:45.856490409 +0000 UTC m=+0.093236326 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64)
Nov 28 08:39:45 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:39:45 np0005538513.localdomain systemd[1]: tmp-crun.xXlFKd.mount: Deactivated successfully.
Nov 28 08:39:45 np0005538513.localdomain podman[87356]: 2025-11-28 08:39:45.961010636 +0000 UTC m=+0.194495762 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1)
Nov 28 08:39:45 np0005538513.localdomain podman[87356]: 2025-11-28 08:39:45.971910564 +0000 UTC m=+0.205395680 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:39:45 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:39:46 np0005538513.localdomain sudo[87395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:39:46 np0005538513.localdomain sudo[87395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:46 np0005538513.localdomain sudo[87395]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:46 np0005538513.localdomain sudo[87410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:39:46 np0005538513.localdomain sudo[87410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:46 np0005538513.localdomain sudo[87410]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:46 np0005538513.localdomain sudo[87446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:39:46 np0005538513.localdomain sudo[87446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:46 np0005538513.localdomain sudo[87446]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:47 np0005538513.localdomain sudo[87461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:39:47 np0005538513.localdomain sudo[87461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:47 np0005538513.localdomain sudo[87461]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:48 np0005538513.localdomain sudo[87508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:39:48 np0005538513.localdomain sudo[87508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:39:48 np0005538513.localdomain sudo[87508]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:48 np0005538513.localdomain podman[87523]: 2025-11-28 08:39:48.381180449 +0000 UTC m=+0.076304041 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:39:48 np0005538513.localdomain podman[87523]: 2025-11-28 08:39:48.568490577 +0000 UTC m=+0.263614219 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true)
Nov 28 08:39:48 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:39:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:39:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:39:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:39:51 np0005538513.localdomain systemd[1]: tmp-crun.wpbXhR.mount: Deactivated successfully.
Nov 28 08:39:51 np0005538513.localdomain systemd[1]: tmp-crun.c4oX9U.mount: Deactivated successfully.
Nov 28 08:39:51 np0005538513.localdomain podman[87553]: 2025-11-28 08:39:51.918347825 +0000 UTC m=+0.154360425 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_compute, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 28 08:39:51 np0005538513.localdomain podman[87554]: 2025-11-28 08:39:51.873526792 +0000 UTC m=+0.107476108 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:39:51 np0005538513.localdomain podman[87554]: 2025-11-28 08:39:51.959538904 +0000 UTC m=+0.193488230 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, release=1761123044, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:39:51 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:39:51 np0005538513.localdomain podman[87553]: 2025-11-28 08:39:51.974456287 +0000 UTC m=+0.210468917 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:39:52 np0005538513.localdomain podman[87555]: 2025-11-28 08:39:52.019169816 +0000 UTC m=+0.246012171 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:39:52 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:39:52 np0005538513.localdomain podman[87555]: 2025-11-28 08:39:52.082093511 +0000 UTC m=+0.308935786 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044)
Nov 28 08:39:52 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:39:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:39:52 np0005538513.localdomain podman[87628]: 2025-11-28 08:39:52.85131612 +0000 UTC m=+0.088788338 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:39:53 np0005538513.localdomain podman[87628]: 2025-11-28 08:39:53.293485572 +0000 UTC m=+0.530957770 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target)
Nov 28 08:39:53 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:39:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:39:54 np0005538513.localdomain systemd[1]: tmp-crun.ksQEil.mount: Deactivated successfully.
Nov 28 08:39:54 np0005538513.localdomain podman[87653]: 2025-11-28 08:39:54.870608054 +0000 UTC m=+0.095785196 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Nov 28 08:39:54 np0005538513.localdomain podman[87653]: 2025-11-28 08:39:54.907344765 +0000 UTC m=+0.132521887 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:39:54 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:39:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:39:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:39:59 np0005538513.localdomain systemd[1]: tmp-crun.eUI2SD.mount: Deactivated successfully.
Nov 28 08:39:59 np0005538513.localdomain podman[87679]: 2025-11-28 08:39:59.867156364 +0000 UTC m=+0.097314833 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:39:59 np0005538513.localdomain podman[87679]: 2025-11-28 08:39:59.917547189 +0000 UTC m=+0.147705668 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:39:59 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:40:00 np0005538513.localdomain podman[87680]: 2025-11-28 08:39:59.999544996 +0000 UTC m=+0.232130851 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:34:05Z)
Nov 28 08:40:00 np0005538513.localdomain podman[87680]: 2025-11-28 08:40:00.031499798 +0000 UTC m=+0.264085673 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:40:00 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:40:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:40:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:40:16 np0005538513.localdomain systemd[1]: tmp-crun.7B4iGC.mount: Deactivated successfully.
Nov 28 08:40:16 np0005538513.localdomain podman[87773]: 2025-11-28 08:40:16.847376232 +0000 UTC m=+0.088367715 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:40:16 np0005538513.localdomain podman[87774]: 2025-11-28 08:40:16.904163845 +0000 UTC m=+0.139820893 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044)
Nov 28 08:40:16 np0005538513.localdomain podman[87773]: 2025-11-28 08:40:16.933359573 +0000 UTC m=+0.174351076 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:40:16 np0005538513.localdomain podman[87774]: 2025-11-28 08:40:16.942465815 +0000 UTC m=+0.178122833 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, container_name=collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd)
Nov 28 08:40:16 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:40:16 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:40:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:40:18 np0005538513.localdomain systemd[1]: tmp-crun.maPk1q.mount: Deactivated successfully.
Nov 28 08:40:18 np0005538513.localdomain podman[87814]: 2025-11-28 08:40:18.849838273 +0000 UTC m=+0.088969674 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z)
Nov 28 08:40:19 np0005538513.localdomain podman[87814]: 2025-11-28 08:40:19.082275602 +0000 UTC m=+0.321407063 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:40:19 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:40:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:40:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:40:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:40:22 np0005538513.localdomain systemd[1]: tmp-crun.aQdWDT.mount: Deactivated successfully.
Nov 28 08:40:22 np0005538513.localdomain podman[87843]: 2025-11-28 08:40:22.846791478 +0000 UTC m=+0.084317660 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, 
tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z)
Nov 28 08:40:22 np0005538513.localdomain podman[87845]: 2025-11-28 08:40:22.889580796 +0000 UTC m=+0.119316757 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z)
Nov 28 08:40:22 np0005538513.localdomain podman[87844]: 2025-11-28 08:40:22.943294154 +0000 UTC m=+0.179816085 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z)
Nov 28 08:40:22 np0005538513.localdomain podman[87844]: 2025-11-28 08:40:22.948461675 +0000 UTC m=+0.184983606 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 28 08:40:22 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:40:22 np0005538513.localdomain podman[87845]: 2025-11-28 08:40:22.969582161 +0000 UTC m=+0.199318182 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:12:45Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:40:22 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:40:23 np0005538513.localdomain podman[87843]: 2025-11-28 08:40:23.021606636 +0000 UTC m=+0.259132828 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, 
container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:40:23 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:40:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:40:23 np0005538513.localdomain systemd[1]: tmp-crun.IvXF6I.mount: Deactivated successfully.
Nov 28 08:40:23 np0005538513.localdomain podman[87913]: 2025-11-28 08:40:23.848936701 +0000 UTC m=+0.080484920 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1)
Nov 28 08:40:24 np0005538513.localdomain podman[87913]: 2025-11-28 08:40:24.219417808 +0000 UTC m=+0.450966057 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1761123044, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=)
Nov 28 08:40:24 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:40:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:40:25 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:40:25 np0005538513.localdomain recover_tripleo_nova_virtqemud[87940]: 61397
Nov 28 08:40:25 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:40:25 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:40:25 np0005538513.localdomain podman[87934]: 2025-11-28 08:40:25.837576903 +0000 UTC m=+0.078802218 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:40:25 np0005538513.localdomain podman[87934]: 2025-11-28 08:40:25.893872441 +0000 UTC m=+0.135097706 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044)
Nov 28 08:40:25 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:40:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:40:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:40:30 np0005538513.localdomain systemd[1]: tmp-crun.5vgE4e.mount: Deactivated successfully.
Nov 28 08:40:30 np0005538513.localdomain podman[87963]: 2025-11-28 08:40:30.858722678 +0000 UTC m=+0.091298957 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
distribution-scope=public, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:40:30 np0005538513.localdomain podman[87963]: 2025-11-28 08:40:30.904633794 +0000 UTC m=+0.137210073 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., 
distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:40:30 np0005538513.localdomain podman[87962]: 2025-11-28 08:40:30.903819639 +0000 UTC m=+0.138556555 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:40:30 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:40:30 np0005538513.localdomain podman[87962]: 2025-11-28 08:40:30.983846474 +0000 UTC m=+0.218583330 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Nov 28 08:40:30 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:40:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:40:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:40:47 np0005538513.localdomain podman[88012]: 2025-11-28 08:40:47.853946851 +0000 UTC m=+0.089708647 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container)
Nov 28 08:40:47 np0005538513.localdomain podman[88012]: 2025-11-28 08:40:47.865522221 +0000 UTC m=+0.101284027 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:40:47 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:40:47 np0005538513.localdomain podman[88013]: 2025-11-28 08:40:47.953279066 +0000 UTC m=+0.186588106 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:40:47 np0005538513.localdomain podman[88013]: 2025-11-28 08:40:47.962077429 +0000 UTC m=+0.195386429 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-collectd-container, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:40:47 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:40:48 np0005538513.localdomain sudo[88050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:40:48 np0005538513.localdomain sudo[88050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:40:48 np0005538513.localdomain sudo[88050]: pam_unix(sudo:session): session closed for user root
Nov 28 08:40:48 np0005538513.localdomain sudo[88065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:40:48 np0005538513.localdomain sudo[88065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:40:49 np0005538513.localdomain sudo[88065]: pam_unix(sudo:session): session closed for user root
Nov 28 08:40:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:40:49 np0005538513.localdomain systemd[1]: tmp-crun.hRtnyV.mount: Deactivated successfully.
Nov 28 08:40:49 np0005538513.localdomain podman[88112]: 2025-11-28 08:40:49.848854568 +0000 UTC m=+0.086259331 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=)
Nov 28 08:40:49 np0005538513.localdomain sudo[88137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:40:49 np0005538513.localdomain sudo[88137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:40:49 np0005538513.localdomain sudo[88137]: pam_unix(sudo:session): session closed for user root
Nov 28 08:40:50 np0005538513.localdomain podman[88112]: 2025-11-28 08:40:50.038494386 +0000 UTC m=+0.275899199 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=metrics_qdr)
Nov 28 08:40:50 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:40:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:40:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:40:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:40:53 np0005538513.localdomain systemd[1]: tmp-crun.NQZLjS.mount: Deactivated successfully.
Nov 28 08:40:53 np0005538513.localdomain podman[88157]: 2025-11-28 08:40:53.855570546 +0000 UTC m=+0.092137843 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:40:53 np0005538513.localdomain podman[88157]: 2025-11-28 08:40:53.889591932 +0000 UTC m=+0.126159179 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:40:53 np0005538513.localdomain podman[88159]: 2025-11-28 08:40:53.902302376 +0000 UTC m=+0.136451248 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:40:53 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:40:53 np0005538513.localdomain podman[88158]: 2025-11-28 08:40:53.957525242 +0000 UTC m=+0.192533081 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, vcs-type=git, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container)
Nov 28 08:40:53 np0005538513.localdomain podman[88158]: 2025-11-28 08:40:53.965889251 +0000 UTC m=+0.200897140 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com)
Nov 28 08:40:53 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:40:54 np0005538513.localdomain podman[88159]: 2025-11-28 08:40:54.010842748 +0000 UTC m=+0.244991670 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Nov 28 08:40:54 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:40:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:40:54 np0005538513.localdomain systemd[1]: tmp-crun.LGOMcN.mount: Deactivated successfully.
Nov 28 08:40:54 np0005538513.localdomain podman[88228]: 2025-11-28 08:40:54.857933827 +0000 UTC m=+0.088608933 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red 
Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:40:55 np0005538513.localdomain podman[88228]: 2025-11-28 08:40:55.222423407 +0000 UTC m=+0.453098503 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:40:55 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:40:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:40:56 np0005538513.localdomain systemd[1]: tmp-crun.2lXeVo.mount: Deactivated successfully.
Nov 28 08:40:56 np0005538513.localdomain podman[88252]: 2025-11-28 08:40:56.853548605 +0000 UTC m=+0.088322005 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4)
Nov 28 08:40:56 np0005538513.localdomain podman[88252]: 2025-11-28 08:40:56.884460995 +0000 UTC m=+0.119234385 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:40:56 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:41:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:41:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:41:01 np0005538513.localdomain podman[88278]: 2025-11-28 08:41:01.831102525 +0000 UTC m=+0.068783478 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_metadata_agent, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Nov 28 08:41:01 np0005538513.localdomain systemd[1]: tmp-crun.NLC4mH.mount: Deactivated successfully.
Nov 28 08:41:01 np0005538513.localdomain podman[88278]: 2025-11-28 08:41:01.892609375 +0000 UTC m=+0.130290348 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, 
vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:41:01 np0005538513.localdomain podman[88279]: 2025-11-28 08:41:01.891924803 +0000 UTC m=+0.125723895 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=ovn_controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:41:01 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:41:01 np0005538513.localdomain podman[88279]: 2025-11-28 08:41:01.976230542 +0000 UTC m=+0.210029564 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public)
Nov 28 08:41:01 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:41:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:41:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:41:18 np0005538513.localdomain systemd[1]: tmp-crun.kwt0KY.mount: Deactivated successfully.
Nov 28 08:41:18 np0005538513.localdomain podman[88350]: 2025-11-28 08:41:18.85793997 +0000 UTC m=+0.092360339 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:41:18 np0005538513.localdomain podman[88350]: 2025-11-28 08:41:18.869350105 +0000 UTC m=+0.103770504 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:41:18 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:41:18 np0005538513.localdomain podman[88349]: 2025-11-28 08:41:18.955379827 +0000 UTC m=+0.191263442 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:41:18 np0005538513.localdomain podman[88349]: 2025-11-28 08:41:18.964736347 +0000 UTC m=+0.200619992 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=iscsid, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:41:18 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:41:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:41:20 np0005538513.localdomain podman[88387]: 2025-11-28 08:41:20.844233759 +0000 UTC m=+0.081960577 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, 
distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:41:21 np0005538513.localdomain podman[88387]: 2025-11-28 08:41:21.034339543 +0000 UTC m=+0.272066371 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:41:21 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:41:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:41:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:41:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:41:24 np0005538513.localdomain systemd[1]: tmp-crun.Z6Piwy.mount: Deactivated successfully.
Nov 28 08:41:24 np0005538513.localdomain podman[88418]: 2025-11-28 08:41:24.850801511 +0000 UTC m=+0.090245173 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 
17.1 cron, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:41:24 np0005538513.localdomain podman[88418]: 2025-11-28 08:41:24.890567136 +0000 UTC m=+0.130010788 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, batch=17.1_20251118.1, container_name=logrotate_crond, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:41:24 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:41:24 np0005538513.localdomain podman[88417]: 2025-11-28 08:41:24.943411267 +0000 UTC m=+0.183158700 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Nov 28 08:41:24 np0005538513.localdomain podman[88419]: 2025-11-28 08:41:24.894105646 +0000 UTC m=+0.127992706 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible)
Nov 28 08:41:24 np0005538513.localdomain podman[88417]: 2025-11-28 08:41:24.997618271 +0000 UTC m=+0.237365693 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:41:25 np0005538513.localdomain podman[88419]: 2025-11-28 08:41:25.02945063 +0000 UTC m=+0.263337740 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, version=17.1.12)
Nov 28 08:41:25 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:41:25 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:41:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:41:25 np0005538513.localdomain podman[88488]: 2025-11-28 08:41:25.843691418 +0000 UTC m=+0.079892492 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, architecture=x86_64, 
url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Nov 28 08:41:26 np0005538513.localdomain podman[88488]: 2025-11-28 08:41:26.214382661 +0000 UTC m=+0.450583775 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:41:26 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:41:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:41:27 np0005538513.localdomain systemd[1]: tmp-crun.fXYHIa.mount: Deactivated successfully.
Nov 28 08:41:27 np0005538513.localdomain podman[88511]: 2025-11-28 08:41:27.844274722 +0000 UTC m=+0.082666379 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 28 08:41:27 np0005538513.localdomain podman[88511]: 2025-11-28 08:41:27.876365058 +0000 UTC m=+0.114756715 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:41:27 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:41:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:41:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:41:32 np0005538513.localdomain podman[88538]: 2025-11-28 08:41:32.830840472 +0000 UTC m=+0.073205435 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.buildah.version=1.41.4)
Nov 28 08:41:32 np0005538513.localdomain podman[88538]: 2025-11-28 08:41:32.877454769 +0000 UTC m=+0.119819682 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, batch=17.1_20251118.1, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:41:32 np0005538513.localdomain systemd[1]: tmp-crun.uBtEVH.mount: Deactivated successfully.
Nov 28 08:41:32 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:41:32 np0005538513.localdomain podman[88539]: 2025-11-28 08:41:32.901595479 +0000 UTC m=+0.139682369 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vendor=Red Hat, Inc.)
Nov 28 08:41:32 np0005538513.localdomain podman[88539]: 2025-11-28 08:41:32.95348362 +0000 UTC m=+0.191570510 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, 
tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:41:32 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:41:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:41:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:41:49 np0005538513.localdomain systemd[1]: tmp-crun.qZuRae.mount: Deactivated successfully.
Nov 28 08:41:49 np0005538513.localdomain systemd[1]: tmp-crun.aNkFPe.mount: Deactivated successfully.
Nov 28 08:41:49 np0005538513.localdomain podman[88585]: 2025-11-28 08:41:49.892942138 +0000 UTC m=+0.129248455 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, container_name=iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:41:49 np0005538513.localdomain podman[88586]: 2025-11-28 08:41:49.866543799 +0000 UTC m=+0.098679897 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, container_name=collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:41:49 np0005538513.localdomain podman[88585]: 2025-11-28 08:41:49.926200281 +0000 UTC m=+0.162506618 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git)
Nov 28 08:41:49 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:41:49 np0005538513.localdomain podman[88586]: 2025-11-28 08:41:49.951451606 +0000 UTC m=+0.183587674 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:41:49 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:41:50 np0005538513.localdomain sudo[88624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:41:50 np0005538513.localdomain sudo[88624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:50 np0005538513.localdomain sudo[88624]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:50 np0005538513.localdomain sudo[88639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:41:50 np0005538513.localdomain sudo[88639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:41:51 np0005538513.localdomain podman[88725]: 2025-11-28 08:41:51.08531269 +0000 UTC m=+0.101883745 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.buildah.version=1.33.12, name=rhceph, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph)
Nov 28 08:41:51 np0005538513.localdomain systemd[1]: tmp-crun.OpDrWk.mount: Deactivated successfully.
Nov 28 08:41:51 np0005538513.localdomain podman[88725]: 2025-11-28 08:41:51.190350932 +0000 UTC m=+0.206921967 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 08:41:51 np0005538513.localdomain podman[88744]: 2025-11-28 08:41:51.193374826 +0000 UTC m=+0.100019286 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 28 08:41:51 np0005538513.localdomain podman[88744]: 2025-11-28 08:41:51.356455091 +0000 UTC m=+0.263099491 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true)
Nov 28 08:41:51 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:41:51 np0005538513.localdomain sudo[88639]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:51 np0005538513.localdomain sudo[88822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:41:51 np0005538513.localdomain sudo[88822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:51 np0005538513.localdomain sudo[88822]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:51 np0005538513.localdomain sudo[88837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:41:51 np0005538513.localdomain sudo[88837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:52 np0005538513.localdomain sudo[88837]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:52 np0005538513.localdomain sudo[88884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:41:52 np0005538513.localdomain sudo[88884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:52 np0005538513.localdomain sudo[88884]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:41:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:41:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:41:55 np0005538513.localdomain systemd[1]: tmp-crun.HxaACk.mount: Deactivated successfully.
Nov 28 08:41:55 np0005538513.localdomain podman[88901]: 2025-11-28 08:41:55.85392472 +0000 UTC m=+0.087334473 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:41:55 np0005538513.localdomain podman[88899]: 2025-11-28 08:41:55.900564429 +0000 UTC m=+0.136996796 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 28 08:41:55 np0005538513.localdomain podman[88901]: 2025-11-28 08:41:55.90834969 +0000 UTC m=+0.141759423 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:41:55 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:41:55 np0005538513.localdomain podman[88899]: 2025-11-28 08:41:55.975384742 +0000 UTC m=+0.211817049 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 28 08:41:55 np0005538513.localdomain podman[88900]: 2025-11-28 08:41:55.973127223 +0000 UTC m=+0.209344613 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:41:55 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:41:56 np0005538513.localdomain podman[88900]: 2025-11-28 08:41:56.060528707 +0000 UTC m=+0.296746137 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, container_name=logrotate_crond)
Nov 28 08:41:56 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:41:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:41:56 np0005538513.localdomain podman[88973]: 2025-11-28 08:41:56.84146896 +0000 UTC m=+0.081872404 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
container_name=nova_migration_target, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 28 08:41:57 np0005538513.localdomain podman[88973]: 2025-11-28 08:41:57.211378818 +0000 UTC m=+0.451782312 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:36:58Z, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:41:57 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:41:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:41:58 np0005538513.localdomain podman[88997]: 2025-11-28 08:41:58.839292506 +0000 UTC m=+0.078691155 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 28 08:41:58 np0005538513.localdomain podman[88997]: 2025-11-28 08:41:58.899110914 +0000 UTC m=+0.138509533 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Nov 28 08:41:58 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:42:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:42:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:42:03 np0005538513.localdomain systemd[1]: tmp-crun.26yWzQ.mount: Deactivated successfully.
Nov 28 08:42:03 np0005538513.localdomain podman[89024]: 2025-11-28 08:42:03.842995567 +0000 UTC m=+0.080504902 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, release=1761123044, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 28 08:42:03 np0005538513.localdomain podman[89024]: 2025-11-28 08:42:03.889057067 +0000 UTC m=+0.126566442 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:14:25Z, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:42:03 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:42:03 np0005538513.localdomain podman[89025]: 2025-11-28 08:42:03.904031042 +0000 UTC m=+0.138851083 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, 
version=17.1.12, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:42:03 np0005538513.localdomain podman[89025]: 2025-11-28 08:42:03.983461349 +0000 UTC m=+0.218281370 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 08:42:03 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:42:05 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:42:05 np0005538513.localdomain recover_tripleo_nova_virtqemud[89075]: 61397
Nov 28 08:42:05 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:42:05 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:42:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:42:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:42:20 np0005538513.localdomain systemd[1]: tmp-crun.zY3GfN.mount: Deactivated successfully.
Nov 28 08:42:20 np0005538513.localdomain podman[89077]: 2025-11-28 08:42:20.852069943 +0000 UTC m=+0.086085734 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, container_name=collectd, com.redhat.component=openstack-collectd-container, release=1761123044, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 08:42:20 np0005538513.localdomain podman[89077]: 2025-11-28 08:42:20.863364203 +0000 UTC m=+0.097379964 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 28 08:42:20 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:42:20 np0005538513.localdomain podman[89076]: 2025-11-28 08:42:20.95886524 +0000 UTC m=+0.191562301 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, url=https://www.redhat.com, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vcs-type=git, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:42:20 np0005538513.localdomain podman[89076]: 2025-11-28 08:42:20.998511811 +0000 UTC m=+0.231208852 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:42:21 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:42:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:42:21 np0005538513.localdomain systemd[1]: tmp-crun.xATJQM.mount: Deactivated successfully.
Nov 28 08:42:21 np0005538513.localdomain podman[89117]: 2025-11-28 08:42:21.858079787 +0000 UTC m=+0.091795102 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=metrics_qdr, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:42:22 np0005538513.localdomain podman[89117]: 2025-11-28 08:42:22.047243492 +0000 UTC m=+0.280958837 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:42:22 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:42:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:42:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:42:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:42:26 np0005538513.localdomain systemd[1]: tmp-crun.mdgcNu.mount: Deactivated successfully.
Nov 28 08:42:26 np0005538513.localdomain podman[89148]: 2025-11-28 08:42:26.860231019 +0000 UTC m=+0.100476761 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 28 08:42:26 np0005538513.localdomain podman[89148]: 2025-11-28 08:42:26.890302744 +0000 UTC m=+0.130548456 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64)
Nov 28 08:42:26 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:42:26 np0005538513.localdomain podman[89150]: 2025-11-28 08:42:26.910976345 +0000 UTC m=+0.141869866 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true)
Nov 28 08:42:26 np0005538513.localdomain podman[89149]: 2025-11-28 08:42:26.959981958 +0000 UTC m=+0.194598295 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-cron, 
io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:42:26 np0005538513.localdomain podman[89150]: 2025-11-28 08:42:26.969685729 +0000 UTC m=+0.200579230 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:42:26 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:42:26 np0005538513.localdomain podman[89149]: 2025-11-28 08:42:26.99322111 +0000 UTC m=+0.227837417 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 28 08:42:27 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:42:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:42:27 np0005538513.localdomain podman[89220]: 2025-11-28 08:42:27.845284753 +0000 UTC m=+0.082191094 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:42:28 np0005538513.localdomain podman[89220]: 2025-11-28 08:42:28.208382979 +0000 UTC m=+0.445289330 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:42:28 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:42:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:42:29 np0005538513.localdomain podman[89243]: 2025-11-28 08:42:29.873336469 +0000 UTC m=+0.110596526 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:42:29 np0005538513.localdomain podman[89243]: 2025-11-28 08:42:29.898370247 +0000 UTC m=+0.135630264 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_compute)
Nov 28 08:42:29 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:42:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:42:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:42:34 np0005538513.localdomain podman[89292]: 2025-11-28 08:42:34.841886078 +0000 UTC m=+0.078485718 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:42:34 np0005538513.localdomain podman[89292]: 2025-11-28 08:42:34.888393363 +0000 UTC m=+0.124992953 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z)
Nov 28 08:42:34 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:42:34 np0005538513.localdomain systemd[1]: tmp-crun.YejIWT.mount: Deactivated successfully.
Nov 28 08:42:34 np0005538513.localdomain podman[89293]: 2025-11-28 08:42:34.972170125 +0000 UTC m=+0.204657637 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Nov 28 08:42:35 np0005538513.localdomain podman[89293]: 2025-11-28 08:42:35.024406107 +0000 UTC m=+0.256893649 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z)
Nov 28 08:42:35 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:42:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:42:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:42:51 np0005538513.localdomain podman[89340]: 2025-11-28 08:42:51.855153697 +0000 UTC m=+0.083927846 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:42:51 np0005538513.localdomain podman[89340]: 2025-11-28 08:42:51.861355341 +0000 UTC m=+0.090129420 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, version=17.1.12, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:42:51 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:42:51 np0005538513.localdomain podman[89341]: 2025-11-28 08:42:51.904983135 +0000 UTC m=+0.128463121 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:42:51 np0005538513.localdomain podman[89341]: 2025-11-28 08:42:51.944590065 +0000 UTC m=+0.168070021 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Nov 28 08:42:51 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:42:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:42:52 np0005538513.localdomain podman[89379]: 2025-11-28 08:42:52.882757492 +0000 UTC m=+0.082031229 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd)
Nov 28 08:42:53 np0005538513.localdomain podman[89379]: 2025-11-28 08:42:53.078316205 +0000 UTC m=+0.277589942 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=metrics_qdr, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, version=17.1.12)
Nov 28 08:42:53 np0005538513.localdomain sudo[89408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:42:53 np0005538513.localdomain sudo[89408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:42:53 np0005538513.localdomain sudo[89408]: pam_unix(sudo:session): session closed for user root
Nov 28 08:42:53 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:42:53 np0005538513.localdomain sudo[89423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:42:53 np0005538513.localdomain sudo[89423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:42:53 np0005538513.localdomain sudo[89423]: pam_unix(sudo:session): session closed for user root
Nov 28 08:42:54 np0005538513.localdomain sudo[89471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:42:54 np0005538513.localdomain sudo[89471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:42:54 np0005538513.localdomain sudo[89471]: pam_unix(sudo:session): session closed for user root
Nov 28 08:42:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:42:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:42:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:42:57 np0005538513.localdomain podman[89486]: 2025-11-28 08:42:57.888317141 +0000 UTC m=+0.125761067 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:42:57 np0005538513.localdomain systemd[1]: tmp-crun.nv47X8.mount: Deactivated successfully.
Nov 28 08:42:57 np0005538513.localdomain podman[89486]: 2025-11-28 08:42:57.949818291 +0000 UTC m=+0.187262217 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:42:57 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:42:57 np0005538513.localdomain podman[89488]: 2025-11-28 08:42:57.993112185 +0000 UTC m=+0.225112222 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Nov 28 08:42:58 np0005538513.localdomain podman[89487]: 2025-11-28 08:42:57.954823107 +0000 UTC m=+0.191234140 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:42:58 np0005538513.localdomain podman[89488]: 2025-11-28 08:42:58.017758031 +0000 UTC m=+0.249758068 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, 
tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:42:58 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:42:58 np0005538513.localdomain podman[89487]: 2025-11-28 08:42:58.035344398 +0000 UTC m=+0.271755461 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 28 08:42:58 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:42:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:42:58 np0005538513.localdomain podman[89559]: 2025-11-28 08:42:58.843985991 +0000 UTC m=+0.078324883 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:42:59 np0005538513.localdomain podman[89559]: 2025-11-28 08:42:59.255966246 +0000 UTC m=+0.490305168 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:42:59 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:43:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:43:00 np0005538513.localdomain podman[89583]: 2025-11-28 08:43:00.803706846 +0000 UTC m=+0.049143338 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step5, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:43:00 np0005538513.localdomain systemd[1]: tmp-crun.3KOeZk.mount: Deactivated successfully.
Nov 28 08:43:00 np0005538513.localdomain podman[89583]: 2025-11-28 08:43:00.824434329 +0000 UTC m=+0.069870831 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, container_name=nova_compute)
Nov 28 08:43:00 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:43:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:43:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:43:05 np0005538513.localdomain podman[89611]: 2025-11-28 08:43:05.84197138 +0000 UTC m=+0.082003228 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:43:05 np0005538513.localdomain podman[89612]: 2025-11-28 08:43:05.901443707 +0000 UTC m=+0.139121001 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z)
Nov 28 08:43:05 np0005538513.localdomain podman[89611]: 2025-11-28 08:43:05.917509196 +0000 UTC m=+0.157541054 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:43:05 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:43:05 np0005538513.localdomain podman[89612]: 2025-11-28 08:43:05.944506505 +0000 UTC m=+0.182183749 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:43:05 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:43:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:43:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:43:22 np0005538513.localdomain podman[89657]: 2025-11-28 08:43:22.865325698 +0000 UTC m=+0.100224094 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, release=1761123044, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:43:22 np0005538513.localdomain podman[89657]: 2025-11-28 08:43:22.87280066 +0000 UTC m=+0.107699006 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, container_name=iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:43:22 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:43:22 np0005538513.localdomain systemd[1]: tmp-crun.6XC80B.mount: Deactivated successfully.
Nov 28 08:43:22 np0005538513.localdomain podman[89658]: 2025-11-28 08:43:22.994127718 +0000 UTC m=+0.226231807 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044)
Nov 28 08:43:23 np0005538513.localdomain podman[89658]: 2025-11-28 08:43:23.009537426 +0000 UTC m=+0.241641485 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, release=1761123044, batch=17.1_20251118.1)
Nov 28 08:43:23 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:43:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:43:23 np0005538513.localdomain podman[89696]: 2025-11-28 08:43:23.849236025 +0000 UTC m=+0.088257381 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, 
vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:43:24 np0005538513.localdomain podman[89696]: 2025-11-28 08:43:24.040414313 +0000 UTC m=+0.279435629 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd)
Nov 28 08:43:24 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:43:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:43:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:43:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:43:28 np0005538513.localdomain systemd[1]: tmp-crun.EMP0N2.mount: Deactivated successfully.
Nov 28 08:43:28 np0005538513.localdomain podman[89727]: 2025-11-28 08:43:28.831199164 +0000 UTC m=+0.070431609 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 28 08:43:28 np0005538513.localdomain podman[89727]: 2025-11-28 08:43:28.841411601 +0000 UTC m=+0.080644106 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack 
TripleO Team, architecture=x86_64, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, release=1761123044, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:43:28 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:43:28 np0005538513.localdomain podman[89728]: 2025-11-28 08:43:28.917828075 +0000 UTC m=+0.147919846 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:43:28 np0005538513.localdomain podman[89728]: 2025-11-28 08:43:28.944554494 +0000 UTC m=+0.174646235 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, release=1761123044, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:43:28 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:43:29 np0005538513.localdomain podman[89726]: 2025-11-28 08:43:28.897208904 +0000 UTC m=+0.136997046 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4)
Nov 28 08:43:29 np0005538513.localdomain podman[89726]: 2025-11-28 08:43:29.030824774 +0000 UTC m=+0.270612846 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, tcib_managed=true, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:43:29 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:43:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:43:29 np0005538513.localdomain systemd[1]: tmp-crun.XxYrnc.mount: Deactivated successfully.
Nov 28 08:43:29 np0005538513.localdomain systemd[1]: tmp-crun.WjxMSG.mount: Deactivated successfully.
Nov 28 08:43:29 np0005538513.localdomain podman[89796]: 2025-11-28 08:43:29.857922931 +0000 UTC m=+0.091413761 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Nov 28 08:43:30 np0005538513.localdomain podman[89796]: 2025-11-28 08:43:30.276968646 +0000 UTC m=+0.510459446 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, tcib_managed=true)
Nov 28 08:43:30 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:43:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:43:31 np0005538513.localdomain systemd[1]: tmp-crun.SmxqFL.mount: Deactivated successfully.
Nov 28 08:43:31 np0005538513.localdomain podman[89820]: 2025-11-28 08:43:31.850871686 +0000 UTC m=+0.085900139 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step5, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12)
Nov 28 08:43:31 np0005538513.localdomain podman[89820]: 2025-11-28 08:43:31.908523066 +0000 UTC m=+0.143551529 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, container_name=nova_compute, distribution-scope=public)
Nov 28 08:43:31 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:43:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:43:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:43:36 np0005538513.localdomain systemd[1]: tmp-crun.92iz7M.mount: Deactivated successfully.
Nov 28 08:43:36 np0005538513.localdomain podman[89848]: 2025-11-28 08:43:36.849919185 +0000 UTC m=+0.086611091 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:43:36 np0005538513.localdomain podman[89848]: 2025-11-28 08:43:36.910342262 +0000 UTC m=+0.147034158 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, vcs-type=git)
Nov 28 08:43:36 np0005538513.localdomain podman[89849]: 2025-11-28 08:43:36.910558458 +0000 UTC m=+0.143208648 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team)
Nov 28 08:43:36 np0005538513.localdomain podman[89849]: 2025-11-28 08:43:36.932769698 +0000 UTC m=+0.165419888 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., container_name=ovn_controller, 
release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:43:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:43:36 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:43:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:43:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:43:53 np0005538513.localdomain podman[89897]: 2025-11-28 08:43:53.850797417 +0000 UTC m=+0.087928703 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:43:53 np0005538513.localdomain podman[89897]: 2025-11-28 08:43:53.891629594 +0000 UTC m=+0.128760850 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Nov 28 08:43:53 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:43:53 np0005538513.localdomain podman[89896]: 2025-11-28 08:43:53.941163403 +0000 UTC m=+0.181618512 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, 
maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, version=17.1.12)
Nov 28 08:43:53 np0005538513.localdomain podman[89896]: 2025-11-28 08:43:53.949132371 +0000 UTC m=+0.189587510 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:43:53 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:43:54 np0005538513.localdomain sudo[89934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:43:54 np0005538513.localdomain sudo[89934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:43:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:43:54 np0005538513.localdomain sudo[89934]: pam_unix(sudo:session): session closed for user root
Nov 28 08:43:54 np0005538513.localdomain sudo[89955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:43:54 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:43:54 np0005538513.localdomain podman[89948]: 2025-11-28 08:43:54.817608503 +0000 UTC m=+0.093341349 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12)
Nov 28 08:43:54 np0005538513.localdomain sudo[89955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:43:54 np0005538513.localdomain recover_tripleo_nova_virtqemud[89983]: 61397
Nov 28 08:43:54 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:43:54 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:43:55 np0005538513.localdomain podman[89948]: 2025-11-28 08:43:55.006146588 +0000 UTC m=+0.281879384 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044)
Nov 28 08:43:55 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:43:55 np0005538513.localdomain sudo[89955]: pam_unix(sudo:session): session closed for user root
Nov 28 08:43:56 np0005538513.localdomain sudo[90028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:43:56 np0005538513.localdomain sudo[90028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:43:56 np0005538513.localdomain sudo[90028]: pam_unix(sudo:session): session closed for user root
Nov 28 08:43:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:43:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:43:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:43:59 np0005538513.localdomain podman[90045]: 2025-11-28 08:43:59.864994522 +0000 UTC m=+0.091928636 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:43:59 np0005538513.localdomain podman[90045]: 2025-11-28 08:43:59.887960995 +0000 UTC m=+0.114895119 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true)
Nov 28 08:43:59 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:43:59 np0005538513.localdomain podman[90044]: 2025-11-28 08:43:59.902158286 +0000 UTC m=+0.132998632 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
build-date=2025-11-18T22:49:32Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:43:59 np0005538513.localdomain systemd[1]: tmp-crun.DuLpQj.mount: Deactivated successfully.
Nov 28 08:43:59 np0005538513.localdomain podman[90043]: 2025-11-28 08:43:59.963271614 +0000 UTC m=+0.195115991 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=ceilometer_agent_compute, version=17.1.12)
Nov 28 08:43:59 np0005538513.localdomain podman[90044]: 2025-11-28 08:43:59.98534604 +0000 UTC m=+0.216186426 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git)
Nov 28 08:43:59 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:44:00 np0005538513.localdomain podman[90043]: 2025-11-28 08:44:00.041356669 +0000 UTC m=+0.273201036 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4)
Nov 28 08:44:00 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:44:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:44:00 np0005538513.localdomain podman[90116]: 2025-11-28 08:44:00.844219454 +0000 UTC m=+0.075328251 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:44:01 np0005538513.localdomain podman[90116]: 2025-11-28 08:44:01.22017797 +0000 UTC m=+0.451286767 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Nov 28 08:44:01 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:44:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:44:02 np0005538513.localdomain systemd[1]: tmp-crun.2J5WrK.mount: Deactivated successfully.
Nov 28 08:44:02 np0005538513.localdomain podman[90137]: 2025-11-28 08:44:02.85391273 +0000 UTC m=+0.082705880 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-type=git)
Nov 28 08:44:02 np0005538513.localdomain podman[90137]: 2025-11-28 08:44:02.878434801 +0000 UTC m=+0.107227941 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 28 08:44:02 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:44:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:44:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:44:07 np0005538513.localdomain podman[90165]: 2025-11-28 08:44:07.839825358 +0000 UTC m=+0.078651623 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:44:07 np0005538513.localdomain podman[90165]: 2025-11-28 08:44:07.891637528 +0000 UTC m=+0.130463773 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller)
Nov 28 08:44:07 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:44:07 np0005538513.localdomain podman[90164]: 2025-11-28 08:44:07.902195636 +0000 UTC m=+0.142861328 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:44:07 np0005538513.localdomain podman[90164]: 2025-11-28 08:44:07.948106261 +0000 UTC m=+0.188771913 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:44:07 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:44:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:44:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:44:24 np0005538513.localdomain podman[90211]: 2025-11-28 08:44:24.854901901 +0000 UTC m=+0.089393438 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z)
Nov 28 08:44:24 np0005538513.localdomain podman[90211]: 2025-11-28 08:44:24.865691806 +0000 UTC m=+0.100183313 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12)
Nov 28 08:44:24 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:44:24 np0005538513.localdomain podman[90210]: 2025-11-28 08:44:24.951135909 +0000 UTC m=+0.186040298 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:44:24 np0005538513.localdomain podman[90210]: 2025-11-28 08:44:24.959379025 +0000 UTC m=+0.194283414 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T23:44:13Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:44:24 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:44:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:44:25 np0005538513.localdomain podman[90249]: 2025-11-28 08:44:25.833744451 +0000 UTC m=+0.075098384 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:44:26 np0005538513.localdomain podman[90249]: 2025-11-28 08:44:26.004804284 +0000 UTC m=+0.246158197 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 28 08:44:26 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:44:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:44:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:44:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:44:30 np0005538513.localdomain podman[90279]: 2025-11-28 08:44:30.835451061 +0000 UTC m=+0.072383879 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-cron-container, release=1761123044, name=rhosp17/openstack-cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=)
Nov 28 08:44:30 np0005538513.localdomain podman[90279]: 2025-11-28 08:44:30.845247495 +0000 UTC m=+0.082180263 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:44:30 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:44:30 np0005538513.localdomain podman[90278]: 2025-11-28 08:44:30.88564094 +0000 UTC m=+0.130078832 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true)
Nov 28 08:44:30 np0005538513.localdomain podman[90278]: 2025-11-28 08:44:30.908264642 +0000 UTC m=+0.152702504 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git)
Nov 28 08:44:30 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:44:30 np0005538513.localdomain podman[90285]: 2025-11-28 08:44:30.998916667 +0000 UTC m=+0.232954445 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:44:31 np0005538513.localdomain podman[90285]: 2025-11-28 08:44:31.029358473 +0000 UTC m=+0.263396261 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true)
Nov 28 08:44:31 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:44:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:44:31 np0005538513.localdomain systemd[1]: tmp-crun.9yCsDL.mount: Deactivated successfully.
Nov 28 08:44:31 np0005538513.localdomain podman[90352]: 2025-11-28 08:44:31.84743399 +0000 UTC m=+0.085477225 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 28 08:44:32 np0005538513.localdomain podman[90352]: 2025-11-28 08:44:32.228406652 +0000 UTC m=+0.466449887 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:44:32 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:44:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:44:33 np0005538513.localdomain systemd[1]: tmp-crun.XNcEjy.mount: Deactivated successfully.
Nov 28 08:44:33 np0005538513.localdomain podman[90375]: 2025-11-28 08:44:33.848480076 +0000 UTC m=+0.086272639 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12)
Nov 28 08:44:33 np0005538513.localdomain podman[90375]: 2025-11-28 08:44:33.880420319 +0000 UTC m=+0.118212862 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 08:44:33 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:44:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:44:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:44:38 np0005538513.localdomain podman[90402]: 2025-11-28 08:44:38.841396004 +0000 UTC m=+0.077865930 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com)
Nov 28 08:44:38 np0005538513.localdomain systemd[1]: tmp-crun.freZ7Q.mount: Deactivated successfully.
Nov 28 08:44:38 np0005538513.localdomain podman[90403]: 2025-11-28 08:44:38.889535038 +0000 UTC m=+0.122844326 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller)
Nov 28 08:44:38 np0005538513.localdomain podman[90402]: 2025-11-28 08:44:38.902501171 +0000 UTC m=+0.138971007 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, 
distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:14:25Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:44:38 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:44:38 np0005538513.localdomain podman[90403]: 2025-11-28 08:44:38.953128574 +0000 UTC m=+0.186437942 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:44:38 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:44:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:44:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:44:55 np0005538513.localdomain podman[90451]: 2025-11-28 08:44:55.84311033 +0000 UTC m=+0.077216600 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3)
Nov 28 08:44:55 np0005538513.localdomain podman[90452]: 2025-11-28 08:44:55.895256609 +0000 UTC m=+0.128072568 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, container_name=collectd, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:44:55 np0005538513.localdomain podman[90451]: 2025-11-28 08:44:55.929255205 +0000 UTC m=+0.163361515 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_id=tripleo_step3, container_name=iscsid, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:44:55 np0005538513.localdomain podman[90452]: 2025-11-28 08:44:55.930448532 +0000 UTC m=+0.163264521 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Nov 28 08:44:55 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:44:55 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:44:56 np0005538513.localdomain sudo[90491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:44:56 np0005538513.localdomain sudo[90491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:44:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:44:56 np0005538513.localdomain sudo[90491]: pam_unix(sudo:session): session closed for user root
Nov 28 08:44:56 np0005538513.localdomain sudo[90512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:44:56 np0005538513.localdomain sudo[90512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:44:56 np0005538513.localdomain podman[90506]: 2025-11-28 08:44:56.407369134 +0000 UTC m=+0.084443194 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public)
Nov 28 08:44:56 np0005538513.localdomain podman[90506]: 2025-11-28 08:44:56.623482845 +0000 UTC m=+0.300556905 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, container_name=metrics_qdr, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.)
Nov 28 08:44:56 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:44:57 np0005538513.localdomain sudo[90512]: pam_unix(sudo:session): session closed for user root
Nov 28 08:44:57 np0005538513.localdomain sudo[90581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:44:57 np0005538513.localdomain sudo[90581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:44:57 np0005538513.localdomain sudo[90581]: pam_unix(sudo:session): session closed for user root
Nov 28 08:45:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:45:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:45:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:45:01 np0005538513.localdomain systemd[1]: tmp-crun.geOwC9.mount: Deactivated successfully.
Nov 28 08:45:01 np0005538513.localdomain podman[90597]: 2025-11-28 08:45:01.860885266 +0000 UTC m=+0.089261814 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com)
Nov 28 08:45:01 np0005538513.localdomain podman[90597]: 2025-11-28 08:45:01.870736281 +0000 UTC m=+0.099112799 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 28 08:45:01 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:45:01 np0005538513.localdomain podman[90596]: 2025-11-28 08:45:01.958226069 +0000 UTC m=+0.187834184 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:45:02 np0005538513.localdomain podman[90596]: 2025-11-28 08:45:02.012737912 +0000 UTC m=+0.242346057 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:45:02 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:45:02 np0005538513.localdomain podman[90598]: 2025-11-28 08:45:02.028488261 +0000 UTC m=+0.253880056 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64)
Nov 28 08:45:02 np0005538513.localdomain podman[90598]: 2025-11-28 08:45:02.057852653 +0000 UTC m=+0.283244428 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:45:02 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:45:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:45:02 np0005538513.localdomain podman[90667]: 2025-11-28 08:45:02.850223331 +0000 UTC m=+0.085454045 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target)
Nov 28 08:45:03 np0005538513.localdomain podman[90667]: 2025-11-28 08:45:03.211081378 +0000 UTC m=+0.446312102 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:45:03 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:45:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:45:04 np0005538513.localdomain podman[90689]: 2025-11-28 08:45:04.847836562 +0000 UTC m=+0.084377172 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:45:04 np0005538513.localdomain podman[90689]: 2025-11-28 08:45:04.880324651 +0000 UTC m=+0.116865241 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute)
Nov 28 08:45:04 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:45:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:45:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:45:09 np0005538513.localdomain systemd[1]: tmp-crun.Q0bjyz.mount: Deactivated successfully.
Nov 28 08:45:09 np0005538513.localdomain podman[90715]: 2025-11-28 08:45:09.866354204 +0000 UTC m=+0.096701445 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:45:09 np0005538513.localdomain podman[90715]: 2025-11-28 08:45:09.909200204 +0000 UTC m=+0.139547475 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, release=1761123044, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Nov 28 08:45:09 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:45:09 np0005538513.localdomain podman[90716]: 2025-11-28 08:45:09.916256183 +0000 UTC m=+0.143377104 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 08:45:09 np0005538513.localdomain podman[90716]: 2025-11-28 08:45:09.999641992 +0000 UTC m=+0.226762873 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:45:10 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:45:19 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:45:19 np0005538513.localdomain recover_tripleo_nova_virtqemud[90766]: 61397
Nov 28 08:45:19 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:45:19 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:45:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:45:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:45:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:45:26 np0005538513.localdomain systemd[1]: tmp-crun.cv1f5f.mount: Deactivated successfully.
Nov 28 08:45:26 np0005538513.localdomain podman[90768]: 2025-11-28 08:45:26.921187008 +0000 UTC m=+0.145674113 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Nov 28 08:45:26 np0005538513.localdomain podman[90768]: 2025-11-28 08:45:26.963940494 +0000 UTC m=+0.188427539 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Nov 28 08:45:26 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:45:26 np0005538513.localdomain podman[90767]: 2025-11-28 08:45:26.978465688 +0000 UTC m=+0.197535493 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:45:26 np0005538513.localdomain podman[90769]: 2025-11-28 08:45:26.882107327 +0000 UTC m=+0.103717002 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:45:26 np0005538513.localdomain podman[90767]: 2025-11-28 08:45:26.990311928 +0000 UTC m=+0.209381763 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red 
Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, vcs-type=git)
Nov 28 08:45:27 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:45:27 np0005538513.localdomain podman[90769]: 2025-11-28 08:45:27.079319739 +0000 UTC m=+0.300929344 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, vcs-type=git, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:45:27 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:45:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:45:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:45:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:45:32 np0005538513.localdomain systemd[1]: tmp-crun.XbEmfv.mount: Deactivated successfully.
Nov 28 08:45:32 np0005538513.localdomain podman[90837]: 2025-11-28 08:45:32.866552034 +0000 UTC m=+0.096062722 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044)
Nov 28 08:45:32 np0005538513.localdomain podman[90837]: 2025-11-28 08:45:32.895393045 +0000 UTC m=+0.124903713 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., 
io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:45:32 np0005538513.localdomain systemd[1]: tmp-crun.iJ2a4J.mount: Deactivated successfully.
Nov 28 08:45:32 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:45:32 np0005538513.localdomain podman[90836]: 2025-11-28 08:45:32.915527824 +0000 UTC m=+0.147246782 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-cron, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12)
Nov 28 08:45:32 np0005538513.localdomain podman[90836]: 2025-11-28 08:45:32.923471172 +0000 UTC m=+0.155190150 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:45:32 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:45:33 np0005538513.localdomain podman[90835]: 2025-11-28 08:45:33.027889255 +0000 UTC m=+0.262413510 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:45:33 np0005538513.localdomain podman[90835]: 2025-11-28 08:45:33.06738455 +0000 UTC m=+0.301908855 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:45:33 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:45:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:45:33 np0005538513.localdomain podman[90907]: 2025-11-28 08:45:33.845301946 +0000 UTC m=+0.079630578 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 08:45:34 np0005538513.localdomain podman[90907]: 2025-11-28 08:45:34.454524053 +0000 UTC m=+0.688852675 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, build-date=2025-11-19T00:36:58Z)
Nov 28 08:45:34 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:45:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:45:35 np0005538513.localdomain podman[90930]: 2025-11-28 08:45:35.844809705 +0000 UTC m=+0.080902789 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:45:35 np0005538513.localdomain podman[90930]: 2025-11-28 08:45:35.877319212 +0000 UTC m=+0.113412236 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:45:35 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:45:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:45:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:45:40 np0005538513.localdomain podman[90956]: 2025-11-28 08:45:40.820543943 +0000 UTC m=+0.063121323 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1)
Nov 28 08:45:40 np0005538513.localdomain systemd[1]: tmp-crun.khAJw5.mount: Deactivated successfully.
Nov 28 08:45:40 np0005538513.localdomain podman[90957]: 2025-11-28 08:45:40.874743907 +0000 UTC m=+0.114830289 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
architecture=x86_64, release=1761123044, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:45:40 np0005538513.localdomain podman[90956]: 2025-11-28 08:45:40.889519519 +0000 UTC m=+0.132096899 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Nov 28 08:45:40 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:45:40 np0005538513.localdomain podman[90957]: 2025-11-28 08:45:40.903154785 +0000 UTC m=+0.143241187 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 28 08:45:40 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 08:45:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:45:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:45:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:45:57 np0005538513.localdomain podman[91005]: 2025-11-28 08:45:57.852073112 +0000 UTC m=+0.081290181 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:45:57 np0005538513.localdomain sudo[91029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:45:57 np0005538513.localdomain sudo[91029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:45:57 np0005538513.localdomain sudo[91029]: pam_unix(sudo:session): session closed for user root
Nov 28 08:45:57 np0005538513.localdomain podman[91003]: 2025-11-28 08:45:57.913220353 +0000 UTC m=+0.147333115 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 28 08:45:57 np0005538513.localdomain sudo[91069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:45:57 np0005538513.localdomain sudo[91069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:45:57 np0005538513.localdomain podman[91003]: 2025-11-28 08:45:57.965521737 +0000 UTC m=+0.199634499 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc.)
Nov 28 08:45:57 np0005538513.localdomain systemd[1]: tmp-crun.vgGeWk.mount: Deactivated successfully.
Nov 28 08:45:57 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:45:58 np0005538513.localdomain podman[91004]: 2025-11-28 08:45:57.974789147 +0000 UTC m=+0.208744184 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container)
Nov 28 08:45:58 np0005538513.localdomain podman[91004]: 2025-11-28 08:45:58.059066571 +0000 UTC m=+0.293021558 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git)
Nov 28 08:45:58 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:45:58 np0005538513.localdomain podman[91005]: 2025-11-28 08:45:58.11185755 +0000 UTC m=+0.341074619 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, build-date=2025-11-18T22:49:46Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Nov 28 08:45:58 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:45:58 np0005538513.localdomain sudo[91069]: pam_unix(sudo:session): session closed for user root
Nov 28 08:45:59 np0005538513.localdomain sudo[91132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:45:59 np0005538513.localdomain sudo[91132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:45:59 np0005538513.localdomain sudo[91132]: pam_unix(sudo:session): session closed for user root
Nov 28 08:46:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:46:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:46:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:46:03 np0005538513.localdomain podman[91148]: 2025-11-28 08:46:03.852751166 +0000 UTC m=+0.081654352 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:46:03 np0005538513.localdomain podman[91148]: 2025-11-28 08:46:03.890760873 +0000 UTC m=+0.119664059 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1761123044, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:46:03 np0005538513.localdomain podman[91149]: 2025-11-28 08:46:03.906390642 +0000 UTC m=+0.131797969 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:46:03 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:46:03 np0005538513.localdomain systemd[1]: tmp-crun.HOeSMz.mount: Deactivated successfully.
Nov 28 08:46:03 np0005538513.localdomain podman[91149]: 2025-11-28 08:46:03.967377538 +0000 UTC m=+0.192784825 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team)
Nov 28 08:46:03 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:46:04 np0005538513.localdomain podman[91147]: 2025-11-28 08:46:03.970172655 +0000 UTC m=+0.200631970 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, 
batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:46:04 np0005538513.localdomain podman[91147]: 2025-11-28 08:46:04.053326443 +0000 UTC m=+0.283785798 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:46:04 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:46:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:46:04 np0005538513.localdomain podman[91220]: 2025-11-28 08:46:04.85008134 +0000 UTC m=+0.082681775 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, architecture=x86_64, container_name=nova_migration_target, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-type=git)
Nov 28 08:46:05 np0005538513.localdomain podman[91220]: 2025-11-28 08:46:05.225266343 +0000 UTC m=+0.457866768 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, release=1761123044, url=https://www.redhat.com)
Nov 28 08:46:05 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:46:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:46:06 np0005538513.localdomain podman[91244]: 2025-11-28 08:46:06.870223933 +0000 UTC m=+0.080222858 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_compute, config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:46:06 np0005538513.localdomain podman[91244]: 2025-11-28 08:46:06.901382637 +0000 UTC m=+0.111381572 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:46:06 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:46:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:46:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:46:11 np0005538513.localdomain systemd[1]: tmp-crun.FYSJLZ.mount: Deactivated successfully.
Nov 28 08:46:11 np0005538513.localdomain podman[91272]: 2025-11-28 08:46:11.882237884 +0000 UTC m=+0.114679395 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:46:11 np0005538513.localdomain podman[91271]: 2025-11-28 08:46:11.937891783 +0000 UTC m=+0.172836052 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:46:11 np0005538513.localdomain podman[91272]: 2025-11-28 08:46:11.938478531 +0000 UTC m=+0.170920052 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true)
Nov 28 08:46:11 np0005538513.localdomain podman[91272]: unhealthy
Nov 28 08:46:12 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:46:12 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:46:12 np0005538513.localdomain podman[91271]: 2025-11-28 08:46:12.013358141 +0000 UTC m=+0.248302480 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git)
Nov 28 08:46:12 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:46:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:46:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:46:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:46:28 np0005538513.localdomain podman[91321]: 2025-11-28 08:46:28.83631154 +0000 UTC m=+0.070565596 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Nov 28 08:46:28 np0005538513.localdomain podman[91322]: 2025-11-28 08:46:28.90032977 +0000 UTC m=+0.130453047 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:46:28 np0005538513.localdomain podman[91322]: 2025-11-28 08:46:28.93838128 +0000 UTC m=+0.168504577 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, tcib_managed=true)
Nov 28 08:46:28 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:46:28 np0005538513.localdomain podman[91323]: 2025-11-28 08:46:28.953217863 +0000 UTC m=+0.180121499 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:46:28 np0005538513.localdomain podman[91321]: 2025-11-28 08:46:28.972059182 +0000 UTC m=+0.206313298 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:46:28 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:46:29 np0005538513.localdomain podman[91323]: 2025-11-28 08:46:29.154434981 +0000 UTC m=+0.381338647 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:46:29 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:46:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:46:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:46:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:46:34 np0005538513.localdomain podman[91389]: 2025-11-28 08:46:34.848614737 +0000 UTC m=+0.088806125 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, container_name=ceilometer_agent_compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:46:34 np0005538513.localdomain podman[91391]: 2025-11-28 08:46:34.895254815 +0000 UTC m=+0.130412607 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4)
Nov 28 08:46:34 np0005538513.localdomain podman[91389]: 2025-11-28 08:46:34.906546057 +0000 UTC m=+0.146737425 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:46:34 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:46:34 np0005538513.localdomain systemd[1]: tmp-crun.wM0B8c.mount: Deactivated successfully.
Nov 28 08:46:34 np0005538513.localdomain podman[91390]: 2025-11-28 08:46:34.951881044 +0000 UTC m=+0.187349156 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, container_name=logrotate_crond, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:46:34 np0005538513.localdomain podman[91391]: 2025-11-28 08:46:34.956356694 +0000 UTC m=+0.191514476 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:46:34 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:46:34 np0005538513.localdomain podman[91390]: 2025-11-28 08:46:34.980426365 +0000 UTC m=+0.215894477 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, container_name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z)
Nov 28 08:46:34 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:46:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:46:35 np0005538513.localdomain podman[91462]: 2025-11-28 08:46:35.840048686 +0000 UTC m=+0.072183287 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, container_name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:46:36 np0005538513.localdomain podman[91462]: 2025-11-28 08:46:36.201409527 +0000 UTC m=+0.433544148 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:46:36 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:46:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:46:37 np0005538513.localdomain podman[91483]: 2025-11-28 08:46:37.84867927 +0000 UTC m=+0.082019574 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Nov 28 08:46:37 np0005538513.localdomain podman[91483]: 2025-11-28 08:46:37.87587648 +0000 UTC m=+0.109216784 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Nov 28 08:46:37 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:46:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:46:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:46:42 np0005538513.localdomain systemd[1]: tmp-crun.NmEX8q.mount: Deactivated successfully.
Nov 28 08:46:42 np0005538513.localdomain podman[91509]: 2025-11-28 08:46:42.857675997 +0000 UTC m=+0.090029054 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Nov 28 08:46:42 np0005538513.localdomain podman[91510]: 2025-11-28 08:46:42.905265754 +0000 UTC m=+0.133998998 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1)
Nov 28 08:46:42 np0005538513.localdomain podman[91509]: 2025-11-28 08:46:42.911631073 +0000 UTC m=+0.143984130 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:46:42 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 08:46:42 np0005538513.localdomain podman[91510]: 2025-11-28 08:46:42.929890654 +0000 UTC m=+0.158623888 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:46:42 np0005538513.localdomain podman[91510]: unhealthy
Nov 28 08:46:42 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:46:42 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:46:59 np0005538513.localdomain sudo[91560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:46:59 np0005538513.localdomain sudo[91560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:46:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:46:59 np0005538513.localdomain sudo[91560]: pam_unix(sudo:session): session closed for user root
Nov 28 08:46:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:46:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:46:59 np0005538513.localdomain sudo[91593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:46:59 np0005538513.localdomain sudo[91593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:46:59 np0005538513.localdomain systemd[1]: tmp-crun.FOuSQa.mount: Deactivated successfully.
Nov 28 08:46:59 np0005538513.localdomain podman[91576]: 2025-11-28 08:46:59.647170722 +0000 UTC m=+0.102563106 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, container_name=collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, version=17.1.12)
Nov 28 08:46:59 np0005538513.localdomain podman[91576]: 2025-11-28 08:46:59.66150799 +0000 UTC m=+0.116900384 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:46:59 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:46:59 np0005538513.localdomain systemd[1]: tmp-crun.VPukj1.mount: Deactivated successfully.
Nov 28 08:46:59 np0005538513.localdomain podman[91577]: 2025-11-28 08:46:59.753463863 +0000 UTC m=+0.206288586 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible)
Nov 28 08:46:59 np0005538513.localdomain podman[91574]: 2025-11-28 08:46:59.785869446 +0000 UTC m=+0.244730908 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Nov 28 08:46:59 np0005538513.localdomain podman[91574]: 2025-11-28 08:46:59.800592216 +0000 UTC m=+0.259453628 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git)
Nov 28 08:46:59 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:46:59 np0005538513.localdomain podman[91577]: 2025-11-28 08:46:59.993302828 +0000 UTC m=+0.446127541 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:47:00 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:47:00 np0005538513.localdomain sudo[91593]: pam_unix(sudo:session): session closed for user root
Nov 28 08:47:04 np0005538513.localdomain sudo[91685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:47:04 np0005538513.localdomain sudo[91685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:47:04 np0005538513.localdomain sudo[91685]: pam_unix(sudo:session): session closed for user root
Nov 28 08:47:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:47:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:47:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:47:05 np0005538513.localdomain podman[91702]: 2025-11-28 08:47:05.873257419 +0000 UTC m=+0.091119419 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:47:05 np0005538513.localdomain systemd[1]: tmp-crun.0zIsXx.mount: Deactivated successfully.
Nov 28 08:47:05 np0005538513.localdomain podman[91701]: 2025-11-28 08:47:05.912851976 +0000 UTC m=+0.131882472 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:47:05 np0005538513.localdomain podman[91703]: 2025-11-28 08:47:05.972691735 +0000 UTC m=+0.188164301 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:47:05 np0005538513.localdomain podman[91701]: 2025-11-28 08:47:05.973531952 +0000 UTC m=+0.192562538 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:47:05 np0005538513.localdomain podman[91702]: 2025-11-28 08:47:05.989205022 +0000 UTC m=+0.207067022 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, release=1761123044, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:47:05 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:47:06 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:47:06 np0005538513.localdomain podman[91703]: 2025-11-28 08:47:06.053070127 +0000 UTC m=+0.268542632 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044)
Nov 28 08:47:06 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:47:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:47:06 np0005538513.localdomain podman[91773]: 2025-11-28 08:47:06.851099354 +0000 UTC m=+0.085710979 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, release=1761123044, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:47:07 np0005538513.localdomain podman[91773]: 2025-11-28 08:47:07.219463295 +0000 UTC m=+0.454074880 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4)
Nov 28 08:47:07 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:47:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:47:08 np0005538513.localdomain podman[91797]: 2025-11-28 08:47:08.839226758 +0000 UTC m=+0.078100272 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12)
Nov 28 08:47:08 np0005538513.localdomain podman[91797]: 2025-11-28 08:47:08.870393741 +0000 UTC m=+0.109267255 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:47:08 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:47:13 np0005538513.localdomain recover_tripleo_nova_virtqemud[91836]: 61397
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: tmp-crun.JxlQYc.mount: Deactivated successfully.
Nov 28 08:47:13 np0005538513.localdomain podman[91823]: 2025-11-28 08:47:13.854263622 +0000 UTC m=+0.095041540 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1)
Nov 28 08:47:13 np0005538513.localdomain podman[91823]: 2025-11-28 08:47:13.87145951 +0000 UTC m=+0.112237478 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Nov 28 08:47:13 np0005538513.localdomain podman[91823]: unhealthy
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:47:13 np0005538513.localdomain podman[91824]: 2025-11-28 08:47:13.957206869 +0000 UTC m=+0.195433038 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Nov 28 08:47:13 np0005538513.localdomain podman[91824]: 2025-11-28 08:47:13.970934928 +0000 UTC m=+0.209161087 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:47:13 np0005538513.localdomain podman[91824]: unhealthy
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:13 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:47:14 np0005538513.localdomain systemd[1]: tmp-crun.uwgKKx.mount: Deactivated successfully.
Nov 28 08:47:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:47:29 np0005538513.localdomain podman[91865]: 2025-11-28 08:47:29.850568401 +0000 UTC m=+0.085150452 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:47:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:47:29 np0005538513.localdomain podman[91865]: 2025-11-28 08:47:29.862346698 +0000 UTC m=+0.096928749 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, architecture=x86_64, 
batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 08:47:29 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:47:29 np0005538513.localdomain systemd[1]: tmp-crun.euHbTd.mount: Deactivated successfully.
Nov 28 08:47:29 np0005538513.localdomain podman[91885]: 2025-11-28 08:47:29.954196599 +0000 UTC m=+0.083847211 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:47:29 np0005538513.localdomain podman[91885]: 2025-11-28 08:47:29.963452888 +0000 UTC m=+0.093103520 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1)
Nov 28 08:47:29 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:47:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:47:30 np0005538513.localdomain systemd[1]: tmp-crun.yEUeO7.mount: Deactivated successfully.
Nov 28 08:47:30 np0005538513.localdomain podman[91904]: 2025-11-28 08:47:30.869550331 +0000 UTC m=+0.102834514 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:47:31 np0005538513.localdomain podman[91904]: 2025-11-28 08:47:31.065557566 +0000 UTC m=+0.298841759 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 28 08:47:31 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:47:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:47:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:47:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:47:36 np0005538513.localdomain podman[91936]: 2025-11-28 08:47:36.858879042 +0000 UTC m=+0.088045003 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, version=17.1.12, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:47:36 np0005538513.localdomain podman[91936]: 2025-11-28 08:47:36.893436971 +0000 UTC m=+0.122602902 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:47:36 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:47:36 np0005538513.localdomain podman[91935]: 2025-11-28 08:47:36.916319196 +0000 UTC m=+0.145982963 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-type=git)
Nov 28 08:47:36 np0005538513.localdomain podman[91935]: 2025-11-28 08:47:36.949456312 +0000 UTC m=+0.179120129 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, 
com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Nov 28 08:47:36 np0005538513.localdomain podman[91934]: 2025-11-28 08:47:36.964391948 +0000 UTC m=+0.197441870 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Nov 28 08:47:36 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:47:37 np0005538513.localdomain podman[91934]: 2025-11-28 08:47:37.0185152 +0000 UTC m=+0.251565082 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4)
Nov 28 08:47:37 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:47:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:47:37 np0005538513.localdomain podman[92006]: 2025-11-28 08:47:37.856132263 +0000 UTC m=+0.091003275 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Nov 28 08:47:38 np0005538513.localdomain podman[92006]: 2025-11-28 08:47:38.265533585 +0000 UTC m=+0.500404577 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:47:38 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:47:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:47:39 np0005538513.localdomain podman[92029]: 2025-11-28 08:47:39.845581037 +0000 UTC m=+0.079696732 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:47:39 np0005538513.localdomain podman[92029]: 2025-11-28 08:47:39.8815233 +0000 UTC m=+0.115638985 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, config_id=tripleo_step5)
Nov 28 08:47:39 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:47:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:47:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:47:44 np0005538513.localdomain podman[92055]: 2025-11-28 08:47:44.852670365 +0000 UTC m=+0.084565514 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:47:44 np0005538513.localdomain podman[92055]: 2025-11-28 08:47:44.868540721 +0000 UTC m=+0.100435830 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 08:47:44 np0005538513.localdomain podman[92055]: unhealthy
Nov 28 08:47:44 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:44 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:47:44 np0005538513.localdomain systemd[1]: tmp-crun.ALqc1q.mount: Deactivated successfully.
Nov 28 08:47:44 np0005538513.localdomain podman[92056]: 2025-11-28 08:47:44.969133434 +0000 UTC m=+0.197901665 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044)
Nov 28 08:47:45 np0005538513.localdomain podman[92056]: 2025-11-28 08:47:45.041594858 +0000 UTC m=+0.270363109 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:47:45 np0005538513.localdomain podman[92056]: unhealthy
Nov 28 08:47:45 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:45 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:48:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:48:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:48:00 np0005538513.localdomain podman[92097]: 2025-11-28 08:48:00.848640894 +0000 UTC m=+0.081275131 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.expose-services=, vcs-type=git)
Nov 28 08:48:00 np0005538513.localdomain podman[92097]: 2025-11-28 08:48:00.857423798 +0000 UTC m=+0.090057995 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, 
name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 28 08:48:00 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:48:00 np0005538513.localdomain podman[92096]: 2025-11-28 08:48:00.899797043 +0000 UTC m=+0.134748573 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, 
io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z)
Nov 28 08:48:00 np0005538513.localdomain podman[92096]: 2025-11-28 08:48:00.934866448 +0000 UTC m=+0.169817938 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, com.redhat.component=openstack-iscsid-container, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:48:00 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:48:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:48:01 np0005538513.localdomain podman[92135]: 2025-11-28 08:48:01.835902783 +0000 UTC m=+0.076742979 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, 
version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 28 08:48:02 np0005538513.localdomain podman[92135]: 2025-11-28 08:48:02.050950802 +0000 UTC m=+0.291791008 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, release=1761123044, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:48:02 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:48:04 np0005538513.localdomain sudo[92165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:48:04 np0005538513.localdomain sudo[92165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:04 np0005538513.localdomain sudo[92165]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:04 np0005538513.localdomain sudo[92180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:48:04 np0005538513.localdomain sudo[92180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:05 np0005538513.localdomain sudo[92180]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:05 np0005538513.localdomain sudo[92226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:48:05 np0005538513.localdomain sudo[92226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:05 np0005538513.localdomain sudo[92226]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:05 np0005538513.localdomain sudo[92241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 08:48:05 np0005538513.localdomain sudo[92241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:06 np0005538513.localdomain sudo[92241]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:48:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:48:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:48:07 np0005538513.localdomain podman[92277]: 2025-11-28 08:48:07.861454174 +0000 UTC m=+0.087656761 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:48:07 np0005538513.localdomain podman[92275]: 2025-11-28 08:48:07.912976863 +0000 UTC m=+0.146608741 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 08:48:07 np0005538513.localdomain podman[92277]: 2025-11-28 08:48:07.919129216 +0000 UTC m=+0.145331823 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Nov 28 08:48:07 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:48:07 np0005538513.localdomain podman[92275]: 2025-11-28 08:48:07.948484533 +0000 UTC m=+0.182116401 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:48:07 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:48:08 np0005538513.localdomain podman[92276]: 2025-11-28 08:48:08.014037291 +0000 UTC m=+0.242238330 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container)
Nov 28 08:48:08 np0005538513.localdomain podman[92276]: 2025-11-28 08:48:08.023876789 +0000 UTC m=+0.252077748 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1761123044, vcs-type=git, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-cron)
Nov 28 08:48:08 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:48:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:48:08 np0005538513.localdomain podman[92347]: 2025-11-28 08:48:08.832540587 +0000 UTC m=+0.073224459 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:48:09 np0005538513.localdomain podman[92347]: 2025-11-28 08:48:09.175469533 +0000 UTC m=+0.416153405 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc.)
Nov 28 08:48:09 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:48:09 np0005538513.localdomain sudo[92370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:48:09 np0005538513.localdomain sudo[92370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:09 np0005538513.localdomain sudo[92370]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:48:10 np0005538513.localdomain podman[92385]: 2025-11-28 08:48:10.854410115 +0000 UTC m=+0.086525555 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:48:10 np0005538513.localdomain podman[92385]: 2025-11-28 08:48:10.890506133 +0000 UTC m=+0.122621623 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:48:10 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:48:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:48:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:48:15 np0005538513.localdomain podman[92413]: 2025-11-28 08:48:15.856809807 +0000 UTC m=+0.084405799 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Nov 28 08:48:15 np0005538513.localdomain podman[92413]: 2025-11-28 08:48:15.87355124 +0000 UTC m=+0.101147222 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:48:15 np0005538513.localdomain podman[92413]: unhealthy
Nov 28 08:48:15 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:15 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:48:15 np0005538513.localdomain systemd[1]: tmp-crun.vtF9Jz.mount: Deactivated successfully.
Nov 28 08:48:15 np0005538513.localdomain podman[92412]: 2025-11-28 08:48:15.975711692 +0000 UTC m=+0.205195223 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:14:25Z)
Nov 28 08:48:15 np0005538513.localdomain podman[92412]: 2025-11-28 08:48:15.994504769 +0000 UTC m=+0.223988260 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Nov 28 08:48:16 np0005538513.localdomain podman[92412]: unhealthy
Nov 28 08:48:16 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:16 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:48:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:48:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:48:31 np0005538513.localdomain podman[92453]: 2025-11-28 08:48:31.844903699 +0000 UTC m=+0.084309865 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, release=1761123044, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:48:31 np0005538513.localdomain podman[92453]: 2025-11-28 08:48:31.88241155 +0000 UTC m=+0.121817686 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, tcib_managed=true, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:48:31 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:48:31 np0005538513.localdomain podman[92454]: 2025-11-28 08:48:31.902068285 +0000 UTC m=+0.140734459 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4)
Nov 28 08:48:31 np0005538513.localdomain podman[92454]: 2025-11-28 08:48:31.915411272 +0000 UTC m=+0.154077406 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, 
build-date=2025-11-18T22:51:28Z, version=17.1.12, release=1761123044, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 28 08:48:31 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:48:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:48:32 np0005538513.localdomain systemd[1]: tmp-crun.lY7M7V.mount: Deactivated successfully.
Nov 28 08:48:32 np0005538513.localdomain podman[92490]: 2025-11-28 08:48:32.850357806 +0000 UTC m=+0.081038073 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:48:33 np0005538513.localdomain podman[92490]: 2025-11-28 08:48:33.073332343 +0000 UTC m=+0.304012560 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr)
Nov 28 08:48:33 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:48:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:48:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:48:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:48:38 np0005538513.localdomain podman[92521]: 2025-11-28 08:48:38.845736325 +0000 UTC m=+0.081621971 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:48:38 np0005538513.localdomain podman[92521]: 2025-11-28 08:48:38.858295687 +0000 UTC m=+0.094181333 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, container_name=logrotate_crond, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true)
Nov 28 08:48:38 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:48:38 np0005538513.localdomain systemd[1]: tmp-crun.MzVwCQ.mount: Deactivated successfully.
Nov 28 08:48:38 np0005538513.localdomain podman[92520]: 2025-11-28 08:48:38.96525619 +0000 UTC m=+0.204791381 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, tcib_managed=true, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 28 08:48:39 np0005538513.localdomain podman[92520]: 2025-11-28 08:48:39.019395191 +0000 UTC m=+0.258930322 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, release=1761123044)
Nov 28 08:48:39 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:48:39 np0005538513.localdomain podman[92522]: 2025-11-28 08:48:39.106690929 +0000 UTC m=+0.338565190 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 
17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12)
Nov 28 08:48:39 np0005538513.localdomain systemd[1]: tmp-crun.iAk3uG.mount: Deactivated successfully.
Nov 28 08:48:39 np0005538513.localdomain podman[92522]: 2025-11-28 08:48:39.137346737 +0000 UTC m=+0.369221038 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com)
Nov 28 08:48:39 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.3 total, 600.0 interval
                                                          Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:48:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:48:39 np0005538513.localdomain podman[92594]: 2025-11-28 08:48:39.849162849 +0000 UTC m=+0.080277520 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:48:40 np0005538513.localdomain podman[92594]: 2025-11-28 08:48:40.22361041 +0000 UTC m=+0.454725091 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:48:40 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:48:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:48:41 np0005538513.localdomain systemd[1]: tmp-crun.a90Vaj.mount: Deactivated successfully.
Nov 28 08:48:41 np0005538513.localdomain podman[92617]: 2025-11-28 08:48:41.844763526 +0000 UTC m=+0.082898811 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:48:41 np0005538513.localdomain podman[92617]: 2025-11-28 08:48:41.875351752 +0000 UTC m=+0.113487027 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, container_name=nova_compute, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:48:41 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:48:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:48:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:48:46 np0005538513.localdomain systemd[1]: tmp-crun.KumBMN.mount: Deactivated successfully.
Nov 28 08:48:46 np0005538513.localdomain podman[92643]: 2025-11-28 08:48:46.861445861 +0000 UTC m=+0.084777379 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:48:46 np0005538513.localdomain podman[92643]: 2025-11-28 08:48:46.869045209 +0000 UTC m=+0.092376747 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, container_name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:48:46 np0005538513.localdomain podman[92643]: unhealthy
Nov 28 08:48:46 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:46 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:48:46 np0005538513.localdomain podman[92644]: 2025-11-28 08:48:46.953512248 +0000 UTC m=+0.173445180 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:48:46 np0005538513.localdomain podman[92644]: 2025-11-28 08:48:46.972319016 +0000 UTC m=+0.192251918 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:48:46 np0005538513.localdomain podman[92644]: unhealthy
Nov 28 08:48:46 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:46 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:49:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:49:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:49:02 np0005538513.localdomain podman[92685]: 2025-11-28 08:49:02.841483862 +0000 UTC m=+0.084885824 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, container_name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true)
Nov 28 08:49:02 np0005538513.localdomain podman[92685]: 2025-11-28 08:49:02.877261829 +0000 UTC m=+0.120663781 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible)
Nov 28 08:49:02 np0005538513.localdomain systemd[1]: tmp-crun.kdFX3A.mount: Deactivated successfully.
Nov 28 08:49:02 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:49:02 np0005538513.localdomain podman[92686]: 2025-11-28 08:49:02.90063919 +0000 UTC m=+0.140940735 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12)
Nov 28 08:49:02 np0005538513.localdomain podman[92686]: 2025-11-28 08:49:02.910408655 +0000 UTC m=+0.150710210 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:49:02 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:49:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:49:03 np0005538513.localdomain podman[92726]: 2025-11-28 08:49:03.845397301 +0000 UTC m=+0.079997481 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=)
Nov 28 08:49:04 np0005538513.localdomain podman[92726]: 2025-11-28 08:49:04.073502528 +0000 UTC m=+0.308102688 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:49:04 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:49:09 np0005538513.localdomain sudo[92756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:49:09 np0005538513.localdomain sudo[92756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:49:09 np0005538513.localdomain sudo[92756]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:49:09 np0005538513.localdomain recover_tripleo_nova_virtqemud[92786]: 61397
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:49:09 np0005538513.localdomain sudo[92785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:49:09 np0005538513.localdomain sudo[92785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: tmp-crun.s31sHo.mount: Deactivated successfully.
Nov 28 08:49:09 np0005538513.localdomain podman[92770]: 2025-11-28 08:49:09.760481531 +0000 UTC m=+0.103753833 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, version=17.1.12)
Nov 28 08:49:09 np0005538513.localdomain podman[92772]: 2025-11-28 08:49:09.821661002 +0000 UTC m=+0.162848040 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:49:09 np0005538513.localdomain podman[92772]: 2025-11-28 08:49:09.830706405 +0000 UTC m=+0.171893402 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:49:09 np0005538513.localdomain podman[92773]: 2025-11-28 08:49:09.860132795 +0000 UTC m=+0.196128089 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4)
Nov 28 08:49:09 np0005538513.localdomain podman[92770]: 2025-11-28 08:49:09.891285668 +0000 UTC m=+0.234557960 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 28 08:49:09 np0005538513.localdomain podman[92773]: 2025-11-28 08:49:09.889399229 +0000 UTC m=+0.225394563 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:49:09 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:49:10 np0005538513.localdomain sudo[92785]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:10 np0005538513.localdomain sudo[92890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:49:10 np0005538513.localdomain sudo[92890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:49:10 np0005538513.localdomain sudo[92890]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:10 np0005538513.localdomain sudo[92906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 08:49:10 np0005538513.localdomain sudo[92906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:10 np0005538513.localdomain podman[92905]: 2025-11-28 08:49:10.669499135 +0000 UTC m=+0.086823454 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:49:11 np0005538513.localdomain podman[92905]: 2025-11-28 08:49:11.042250123 +0000 UTC m=+0.459574392 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:49:11 np0005538513.localdomain podman[92985]: 
Nov 28 08:49:11 np0005538513.localdomain podman[92985]: 2025-11-28 08:49:11.219534112 +0000 UTC m=+0.083168009 container create f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=553, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git)
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: Started libpod-conmon-f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa.scope.
Nov 28 08:49:11 np0005538513.localdomain podman[92985]: 2025-11-28 08:49:11.185932282 +0000 UTC m=+0.049566179 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:49:11 np0005538513.localdomain podman[92985]: 2025-11-28 08:49:11.301749282 +0000 UTC m=+0.165383169 container init f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, RELEASE=main)
Nov 28 08:49:11 np0005538513.localdomain podman[92985]: 2025-11-28 08:49:11.314759298 +0000 UTC m=+0.178393185 container start f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main)
Nov 28 08:49:11 np0005538513.localdomain podman[92985]: 2025-11-28 08:49:11.315129059 +0000 UTC m=+0.178762996 container attach f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, ceph=True, RELEASE=main, version=7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True)
Nov 28 08:49:11 np0005538513.localdomain charming_varahamihira[93001]: 167 167
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: libpod-f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa.scope: Deactivated successfully.
Nov 28 08:49:11 np0005538513.localdomain podman[92985]: 2025-11-28 08:49:11.319252308 +0000 UTC m=+0.182886225 container died f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, ceph=True)
Nov 28 08:49:11 np0005538513.localdomain podman[93006]: 2025-11-28 08:49:11.427116008 +0000 UTC m=+0.092790450 container remove f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, release=553)
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: libpod-conmon-f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa.scope: Deactivated successfully.
Nov 28 08:49:11 np0005538513.localdomain podman[93027]: 
Nov 28 08:49:11 np0005538513.localdomain podman[93027]: 2025-11-28 08:49:11.664692642 +0000 UTC m=+0.080386363 container create 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: Started libpod-conmon-7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a.scope.
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 08:49:11 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 08:49:11 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:49:11 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 08:49:11 np0005538513.localdomain podman[93027]: 2025-11-28 08:49:11.728185806 +0000 UTC m=+0.143879527 container init 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 08:49:11 np0005538513.localdomain podman[93027]: 2025-11-28 08:49:11.632679742 +0000 UTC m=+0.048373503 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:49:11 np0005538513.localdomain podman[93027]: 2025-11-28 08:49:11.738959953 +0000 UTC m=+0.154653684 container start 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, ceph=True)
Nov 28 08:49:11 np0005538513.localdomain podman[93027]: 2025-11-28 08:49:11.739320804 +0000 UTC m=+0.155014575 container attach 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, name=rhceph, build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553)
Nov 28 08:49:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7e41374e26fe1058b6fc5ed7dafc746a4518e7bdef00d193a7c5d6e3a115c4ea-merged.mount: Deactivated successfully.
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]: [
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:     {
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "available": false,
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "ceph_device": false,
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "lsm_data": {},
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "lvs": [],
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "path": "/dev/sr0",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "rejected_reasons": [
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "Insufficient space (<5GB)",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "Has a FileSystem"
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         ],
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         "sys_api": {
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "actuators": null,
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "device_nodes": "sr0",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "human_readable_size": "482.00 KB",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "id_bus": "ata",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "model": "QEMU DVD-ROM",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "nr_requests": "2",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "partitions": {},
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "path": "/dev/sr0",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "removable": "1",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "rev": "2.5+",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "ro": "0",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "rotational": "1",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "sas_address": "",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "sas_device_handle": "",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "scheduler_mode": "mq-deadline",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "sectors": 0,
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "sectorsize": "2048",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "size": 493568.0,
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "support_discard": "0",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "type": "disk",
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:             "vendor": "QEMU"
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:         }
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]:     }
Nov 28 08:49:12 np0005538513.localdomain sharp_albattani[93043]: ]
Nov 28 08:49:12 np0005538513.localdomain systemd[1]: libpod-7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a.scope: Deactivated successfully.
Nov 28 08:49:12 np0005538513.localdomain podman[93027]: 2025-11-28 08:49:12.697388051 +0000 UTC m=+1.113081822 container died 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 08:49:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:49:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49-merged.mount: Deactivated successfully.
Nov 28 08:49:12 np0005538513.localdomain systemd[1]: tmp-crun.KNdFYF.mount: Deactivated successfully.
Nov 28 08:49:12 np0005538513.localdomain podman[95066]: 2025-11-28 08:49:12.831625986 +0000 UTC m=+0.105682164 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, container_name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:49:12 np0005538513.localdomain podman[95066]: 2025-11-28 08:49:12.861585401 +0000 UTC m=+0.135641809 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 28 08:49:12 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:49:12 np0005538513.localdomain podman[95060]: 2025-11-28 08:49:12.900718364 +0000 UTC m=+0.194517969 container remove 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, ceph=True, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, GIT_BRANCH=main, version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc.)
Nov 28 08:49:12 np0005538513.localdomain systemd[1]: libpod-conmon-7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a.scope: Deactivated successfully.
Nov 28 08:49:12 np0005538513.localdomain sudo[92906]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:13 np0005538513.localdomain sudo[95102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:49:13 np0005538513.localdomain sudo[95102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:13 np0005538513.localdomain sudo[95102]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: tmp-crun.ah4RYx.mount: Deactivated successfully.
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: tmp-crun.Z6KIKD.mount: Deactivated successfully.
Nov 28 08:49:17 np0005538513.localdomain podman[95118]: 2025-11-28 08:49:17.905979405 +0000 UTC m=+0.139730098 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, 
release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:49:17 np0005538513.localdomain podman[95117]: 2025-11-28 08:49:17.868093911 +0000 UTC m=+0.101838543 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 28 08:49:17 np0005538513.localdomain podman[95118]: 2025-11-28 08:49:17.92760039 +0000 UTC m=+0.161351133 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Nov 28 08:49:17 np0005538513.localdomain podman[95118]: unhealthy
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:49:17 np0005538513.localdomain podman[95117]: 2025-11-28 08:49:17.952919072 +0000 UTC m=+0.186663694 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:49:17 np0005538513.localdomain podman[95117]: unhealthy
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:17 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:49:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:49:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:49:33 np0005538513.localdomain systemd[1]: tmp-crun.0gzMWv.mount: Deactivated successfully.
Nov 28 08:49:33 np0005538513.localdomain podman[95156]: 2025-11-28 08:49:33.864586208 +0000 UTC m=+0.092826722 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, container_name=iscsid, version=17.1.12, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:49:33 np0005538513.localdomain podman[95157]: 2025-11-28 08:49:33.916997465 +0000 UTC m=+0.140673447 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:49:33 np0005538513.localdomain podman[95157]: 2025-11-28 08:49:33.928590528 +0000 UTC m=+0.152266550 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:49:33 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:49:33 np0005538513.localdomain podman[95156]: 2025-11-28 08:49:33.984137953 +0000 UTC m=+0.212378467 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:49:33 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:49:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:49:34 np0005538513.localdomain podman[95195]: 2025-11-28 08:49:34.847589554 +0000 UTC m=+0.085297867 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4)
Nov 28 08:49:35 np0005538513.localdomain podman[95195]: 2025-11-28 08:49:35.04560145 +0000 UTC m=+0.283309783 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:49:35 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:49:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:49:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:49:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:49:40 np0005538513.localdomain systemd[1]: tmp-crun.mCqIwu.mount: Deactivated successfully.
Nov 28 08:49:40 np0005538513.localdomain podman[95226]: 2025-11-28 08:49:40.87652989 +0000 UTC m=+0.109847924 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:49:40 np0005538513.localdomain podman[95226]: 2025-11-28 08:49:40.889392452 +0000 UTC m=+0.122710446 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:49:40 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:49:40 np0005538513.localdomain podman[95225]: 2025-11-28 08:49:40.966182572 +0000 UTC m=+0.202921883 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:49:40 np0005538513.localdomain podman[95227]: 2025-11-28 08:49:40.839621907 +0000 UTC m=+0.074671004 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, url=https://www.redhat.com)
Nov 28 08:49:41 np0005538513.localdomain podman[95227]: 2025-11-28 08:49:41.025499815 +0000 UTC m=+0.260548932 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi)
Nov 28 08:49:41 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:49:41 np0005538513.localdomain podman[95225]: 2025-11-28 08:49:41.075985722 +0000 UTC m=+0.312724983 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:49:41 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:49:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:49:41 np0005538513.localdomain podman[95296]: 2025-11-28 08:49:41.19400158 +0000 UTC m=+0.079196765 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:49:41 np0005538513.localdomain podman[95296]: 2025-11-28 08:49:41.557463587 +0000 UTC m=+0.442658762 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:49:41 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:49:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:49:43 np0005538513.localdomain podman[95317]: 2025-11-28 08:49:43.844134859 +0000 UTC m=+0.085997808 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:49:43 np0005538513.localdomain podman[95317]: 2025-11-28 08:49:43.903588997 +0000 UTC m=+0.145451896 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_compute, config_id=tripleo_step5, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:49:43 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:49:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:49:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:49:48 np0005538513.localdomain podman[95342]: 2025-11-28 08:49:48.847171009 +0000 UTC m=+0.081955501 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:49:48 np0005538513.localdomain podman[95341]: 2025-11-28 08:49:48.894203439 +0000 UTC m=+0.133275636 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:49:48 np0005538513.localdomain podman[95342]: 2025-11-28 08:49:48.912408978 +0000 UTC m=+0.147193460 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:49:48 np0005538513.localdomain podman[95342]: unhealthy
Nov 28 08:49:48 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:48 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:49:48 np0005538513.localdomain podman[95341]: 2025-11-28 08:49:48.937407749 +0000 UTC m=+0.176480156 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:49:48 np0005538513.localdomain podman[95341]: unhealthy
Nov 28 08:49:48 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:48 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:50:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:50:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:50:04 np0005538513.localdomain systemd[1]: tmp-crun.KSwXz8.mount: Deactivated successfully.
Nov 28 08:50:04 np0005538513.localdomain podman[95382]: 2025-11-28 08:50:04.85184813 +0000 UTC m=+0.090339183 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:50:04 np0005538513.localdomain systemd[1]: tmp-crun.WEEUr0.mount: Deactivated successfully.
Nov 28 08:50:04 np0005538513.localdomain podman[95381]: 2025-11-28 08:50:04.894138602 +0000 UTC m=+0.134343559 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid)
Nov 28 08:50:04 np0005538513.localdomain podman[95382]: 2025-11-28 08:50:04.914811138 +0000 UTC m=+0.153302231 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Nov 28 08:50:04 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:50:04 np0005538513.localdomain podman[95381]: 2025-11-28 08:50:04.931424166 +0000 UTC m=+0.171629113 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:50:04 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:50:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:50:06 np0005538513.localdomain podman[95420]: 2025-11-28 08:50:06.842299197 +0000 UTC m=+0.077779172 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:50:07 np0005538513.localdomain podman[95420]: 2025-11-28 08:50:07.031425446 +0000 UTC m=+0.266905381 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:50:07 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:50:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:50:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:50:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:50:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:50:11 np0005538513.localdomain podman[95450]: 2025-11-28 08:50:11.856289908 +0000 UTC m=+0.085693648 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12)
Nov 28 08:50:11 np0005538513.localdomain systemd[1]: tmp-crun.6HC5JW.mount: Deactivated successfully.
Nov 28 08:50:11 np0005538513.localdomain podman[95451]: 2025-11-28 08:50:11.913012872 +0000 UTC m=+0.139731228 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, 
version=17.1.12, distribution-scope=public, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4)
Nov 28 08:50:11 np0005538513.localdomain podman[95449]: 2025-11-28 08:50:11.952276698 +0000 UTC m=+0.184426384 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:50:11 np0005538513.localdomain podman[95451]: 2025-11-28 08:50:11.975627738 +0000 UTC m=+0.202346054 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-cron, version=17.1.12, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:50:11 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:50:11 np0005538513.localdomain podman[95449]: 2025-11-28 08:50:11.987692564 +0000 UTC m=+0.219842310 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4)
Nov 28 08:50:12 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:50:12 np0005538513.localdomain podman[95456]: 2025-11-28 08:50:12.06276084 +0000 UTC m=+0.285361048 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, release=1761123044, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:50:12 np0005538513.localdomain podman[95456]: 2025-11-28 08:50:12.11651236 +0000 UTC m=+0.339112568 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:50:12 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:50:12 np0005538513.localdomain podman[95450]: 2025-11-28 08:50:12.327544294 +0000 UTC m=+0.556948034 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=nova_migration_target, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true)
Nov 28 08:50:12 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:50:13 np0005538513.localdomain sudo[95543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:50:13 np0005538513.localdomain sudo[95543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:13 np0005538513.localdomain sudo[95543]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:13 np0005538513.localdomain sudo[95558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:50:13 np0005538513.localdomain sudo[95558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:50:14 np0005538513.localdomain sudo[95558]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:14 np0005538513.localdomain podman[95587]: 2025-11-28 08:50:14.232489129 +0000 UTC m=+0.076724979 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git)
Nov 28 08:50:14 np0005538513.localdomain podman[95587]: 2025-11-28 08:50:14.287628711 +0000 UTC m=+0.131864511 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, container_name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:50:14 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:50:14 np0005538513.localdomain sudo[95618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:50:14 np0005538513.localdomain sudo[95618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:14 np0005538513.localdomain sudo[95618]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:14 np0005538513.localdomain sudo[95634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:50:14 np0005538513.localdomain sudo[95634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:15 np0005538513.localdomain sudo[95634]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:15 np0005538513.localdomain sudo[95681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:50:15 np0005538513.localdomain sudo[95681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:15 np0005538513.localdomain sudo[95681]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:50:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:50:19 np0005538513.localdomain podman[95697]: 2025-11-28 08:50:19.845932101 +0000 UTC m=+0.079108932 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Nov 28 08:50:19 np0005538513.localdomain podman[95696]: 2025-11-28 08:50:19.893871 +0000 UTC m=+0.130636243 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:50:19 np0005538513.localdomain podman[95696]: 2025-11-28 08:50:19.911350136 +0000 UTC m=+0.148115409 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, version=17.1.12, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z)
Nov 28 08:50:19 np0005538513.localdomain podman[95696]: unhealthy
Nov 28 08:50:19 np0005538513.localdomain podman[95697]: 2025-11-28 08:50:19.919529202 +0000 UTC m=+0.152705983 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible)
Nov 28 08:50:19 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:19 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:50:19 np0005538513.localdomain podman[95697]: unhealthy
Nov 28 08:50:19 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:19 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:50:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:50:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:50:35 np0005538513.localdomain podman[95732]: 2025-11-28 08:50:35.847960711 +0000 UTC m=+0.084338207 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:50:35 np0005538513.localdomain podman[95732]: 2025-11-28 08:50:35.858877321 +0000 UTC m=+0.095254827 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:50:35 np0005538513.localdomain podman[95733]: 2025-11-28 08:50:35.908208023 +0000 UTC m=+0.137766666 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:50:35 np0005538513.localdomain podman[95733]: 2025-11-28 08:50:35.920343072 +0000 UTC m=+0.149901725 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:50:35 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:50:35 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:50:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:50:37 np0005538513.localdomain podman[95772]: 2025-11-28 08:50:37.852554198 +0000 UTC m=+0.080673972 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:50:38 np0005538513.localdomain podman[95772]: 2025-11-28 08:50:38.040402437 +0000 UTC m=+0.268522221 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:50:38 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:50:42 np0005538513.localdomain recover_tripleo_nova_virtqemud[95822]: 61397
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:50:42 np0005538513.localdomain podman[95802]: 2025-11-28 08:50:42.853792532 +0000 UTC m=+0.092859742 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64)
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: tmp-crun.dKmcoC.mount: Deactivated successfully.
Nov 28 08:50:42 np0005538513.localdomain podman[95804]: 2025-11-28 08:50:42.900984067 +0000 UTC m=+0.134399601 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Nov 28 08:50:42 np0005538513.localdomain podman[95804]: 2025-11-28 08:50:42.909229214 +0000 UTC m=+0.142644758 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: tmp-crun.fOUNwD.mount: Deactivated successfully.
Nov 28 08:50:42 np0005538513.localdomain podman[95802]: 2025-11-28 08:50:42.979436228 +0000 UTC m=+0.218503438 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:50:42 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:50:42 np0005538513.localdomain podman[95803]: 2025-11-28 08:50:42.998715591 +0000 UTC m=+0.234819879 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:50:43 np0005538513.localdomain podman[95805]: 2025-11-28 08:50:42.972504682 +0000 UTC m=+0.201615641 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:50:43 np0005538513.localdomain podman[95805]: 2025-11-28 08:50:43.055318089 +0000 UTC m=+0.284429058 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:50:43 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:50:43 np0005538513.localdomain podman[95803]: 2025-11-28 08:50:43.358683609 +0000 UTC m=+0.594787957 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:50:43 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:50:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:50:44 np0005538513.localdomain podman[95895]: 2025-11-28 08:50:44.846157969 +0000 UTC m=+0.081267711 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, managed_by=tripleo_ansible, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:50:44 np0005538513.localdomain podman[95895]: 2025-11-28 08:50:44.875363551 +0000 UTC m=+0.110473293 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:50:44 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:50:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:50:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:50:50 np0005538513.localdomain podman[95923]: 2025-11-28 08:50:50.856540675 +0000 UTC m=+0.087089602 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Nov 28 08:50:50 np0005538513.localdomain podman[95923]: 2025-11-28 08:50:50.897527556 +0000 UTC m=+0.128076483 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible)
Nov 28 08:50:50 np0005538513.localdomain podman[95923]: unhealthy
Nov 28 08:50:50 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:50 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:50:50 np0005538513.localdomain podman[95922]: 2025-11-28 08:50:50.907560749 +0000 UTC m=+0.141501952 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:50:50 np0005538513.localdomain podman[95922]: 2025-11-28 08:50:50.989595343 +0000 UTC m=+0.223536506 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Nov 28 08:50:50 np0005538513.localdomain podman[95922]: unhealthy
Nov 28 08:50:51 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:51 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:51:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:51:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:51:06 np0005538513.localdomain podman[95961]: 2025-11-28 08:51:06.846304708 +0000 UTC m=+0.083578623 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container)
Nov 28 08:51:06 np0005538513.localdomain podman[95961]: 2025-11-28 08:51:06.859670326 +0000 UTC m=+0.096944261 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, io.openshift.expose-services=)
Nov 28 08:51:06 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:51:06 np0005538513.localdomain podman[95962]: 2025-11-28 08:51:06.95390143 +0000 UTC m=+0.188225453 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:51:06 np0005538513.localdomain podman[95962]: 2025-11-28 08:51:06.99006368 +0000 UTC m=+0.224387733 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:51:07 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:51:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:51:08 np0005538513.localdomain systemd[1]: tmp-crun.91Q8MG.mount: Deactivated successfully.
Nov 28 08:51:08 np0005538513.localdomain podman[95998]: 2025-11-28 08:51:08.86670773 +0000 UTC m=+0.103375131 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 28 08:51:09 np0005538513.localdomain podman[95998]: 2025-11-28 08:51:09.0641673 +0000 UTC m=+0.300834741 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd)
Nov 28 08:51:09 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:51:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:51:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:51:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:51:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:51:13 np0005538513.localdomain podman[96028]: 2025-11-28 08:51:13.854756172 +0000 UTC m=+0.083751838 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:51:13 np0005538513.localdomain podman[96025]: 2025-11-28 08:51:13.909496152 +0000 UTC m=+0.142001747 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:51:13 np0005538513.localdomain podman[96026]: 2025-11-28 08:51:13.962243981 +0000 UTC m=+0.192882098 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:51:14 np0005538513.localdomain podman[96028]: 2025-11-28 08:51:14.012974856 +0000 UTC m=+0.241970472 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:51:14 np0005538513.localdomain podman[96027]: 2025-11-28 08:51:14.021861723 +0000 UTC m=+0.250449706 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:51:14 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:51:14 np0005538513.localdomain podman[96027]: 2025-11-28 08:51:14.035312564 +0000 UTC m=+0.263900567 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=logrotate_crond, release=1761123044, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 08:51:14 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:51:14 np0005538513.localdomain podman[96025]: 2025-11-28 08:51:14.088659701 +0000 UTC m=+0.321165326 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_id=tripleo_step4)
Nov 28 08:51:14 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:51:14 np0005538513.localdomain podman[96026]: 2025-11-28 08:51:14.327659318 +0000 UTC m=+0.558297455 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:51:14 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:51:14 np0005538513.localdomain systemd[1]: tmp-crun.zkZEwR.mount: Deactivated successfully.
Nov 28 08:51:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:51:15 np0005538513.localdomain sudo[96131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:51:15 np0005538513.localdomain podman[96124]: 2025-11-28 08:51:15.84909784 +0000 UTC m=+0.087109814 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:51:15 np0005538513.localdomain sudo[96131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:51:15 np0005538513.localdomain sudo[96131]: pam_unix(sudo:session): session closed for user root
Nov 28 08:51:15 np0005538513.localdomain podman[96124]: 2025-11-28 08:51:15.882515494 +0000 UTC m=+0.120527488 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_compute, release=1761123044, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:51:15 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:51:15 np0005538513.localdomain sudo[96165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:51:15 np0005538513.localdomain sudo[96165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:51:16 np0005538513.localdomain sudo[96165]: pam_unix(sudo:session): session closed for user root
Nov 28 08:51:17 np0005538513.localdomain sudo[96213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:51:17 np0005538513.localdomain sudo[96213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:51:17 np0005538513.localdomain sudo[96213]: pam_unix(sudo:session): session closed for user root
Nov 28 08:51:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:51:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:51:21 np0005538513.localdomain podman[96229]: 2025-11-28 08:51:21.855011478 +0000 UTC m=+0.088477146 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=)
Nov 28 08:51:21 np0005538513.localdomain podman[96229]: 2025-11-28 08:51:21.896533875 +0000 UTC m=+0.129999533 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:51:21 np0005538513.localdomain podman[96229]: unhealthy
Nov 28 08:51:21 np0005538513.localdomain podman[96228]: 2025-11-28 08:51:21.908807779 +0000 UTC m=+0.144645401 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Nov 28 08:51:21 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:21 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:51:21 np0005538513.localdomain podman[96228]: 2025-11-28 08:51:21.950521223 +0000 UTC m=+0.186358845 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, config_id=tripleo_step4)
Nov 28 08:51:21 np0005538513.localdomain podman[96228]: unhealthy
Nov 28 08:51:21 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:21 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:51:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:51:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:51:37 np0005538513.localdomain podman[96271]: 2025-11-28 08:51:37.837872607 +0000 UTC m=+0.070388190 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-collectd, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3)
Nov 28 08:51:37 np0005538513.localdomain podman[96271]: 2025-11-28 08:51:37.874537782 +0000 UTC m=+0.107053365 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc.)
Nov 28 08:51:37 np0005538513.localdomain podman[96270]: 2025-11-28 08:51:37.903094615 +0000 UTC m=+0.136043662 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, io.openshift.expose-services=, container_name=iscsid, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 08:51:37 np0005538513.localdomain podman[96270]: 2025-11-28 08:51:37.91733869 +0000 UTC m=+0.150287727 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z)
Nov 28 08:51:37 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:51:37 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:51:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:51:39 np0005538513.localdomain podman[96309]: 2025-11-28 08:51:39.833503074 +0000 UTC m=+0.073218449 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:51:40 np0005538513.localdomain podman[96309]: 2025-11-28 08:51:40.058551386 +0000 UTC m=+0.298266701 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12)
Nov 28 08:51:40 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:51:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:51:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:51:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:51:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:51:44 np0005538513.localdomain podman[96338]: 2025-11-28 08:51:44.856114787 +0000 UTC m=+0.093769151 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 28 08:51:44 np0005538513.localdomain podman[96339]: 2025-11-28 08:51:44.893273018 +0000 UTC m=+0.128720053 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:51:44 np0005538513.localdomain podman[96338]: 2025-11-28 08:51:44.912478058 +0000 UTC m=+0.150132342 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, version=17.1.12)
Nov 28 08:51:44 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:51:44 np0005538513.localdomain podman[96340]: 2025-11-28 08:51:44.997719702 +0000 UTC m=+0.231289348 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Nov 28 08:51:45 np0005538513.localdomain podman[96340]: 2025-11-28 08:51:45.009319454 +0000 UTC m=+0.242889130 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:51:45 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:51:45 np0005538513.localdomain podman[96341]: 2025-11-28 08:51:45.048055225 +0000 UTC m=+0.278699600 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, vcs-type=git, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4)
Nov 28 08:51:45 np0005538513.localdomain podman[96341]: 2025-11-28 08:51:45.101799054 +0000 UTC m=+0.332443449 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:51:45 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:51:45 np0005538513.localdomain podman[96339]: 2025-11-28 08:51:45.296509108 +0000 UTC m=+0.531956163 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red 
Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:51:45 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:51:45 np0005538513.localdomain systemd[1]: tmp-crun.AhAcAJ.mount: Deactivated successfully.
Nov 28 08:51:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:51:46 np0005538513.localdomain podman[96434]: 2025-11-28 08:51:46.841216676 +0000 UTC m=+0.079331561 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, tcib_managed=true)
Nov 28 08:51:46 np0005538513.localdomain podman[96434]: 2025-11-28 08:51:46.875474496 +0000 UTC m=+0.113589381 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Nov 28 08:51:46 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:51:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:51:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:51:52 np0005538513.localdomain systemd[1]: tmp-crun.NG1Ihm.mount: Deactivated successfully.
Nov 28 08:51:52 np0005538513.localdomain podman[96461]: 2025-11-28 08:51:52.83899344 +0000 UTC m=+0.076738179 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:51:52 np0005538513.localdomain podman[96461]: 2025-11-28 08:51:52.849362683 +0000 UTC m=+0.087107412 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:14:25Z, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64)
Nov 28 08:51:52 np0005538513.localdomain podman[96461]: unhealthy
Nov 28 08:51:52 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:52 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:51:52 np0005538513.localdomain podman[96462]: 2025-11-28 08:51:52.947405277 +0000 UTC m=+0.179284973 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vcs-type=git, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 
ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:51:52 np0005538513.localdomain podman[96462]: 2025-11-28 08:51:52.991424053 +0000 UTC m=+0.223303709 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:51:52 np0005538513.localdomain podman[96462]: unhealthy
Nov 28 08:51:53 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:53 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:52:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:52:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:52:08 np0005538513.localdomain systemd[1]: tmp-crun.CbeLqK.mount: Deactivated successfully.
Nov 28 08:52:08 np0005538513.localdomain systemd[1]: tmp-crun.gK4XKv.mount: Deactivated successfully.
Nov 28 08:52:08 np0005538513.localdomain podman[96502]: 2025-11-28 08:52:08.910276052 +0000 UTC m=+0.145705944 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible)
Nov 28 08:52:08 np0005538513.localdomain podman[96503]: 2025-11-28 08:52:08.86767753 +0000 UTC m=+0.102113671 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Nov 28 08:52:08 np0005538513.localdomain podman[96502]: 2025-11-28 08:52:08.946526344 +0000 UTC m=+0.181956306 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:52:08 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:52:09 np0005538513.localdomain podman[96503]: 2025-11-28 08:52:09.002110731 +0000 UTC m=+0.236546882 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:52:09 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:52:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:52:10 np0005538513.localdomain systemd[1]: tmp-crun.8S88RS.mount: Deactivated successfully.
Nov 28 08:52:10 np0005538513.localdomain podman[96543]: 2025-11-28 08:52:10.852443009 +0000 UTC m=+0.090526049 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:52:11 np0005538513.localdomain podman[96543]: 2025-11-28 08:52:11.070435871 +0000 UTC m=+0.308518901 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, vcs-type=git)
Nov 28 08:52:11 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: tmp-crun.gCdxnr.mount: Deactivated successfully.
Nov 28 08:52:15 np0005538513.localdomain podman[96574]: 2025-11-28 08:52:15.860665493 +0000 UTC m=+0.092440230 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, release=1761123044, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: tmp-crun.V1heXz.mount: Deactivated successfully.
Nov 28 08:52:15 np0005538513.localdomain podman[96573]: 2025-11-28 08:52:15.914738542 +0000 UTC m=+0.145242970 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible)
Nov 28 08:52:15 np0005538513.localdomain podman[96575]: 2025-11-28 08:52:15.877122736 +0000 UTC m=+0.101137651 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:52:15 np0005538513.localdomain podman[96575]: 2025-11-28 08:52:15.960285515 +0000 UTC m=+0.184300420 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, version=17.1.12, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 28 08:52:15 np0005538513.localdomain podman[96573]: 2025-11-28 08:52:15.974424567 +0000 UTC m=+0.204928965 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:52:15 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:52:16 np0005538513.localdomain podman[96583]: 2025-11-28 08:52:15.975377537 +0000 UTC m=+0.196373418 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Nov 28 08:52:16 np0005538513.localdomain podman[96583]: 2025-11-28 08:52:16.054332654 +0000 UTC m=+0.275328495 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:52:16 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:52:16 np0005538513.localdomain podman[96574]: 2025-11-28 08:52:16.232515041 +0000 UTC m=+0.464289668 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, 
com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:52:16 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:52:17 np0005538513.localdomain sudo[96667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:52:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:52:17 np0005538513.localdomain sudo[96667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:17 np0005538513.localdomain sudo[96667]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:17 np0005538513.localdomain systemd[1]: tmp-crun.1qztnj.mount: Deactivated successfully.
Nov 28 08:52:17 np0005538513.localdomain podman[96681]: 2025-11-28 08:52:17.576808807 +0000 UTC m=+0.082685774 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1)
Nov 28 08:52:17 np0005538513.localdomain sudo[96691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:52:17 np0005538513.localdomain sudo[96691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:17 np0005538513.localdomain podman[96681]: 2025-11-28 08:52:17.609494229 +0000 UTC m=+0.115371216 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute)
Nov 28 08:52:17 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:52:18 np0005538513.localdomain systemd[1]: tmp-crun.vCGtOl.mount: Deactivated successfully.
Nov 28 08:52:18 np0005538513.localdomain podman[96793]: 2025-11-28 08:52:18.447268837 +0000 UTC m=+0.102979509 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, RELEASE=main, release=553, ceph=True)
Nov 28 08:52:18 np0005538513.localdomain podman[96793]: 2025-11-28 08:52:18.545554308 +0000 UTC m=+0.201264990 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, name=rhceph, release=553, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 08:52:18 np0005538513.localdomain sudo[96691]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:18 np0005538513.localdomain sudo[96861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:52:18 np0005538513.localdomain sudo[96861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:18 np0005538513.localdomain sudo[96861]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:19 np0005538513.localdomain sudo[96876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:52:19 np0005538513.localdomain sudo[96876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:19 np0005538513.localdomain sudo[96876]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:20 np0005538513.localdomain sudo[96923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:52:20 np0005538513.localdomain sudo[96923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:20 np0005538513.localdomain sudo[96923]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:52:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:52:23 np0005538513.localdomain systemd[1]: tmp-crun.issTeR.mount: Deactivated successfully.
Nov 28 08:52:23 np0005538513.localdomain podman[96938]: 2025-11-28 08:52:23.894104555 +0000 UTC m=+0.131948644 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:52:23 np0005538513.localdomain podman[96939]: 2025-11-28 08:52:23.862404435 +0000 UTC m=+0.100265224 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com)
Nov 28 08:52:23 np0005538513.localdomain podman[96938]: 2025-11-28 08:52:23.937464231 +0000 UTC m=+0.175308310 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:52:23 np0005538513.localdomain podman[96938]: unhealthy
Nov 28 08:52:23 np0005538513.localdomain podman[96939]: 2025-11-28 08:52:23.94639371 +0000 UTC m=+0.184254459 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
container_name=ovn_controller, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team)
Nov 28 08:52:23 np0005538513.localdomain podman[96939]: unhealthy
Nov 28 08:52:23 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:23 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:52:23 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:23 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:52:39 np0005538513.localdomain recover_tripleo_nova_virtqemud[96991]: 61397
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:52:39 np0005538513.localdomain podman[96979]: 2025-11-28 08:52:39.844576443 +0000 UTC m=+0.083399588 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, architecture=x86_64, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: tmp-crun.6vg2WE.mount: Deactivated successfully.
Nov 28 08:52:39 np0005538513.localdomain podman[96978]: 2025-11-28 08:52:39.900138199 +0000 UTC m=+0.140239133 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, vcs-type=git, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Nov 28 08:52:39 np0005538513.localdomain podman[96979]: 2025-11-28 08:52:39.908626834 +0000 UTC m=+0.147450009 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z)
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:52:39 np0005538513.localdomain podman[96978]: 2025-11-28 08:52:39.932122258 +0000 UTC m=+0.172223162 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:52:39 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:52:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:52:41 np0005538513.localdomain systemd[1]: tmp-crun.6IM8ng.mount: Deactivated successfully.
Nov 28 08:52:41 np0005538513.localdomain podman[97019]: 2025-11-28 08:52:41.852500935 +0000 UTC m=+0.089783826 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:52:42 np0005538513.localdomain podman[97019]: 2025-11-28 08:52:42.043463982 +0000 UTC m=+0.280746923 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc.)
Nov 28 08:52:42 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:52:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:52:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:52:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:52:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:52:46 np0005538513.localdomain podman[97051]: 2025-11-28 08:52:46.861206762 +0000 UTC m=+0.091756158 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:52:46 np0005538513.localdomain podman[97049]: 2025-11-28 08:52:46.907611662 +0000 UTC m=+0.143783964 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, release=1761123044)
Nov 28 08:52:46 np0005538513.localdomain systemd[1]: tmp-crun.KfXC3N.mount: Deactivated successfully.
Nov 28 08:52:46 np0005538513.localdomain podman[97049]: 2025-11-28 08:52:46.961506656 +0000 UTC m=+0.197678928 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com)
Nov 28 08:52:46 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:52:47 np0005538513.localdomain podman[97050]: 2025-11-28 08:52:46.96289888 +0000 UTC m=+0.196015416 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:52:47 np0005538513.localdomain podman[97052]: 2025-11-28 08:52:47.024073401 +0000 UTC m=+0.251380216 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:52:47 np0005538513.localdomain podman[97051]: 2025-11-28 08:52:47.03109383 +0000 UTC m=+0.261643236 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:52:47 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:52:47 np0005538513.localdomain podman[97052]: 2025-11-28 08:52:47.057488116 +0000 UTC m=+0.284794911 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12)
Nov 28 08:52:47 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:52:47 np0005538513.localdomain podman[97050]: 2025-11-28 08:52:47.337719732 +0000 UTC m=+0.570836258 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com)
Nov 28 08:52:47 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:52:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:52:47 np0005538513.localdomain podman[97143]: 2025-11-28 08:52:47.836760845 +0000 UTC m=+0.075647525 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:52:47 np0005538513.localdomain podman[97143]: 2025-11-28 08:52:47.86765105 +0000 UTC m=+0.106537760 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true)
Nov 28 08:52:47 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:52:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:52:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:52:54 np0005538513.localdomain podman[97170]: 2025-11-28 08:52:54.849304868 +0000 UTC m=+0.084392898 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64)
Nov 28 08:52:54 np0005538513.localdomain podman[97170]: 2025-11-28 08:52:54.89769634 +0000 UTC m=+0.132784440 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:14:25Z)
Nov 28 08:52:54 np0005538513.localdomain podman[97170]: unhealthy
Nov 28 08:52:54 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:54 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:52:54 np0005538513.localdomain podman[97171]: 2025-11-28 08:52:54.898876537 +0000 UTC m=+0.131466319 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:52:54 np0005538513.localdomain podman[97171]: 2025-11-28 08:52:54.983399018 +0000 UTC m=+0.215988730 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1)
Nov 28 08:52:54 np0005538513.localdomain podman[97171]: unhealthy
Nov 28 08:52:54 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:54 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:53:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:53:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:53:10 np0005538513.localdomain podman[97210]: 2025-11-28 08:53:10.845221145 +0000 UTC m=+0.081678153 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:53:10 np0005538513.localdomain podman[97210]: 2025-11-28 08:53:10.858309985 +0000 UTC m=+0.094767023 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:53:10 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:53:10 np0005538513.localdomain podman[97211]: 2025-11-28 08:53:10.945248041 +0000 UTC m=+0.179609903 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Nov 28 08:53:10 np0005538513.localdomain podman[97211]: 2025-11-28 08:53:10.957421511 +0000 UTC m=+0.191783373 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vcs-type=git, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-collectd)
Nov 28 08:53:10 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:53:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:53:12 np0005538513.localdomain systemd[1]: tmp-crun.HAU6EO.mount: Deactivated successfully.
Nov 28 08:53:12 np0005538513.localdomain podman[97251]: 2025-11-28 08:53:12.861644473 +0000 UTC m=+0.095126244 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com)
Nov 28 08:53:13 np0005538513.localdomain podman[97251]: 2025-11-28 08:53:13.05648147 +0000 UTC m=+0.289963261 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.4)
Nov 28 08:53:13 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:53:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:53:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:53:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:53:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:53:17 np0005538513.localdomain podman[97280]: 2025-11-28 08:53:17.843057909 +0000 UTC m=+0.080854818 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute)
Nov 28 08:53:17 np0005538513.localdomain podman[97283]: 2025-11-28 08:53:17.893223176 +0000 UTC m=+0.130277921 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12)
Nov 28 08:53:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:53:17 np0005538513.localdomain podman[97283]: 2025-11-28 08:53:17.915537414 +0000 UTC m=+0.152592159 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:53:17 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:53:18 np0005538513.localdomain podman[97352]: 2025-11-28 08:53:18.000971793 +0000 UTC m=+0.081393464 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4)
Nov 28 08:53:18 np0005538513.localdomain podman[97280]: 2025-11-28 08:53:18.021262688 +0000 UTC m=+0.259059587 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:53:18 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:53:18 np0005538513.localdomain podman[97352]: 2025-11-28 08:53:18.054974541 +0000 UTC m=+0.135396182 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:53:18 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:53:18 np0005538513.localdomain podman[97281]: 2025-11-28 08:53:18.103299531 +0000 UTC m=+0.339569202 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com)
Nov 28 08:53:18 np0005538513.localdomain podman[97282]: 2025-11-28 08:53:18.155046137 +0000 UTC m=+0.390686358 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:53:18 np0005538513.localdomain podman[97282]: 2025-11-28 08:53:18.164660908 +0000 UTC m=+0.400301129 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, architecture=x86_64, 
name=rhosp17/openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:53:18 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:53:18 np0005538513.localdomain podman[97281]: 2025-11-28 08:53:18.470965079 +0000 UTC m=+0.707234710 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:53:18 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:53:20 np0005538513.localdomain sudo[97396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:53:20 np0005538513.localdomain sudo[97396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:53:20 np0005538513.localdomain sudo[97396]: pam_unix(sudo:session): session closed for user root
Nov 28 08:53:20 np0005538513.localdomain sudo[97411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:53:20 np0005538513.localdomain sudo[97411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:53:21 np0005538513.localdomain sudo[97411]: pam_unix(sudo:session): session closed for user root
Nov 28 08:53:21 np0005538513.localdomain sudo[97457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:53:21 np0005538513.localdomain sudo[97457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:53:21 np0005538513.localdomain sudo[97457]: pam_unix(sudo:session): session closed for user root
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: tmp-crun.gdA5E1.mount: Deactivated successfully.
Nov 28 08:53:25 np0005538513.localdomain podman[97473]: 2025-11-28 08:53:25.850594213 +0000 UTC m=+0.086980790 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044)
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: tmp-crun.pcQOD5.mount: Deactivated successfully.
Nov 28 08:53:25 np0005538513.localdomain podman[97472]: 2025-11-28 08:53:25.883198161 +0000 UTC m=+0.118926996 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:53:25 np0005538513.localdomain podman[97472]: 2025-11-28 08:53:25.900463881 +0000 UTC m=+0.136192726 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Nov 28 08:53:25 np0005538513.localdomain podman[97472]: unhealthy
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:53:25 np0005538513.localdomain podman[97473]: 2025-11-28 08:53:25.916942836 +0000 UTC m=+0.153329393 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller)
Nov 28 08:53:25 np0005538513.localdomain podman[97473]: unhealthy
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:25 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:53:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:53:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:53:41 np0005538513.localdomain systemd[1]: tmp-crun.Ha93bu.mount: Deactivated successfully.
Nov 28 08:53:41 np0005538513.localdomain podman[97511]: 2025-11-28 08:53:41.858087763 +0000 UTC m=+0.091330965 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-iscsid-container)
Nov 28 08:53:41 np0005538513.localdomain podman[97512]: 2025-11-28 08:53:41.890217896 +0000 UTC m=+0.122616712 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, 
build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 08:53:41 np0005538513.localdomain podman[97511]: 2025-11-28 08:53:41.89387024 +0000 UTC m=+0.127113482 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, 
config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1)
Nov 28 08:53:41 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:53:41 np0005538513.localdomain podman[97512]: 2025-11-28 08:53:41.925094727 +0000 UTC m=+0.157493603 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=)
Nov 28 08:53:41 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:53:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:53:43 np0005538513.localdomain systemd[1]: Starting dnf makecache...
Nov 28 08:53:43 np0005538513.localdomain podman[97548]: 2025-11-28 08:53:43.840226819 +0000 UTC m=+0.080303660 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, config_id=tripleo_step1, container_name=metrics_qdr, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:53:44 np0005538513.localdomain dnf[97549]: Updating Subscription Management repositories.
Nov 28 08:53:44 np0005538513.localdomain podman[97548]: 2025-11-28 08:53:44.042635273 +0000 UTC m=+0.282712144 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:53:44 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:53:45 np0005538513.localdomain dnf[97549]: Metadata cache refreshed recently.
Nov 28 08:53:45 np0005538513.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 08:53:45 np0005538513.localdomain systemd[1]: Finished dnf makecache.
Nov 28 08:53:45 np0005538513.localdomain systemd[1]: dnf-makecache.service: Consumed 1.910s CPU time.
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: tmp-crun.yXdwRY.mount: Deactivated successfully.
Nov 28 08:53:48 np0005538513.localdomain podman[97578]: 2025-11-28 08:53:48.87400195 +0000 UTC m=+0.111686281 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 28 08:53:48 np0005538513.localdomain podman[97578]: 2025-11-28 08:53:48.908300391 +0000 UTC m=+0.145984702 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_compute)
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: tmp-crun.HeJXjq.mount: Deactivated successfully.
Nov 28 08:53:48 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:53:48 np0005538513.localdomain podman[97579]: 2025-11-28 08:53:48.914290628 +0000 UTC m=+0.151104881 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:53:48 np0005538513.localdomain podman[97581]: 2025-11-28 08:53:48.968605066 +0000 UTC m=+0.198815223 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:53:49 np0005538513.localdomain podman[97580]: 2025-11-28 08:53:49.022709256 +0000 UTC m=+0.255730861 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:53:49 np0005538513.localdomain podman[97581]: 2025-11-28 08:53:49.027559748 +0000 UTC m=+0.257769885 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public)
Nov 28 08:53:49 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:53:49 np0005538513.localdomain podman[97580]: 2025-11-28 08:53:49.080603506 +0000 UTC m=+0.313625131 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, 
config_id=tripleo_step4, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:53:49 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:53:49 np0005538513.localdomain podman[97587]: 2025-11-28 08:53:49.161284467 +0000 UTC m=+0.387529901 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:53:49 np0005538513.localdomain podman[97587]: 2025-11-28 08:53:49.219557677 +0000 UTC m=+0.445803171 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:53:49 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:53:49 np0005538513.localdomain podman[97579]: 2025-11-28 08:53:49.314171584 +0000 UTC m=+0.550985917 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:53:49 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:53:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:53:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:53:56 np0005538513.localdomain systemd[1]: tmp-crun.QuTMbu.mount: Deactivated successfully.
Nov 28 08:53:56 np0005538513.localdomain podman[97694]: 2025-11-28 08:53:56.855974847 +0000 UTC m=+0.092175088 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:53:56 np0005538513.localdomain podman[97694]: 2025-11-28 08:53:56.896646971 +0000 UTC m=+0.132847202 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:53:56 np0005538513.localdomain podman[97694]: unhealthy
Nov 28 08:53:56 np0005538513.localdomain podman[97695]: 2025-11-28 08:53:56.914084857 +0000 UTC m=+0.143302088 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, architecture=x86_64, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller)
Nov 28 08:53:56 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:56 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:53:56 np0005538513.localdomain podman[97695]: 2025-11-28 08:53:56.932499094 +0000 UTC m=+0.161716385 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, version=17.1.12)
Nov 28 08:53:56 np0005538513.localdomain podman[97695]: unhealthy
Nov 28 08:53:56 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:56 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:54:05 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:54:05 np0005538513.localdomain recover_tripleo_nova_virtqemud[97734]: 61397
Nov 28 08:54:05 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:54:05 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:54:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:54:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:54:12 np0005538513.localdomain systemd[1]: tmp-crun.GQjtmg.mount: Deactivated successfully.
Nov 28 08:54:12 np0005538513.localdomain podman[97735]: 2025-11-28 08:54:12.861563326 +0000 UTC m=+0.097356390 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., container_name=iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 28 08:54:12 np0005538513.localdomain podman[97735]: 2025-11-28 08:54:12.898986988 +0000 UTC m=+0.134780042 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:54:12 np0005538513.localdomain systemd[1]: tmp-crun.vPAWTJ.mount: Deactivated successfully.
Nov 28 08:54:12 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:54:12 np0005538513.localdomain podman[97736]: 2025-11-28 08:54:12.917214689 +0000 UTC m=+0.152761375 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, name=rhosp17/openstack-collectd, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 28 08:54:12 np0005538513.localdomain podman[97736]: 2025-11-28 08:54:12.952511694 +0000 UTC m=+0.188058370 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:54:12 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:54:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:54:14 np0005538513.localdomain podman[97771]: 2025-11-28 08:54:14.847268719 +0000 UTC m=+0.083371352 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-type=git)
Nov 28 08:54:15 np0005538513.localdomain podman[97771]: 2025-11-28 08:54:15.037559338 +0000 UTC m=+0.273661981 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64)
Nov 28 08:54:15 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:54:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:54:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:54:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:54:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:54:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:54:19 np0005538513.localdomain podman[97801]: 2025-11-28 08:54:19.863419183 +0000 UTC m=+0.093699194 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, architecture=x86_64, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Nov 28 08:54:19 np0005538513.localdomain systemd[1]: tmp-crun.7lx01n.mount: Deactivated successfully.
Nov 28 08:54:19 np0005538513.localdomain podman[97800]: 2025-11-28 08:54:19.922662989 +0000 UTC m=+0.156674468 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:11:48Z, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute)
Nov 28 08:54:19 np0005538513.localdomain podman[97800]: 2025-11-28 08:54:19.950978555 +0000 UTC m=+0.184990044 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:54:19 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:54:19 np0005538513.localdomain podman[97814]: 2025-11-28 08:54:19.969139694 +0000 UTC m=+0.187173172 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Nov 28 08:54:20 np0005538513.localdomain podman[97808]: 2025-11-28 08:54:20.019297145 +0000 UTC m=+0.241098951 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, url=https://www.redhat.com)
Nov 28 08:54:20 np0005538513.localdomain podman[97802]: 2025-11-28 08:54:20.066293377 +0000 UTC m=+0.293470571 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:54:20 np0005538513.localdomain podman[97814]: 2025-11-28 08:54:20.071414747 +0000 UTC m=+0.289448235 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:54:20 np0005538513.localdomain podman[97808]: 2025-11-28 08:54:20.071852701 +0000 UTC m=+0.293654567 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z)
Nov 28 08:54:20 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:54:20 np0005538513.localdomain podman[97802]: 2025-11-28 08:54:20.12226836 +0000 UTC m=+0.349445554 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.openshift.expose-services=, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:54:20 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:54:20 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:54:20 np0005538513.localdomain podman[97801]: 2025-11-28 08:54:20.227541906 +0000 UTC m=+0.457821967 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:54:20 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:54:22 np0005538513.localdomain sudo[97915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:54:22 np0005538513.localdomain sudo[97915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:54:22 np0005538513.localdomain sudo[97915]: pam_unix(sudo:session): session closed for user root
Nov 28 08:54:22 np0005538513.localdomain sudo[97930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:54:22 np0005538513.localdomain sudo[97930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:54:22 np0005538513.localdomain sudo[97930]: pam_unix(sudo:session): session closed for user root
Nov 28 08:54:23 np0005538513.localdomain sudo[97977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:54:23 np0005538513.localdomain sudo[97977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:54:23 np0005538513.localdomain sudo[97977]: pam_unix(sudo:session): session closed for user root
Nov 28 08:54:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:54:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:54:27 np0005538513.localdomain podman[97992]: 2025-11-28 08:54:27.855732399 +0000 UTC m=+0.089487423 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent)
Nov 28 08:54:27 np0005538513.localdomain podman[97993]: 2025-11-28 08:54:27.904528968 +0000 UTC m=+0.138207310 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Nov 28 08:54:27 np0005538513.localdomain podman[97993]: 2025-11-28 08:54:27.920332992 +0000 UTC m=+0.154011294 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, 
Inc., com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, vcs-type=git)
Nov 28 08:54:27 np0005538513.localdomain podman[97993]: unhealthy
Nov 28 08:54:27 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:27 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:54:27 np0005538513.localdomain podman[97992]: 2025-11-28 08:54:27.932225464 +0000 UTC m=+0.165980528 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:54:27 np0005538513.localdomain podman[97992]: unhealthy
Nov 28 08:54:27 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:27 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:54:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:54:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:54:43 np0005538513.localdomain podman[98032]: 2025-11-28 08:54:43.854785572 +0000 UTC m=+0.090732322 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:54:43 np0005538513.localdomain podman[98032]: 2025-11-28 08:54:43.89237954 +0000 UTC m=+0.128326230 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, config_id=tripleo_step3, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044)
Nov 28 08:54:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:54:43 np0005538513.localdomain podman[98031]: 2025-11-28 08:54:43.900267437 +0000 UTC m=+0.139023325 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:54:43 np0005538513.localdomain podman[98031]: 2025-11-28 08:54:43.983439351 +0000 UTC m=+0.222195209 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, distribution-scope=public, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid)
Nov 28 08:54:43 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:54:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:54:45 np0005538513.localdomain systemd[1]: tmp-crun.fozlxL.mount: Deactivated successfully.
Nov 28 08:54:45 np0005538513.localdomain podman[98070]: 2025-11-28 08:54:45.85153231 +0000 UTC m=+0.090082322 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, version=17.1.12)
Nov 28 08:54:46 np0005538513.localdomain podman[98070]: 2025-11-28 08:54:46.053324499 +0000 UTC m=+0.291874471 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1)
Nov 28 08:54:46 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:54:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:54:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:54:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:54:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:54:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:54:50 np0005538513.localdomain podman[98100]: 2025-11-28 08:54:50.862606606 +0000 UTC m=+0.100832268 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
vcs-type=git, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:54:50 np0005538513.localdomain podman[98101]: 2025-11-28 08:54:50.918372013 +0000 UTC m=+0.152954962 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:54:50 np0005538513.localdomain podman[98100]: 2025-11-28 08:54:50.94766737 +0000 UTC m=+0.185893012 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:11:48Z, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:54:51 np0005538513.localdomain podman[98103]: 2025-11-28 08:54:51.024687892 +0000 UTC m=+0.252279961 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-compute)
Nov 28 08:54:51 np0005538513.localdomain podman[98102]: 2025-11-28 08:54:50.991672648 +0000 UTC m=+0.223527691 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:54:51 np0005538513.localdomain podman[98102]: 2025-11-28 08:54:51.073286544 +0000 UTC m=+0.305141597 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:54:51 np0005538513.localdomain podman[98103]: 2025-11-28 08:54:51.083560026 +0000 UTC m=+0.311152085 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044)
Nov 28 08:54:51 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:54:51 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:54:51 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:54:51 np0005538513.localdomain podman[98109]: 2025-11-28 08:54:51.122193155 +0000 UTC m=+0.346800511 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:54:51 np0005538513.localdomain podman[98109]: 2025-11-28 08:54:51.152506914 +0000 UTC m=+0.377114330 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:54:51 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:54:51 np0005538513.localdomain podman[98101]: 2025-11-28 08:54:51.281474473 +0000 UTC m=+0.516057482 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vcs-type=git)
Nov 28 08:54:51 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:54:51 np0005538513.localdomain systemd[1]: tmp-crun.BQL1OG.mount: Deactivated successfully.
Nov 28 08:54:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:54:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:54:58 np0005538513.localdomain podman[98220]: 2025-11-28 08:54:58.839728036 +0000 UTC m=+0.081247755 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, 
tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, architecture=x86_64, container_name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044)
Nov 28 08:54:58 np0005538513.localdomain podman[98220]: 2025-11-28 08:54:58.881419762 +0000 UTC m=+0.122939481 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:54:58 np0005538513.localdomain podman[98220]: unhealthy
Nov 28 08:54:58 np0005538513.localdomain systemd[1]: tmp-crun.tskW4s.mount: Deactivated successfully.
Nov 28 08:54:58 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:58 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:54:58 np0005538513.localdomain podman[98221]: 2025-11-28 08:54:58.903080421 +0000 UTC m=+0.140930084 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1)
Nov 28 08:54:58 np0005538513.localdomain podman[98221]: 2025-11-28 08:54:58.946484069 +0000 UTC m=+0.184333762 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044)
Nov 28 08:54:58 np0005538513.localdomain podman[98221]: unhealthy
Nov 28 08:54:58 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:58 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:55:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:55:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:55:14 np0005538513.localdomain systemd[1]: tmp-crun.rN6qi5.mount: Deactivated successfully.
Nov 28 08:55:14 np0005538513.localdomain podman[98258]: 2025-11-28 08:55:14.859711553 +0000 UTC m=+0.088704479 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:55:14 np0005538513.localdomain podman[98258]: 2025-11-28 08:55:14.893560973 +0000 UTC m=+0.122553959 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack 
Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z)
Nov 28 08:55:14 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:55:14 np0005538513.localdomain podman[98259]: 2025-11-28 08:55:14.908934425 +0000 UTC m=+0.135754232 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Nov 28 08:55:14 np0005538513.localdomain podman[98259]: 2025-11-28 08:55:14.988758985 +0000 UTC m=+0.215578772 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 
17.1_20251118.1, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container)
Nov 28 08:55:15 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:55:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:55:16 np0005538513.localdomain systemd[1]: tmp-crun.StDRY6.mount: Deactivated successfully.
Nov 28 08:55:16 np0005538513.localdomain podman[98297]: 2025-11-28 08:55:16.849721761 +0000 UTC m=+0.082776172 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., 
container_name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Nov 28 08:55:17 np0005538513.localdomain podman[98297]: 2025-11-28 08:55:17.040545177 +0000 UTC m=+0.273599578 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:55:17 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: tmp-crun.JitIoh.mount: Deactivated successfully.
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: tmp-crun.YRB9sN.mount: Deactivated successfully.
Nov 28 08:55:21 np0005538513.localdomain podman[98328]: 2025-11-28 08:55:21.885071608 +0000 UTC m=+0.111565145 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron)
Nov 28 08:55:21 np0005538513.localdomain podman[98328]: 2025-11-28 08:55:21.913510289 +0000 UTC m=+0.140003796 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:55:21 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:55:21 np0005538513.localdomain podman[98334]: 2025-11-28 08:55:21.963729161 +0000 UTC m=+0.183291061 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:55:21 np0005538513.localdomain podman[98327]: 2025-11-28 08:55:21.918634329 +0000 UTC m=+0.147085967 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git)
Nov 28 08:55:21 np0005538513.localdomain podman[98334]: 2025-11-28 08:55:21.992357558 +0000 UTC m=+0.211919538 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 28 08:55:22 np0005538513.localdomain podman[98333]: 2025-11-28 08:55:21.850773853 +0000 UTC m=+0.078188558 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., container_name=nova_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 28 08:55:22 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:55:22 np0005538513.localdomain podman[98333]: 2025-11-28 08:55:22.034287091 +0000 UTC m=+0.261701836 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:55:22 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:55:22 np0005538513.localdomain podman[98326]: 2025-11-28 08:55:21.895616079 +0000 UTC m=+0.131145428 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Nov 28 08:55:22 np0005538513.localdomain podman[98326]: 2025-11-28 08:55:22.078201646 +0000 UTC m=+0.313730985 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:55:22 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:55:22 np0005538513.localdomain podman[98327]: 2025-11-28 08:55:22.27537204 +0000 UTC m=+0.503823738 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:55:22 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:55:23 np0005538513.localdomain sudo[98443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:55:23 np0005538513.localdomain sudo[98443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:55:23 np0005538513.localdomain sudo[98443]: pam_unix(sudo:session): session closed for user root
Nov 28 08:55:23 np0005538513.localdomain sudo[98458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:55:23 np0005538513.localdomain sudo[98458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:55:24 np0005538513.localdomain sudo[98458]: pam_unix(sudo:session): session closed for user root
Nov 28 08:55:24 np0005538513.localdomain sudo[98505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:55:24 np0005538513.localdomain sudo[98505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:55:24 np0005538513.localdomain sudo[98505]: pam_unix(sudo:session): session closed for user root
Nov 28 08:55:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:55:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:55:29 np0005538513.localdomain podman[98520]: 2025-11-28 08:55:29.85975239 +0000 UTC m=+0.091063532 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, version=17.1.12, architecture=x86_64, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:55:29 np0005538513.localdomain systemd[1]: tmp-crun.VCpchZ.mount: Deactivated successfully.
Nov 28 08:55:29 np0005538513.localdomain podman[98521]: 2025-11-28 08:55:29.914294348 +0000 UTC m=+0.142364989 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:55:29 np0005538513.localdomain podman[98520]: 2025-11-28 08:55:29.927343517 +0000 UTC m=+0.158654649 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, release=1761123044)
Nov 28 08:55:29 np0005538513.localdomain podman[98520]: unhealthy
Nov 28 08:55:29 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:55:29 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:55:29 np0005538513.localdomain podman[98521]: 2025-11-28 08:55:29.957495351 +0000 UTC m=+0.185565992 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:55:29 np0005538513.localdomain podman[98521]: unhealthy
Nov 28 08:55:29 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:55:29 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:55:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:55:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:55:45 np0005538513.localdomain podman[98560]: 2025-11-28 08:55:45.863983682 +0000 UTC m=+0.090688321 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true)
Nov 28 08:55:45 np0005538513.localdomain podman[98559]: 2025-11-28 08:55:45.910107257 +0000 UTC m=+0.140932484 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 28 08:55:45 np0005538513.localdomain podman[98559]: 2025-11-28 08:55:45.919417428 +0000 UTC m=+0.150242645 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:55:45 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:55:45 np0005538513.localdomain podman[98560]: 2025-11-28 08:55:45.977505197 +0000 UTC m=+0.204209826 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:55:45 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:55:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:55:47 np0005538513.localdomain podman[98598]: 2025-11-28 08:55:47.844589895 +0000 UTC m=+0.078614443 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:55:48 np0005538513.localdomain podman[98598]: 2025-11-28 08:55:48.033406738 +0000 UTC m=+0.267431306 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Nov 28 08:55:48 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:55:52 np0005538513.localdomain podman[98628]: 2025-11-28 08:55:52.836612304 +0000 UTC m=+0.077536529 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:55:52 np0005538513.localdomain podman[98629]: 2025-11-28 08:55:52.852522742 +0000 UTC m=+0.088021757 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:55:52 np0005538513.localdomain podman[98637]: 2025-11-28 08:55:52.888076496 +0000 UTC m=+0.117367117 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 08:55:52 np0005538513.localdomain podman[98628]: 2025-11-28 08:55:52.896366006 +0000 UTC m=+0.137290251 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:55:52 np0005538513.localdomain podman[98636]: 2025-11-28 08:55:52.936866023 +0000 UTC m=+0.169117436 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:55:52 np0005538513.localdomain podman[98637]: 2025-11-28 08:55:52.941288972 +0000 UTC m=+0.170579553 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, 
config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi)
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:55:52 np0005538513.localdomain podman[98636]: 2025-11-28 08:55:52.986530259 +0000 UTC m=+0.218781702 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, version=17.1.12)
Nov 28 08:55:52 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:55:53 np0005538513.localdomain podman[98630]: 2025-11-28 08:55:53.056545262 +0000 UTC m=+0.288979391 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, container_name=logrotate_crond)
Nov 28 08:55:53 np0005538513.localdomain podman[98630]: 2025-11-28 08:55:53.068523576 +0000 UTC m=+0.300957745 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack 
Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:55:53 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:55:53 np0005538513.localdomain podman[98629]: 2025-11-28 08:55:53.19028507 +0000 UTC m=+0.425784115 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Nov 28 08:55:53 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:55:53 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:55:53 np0005538513.localdomain recover_tripleo_nova_virtqemud[98752]: 61397
Nov 28 08:55:53 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:55:53 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:56:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:56:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:56:00 np0005538513.localdomain podman[98754]: 2025-11-28 08:56:00.845894159 +0000 UTC m=+0.080369958 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:56:00 np0005538513.localdomain podman[98754]: 2025-11-28 08:56:00.860683432 +0000 UTC m=+0.095159241 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ovn-controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:56:00 np0005538513.localdomain podman[98753]: 2025-11-28 08:56:00.89987293 +0000 UTC m=+0.134919807 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:56:00 np0005538513.localdomain podman[98754]: unhealthy
Nov 28 08:56:00 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:56:00 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:56:00 np0005538513.localdomain podman[98753]: 2025-11-28 08:56:00.966607369 +0000 UTC m=+0.201654246 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z)
Nov 28 08:56:00 np0005538513.localdomain podman[98753]: unhealthy
Nov 28 08:56:00 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:56:00 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:56:14 np0005538513.localdomain sshd[35710]: Received disconnect from 192.168.122.100 port 35712:11: disconnected by user
Nov 28 08:56:14 np0005538513.localdomain sshd[35710]: Disconnected from user tripleo-admin 192.168.122.100 port 35712
Nov 28 08:56:14 np0005538513.localdomain sshd[35690]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 28 08:56:14 np0005538513.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Nov 28 08:56:14 np0005538513.localdomain systemd[1]: session-28.scope: Consumed 7min 6.514s CPU time.
Nov 28 08:56:14 np0005538513.localdomain systemd-logind[764]: Session 28 logged out. Waiting for processes to exit.
Nov 28 08:56:14 np0005538513.localdomain systemd-logind[764]: Removed session 28.
Nov 28 08:56:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:56:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:56:16 np0005538513.localdomain systemd[1]: tmp-crun.NIguys.mount: Deactivated successfully.
Nov 28 08:56:16 np0005538513.localdomain podman[98796]: 2025-11-28 08:56:16.903245988 +0000 UTC m=+0.134323738 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=)
Nov 28 08:56:16 np0005538513.localdomain podman[98797]: 2025-11-28 08:56:16.868784099 +0000 UTC m=+0.098351261 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack 
Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, container_name=collectd)
Nov 28 08:56:16 np0005538513.localdomain podman[98796]: 2025-11-28 08:56:16.941433074 +0000 UTC m=+0.172510814 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:56:16 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:56:16 np0005538513.localdomain podman[98797]: 2025-11-28 08:56:16.954361379 +0000 UTC m=+0.183928481 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, version=17.1.12, vcs-type=git, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, container_name=collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:56:16 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:56:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:56:18 np0005538513.localdomain systemd[1]: tmp-crun.gTGWVQ.mount: Deactivated successfully.
Nov 28 08:56:18 np0005538513.localdomain podman[98835]: 2025-11-28 08:56:18.851885631 +0000 UTC m=+0.089612597 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, vcs-type=git)
Nov 28 08:56:19 np0005538513.localdomain podman[98835]: 2025-11-28 08:56:19.069462305 +0000 UTC m=+0.307189261 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:56:19 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:56:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:56:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:56:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:56:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:56:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:56:23 np0005538513.localdomain podman[98865]: 2025-11-28 08:56:23.849090705 +0000 UTC m=+0.084217598 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container)
Nov 28 08:56:23 np0005538513.localdomain systemd[1]: tmp-crun.BiH09W.mount: Deactivated successfully.
Nov 28 08:56:23 np0005538513.localdomain podman[98863]: 2025-11-28 08:56:23.91760077 +0000 UTC m=+0.155683576 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:56:23 np0005538513.localdomain podman[98863]: 2025-11-28 08:56:23.945335049 +0000 UTC m=+0.183417825 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public)
Nov 28 08:56:23 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:56:23 np0005538513.localdomain podman[98872]: 2025-11-28 08:56:23.961929609 +0000 UTC m=+0.184941063 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Nov 28 08:56:24 np0005538513.localdomain podman[98864]: 2025-11-28 08:56:24.012992957 +0000 UTC m=+0.248440790 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:56:24 np0005538513.localdomain podman[98865]: 2025-11-28 08:56:24.06894952 +0000 UTC m=+0.304076393 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:56:24 np0005538513.localdomain podman[98866]: 2025-11-28 08:56:23.884309898 +0000 UTC m=+0.111322547 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:56:24 np0005538513.localdomain podman[98872]: 2025-11-28 08:56:24.119348358 +0000 UTC m=+0.342359812 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:56:24 np0005538513.localdomain podman[98866]: 2025-11-28 08:56:24.205637391 +0000 UTC m=+0.432650110 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, version=17.1.12)
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:56:24 np0005538513.localdomain podman[98864]: 2025-11-28 08:56:24.406360657 +0000 UTC m=+0.641808470 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, 
konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target)
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Activating special unit Exit the Session...
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Removed slice User Background Tasks Slice.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped target Main User Target.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped target Basic System.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped target Paths.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped target Sockets.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped target Timers.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Closed D-Bus User Message Bus Socket.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Removed slice User Application Slice.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Reached target Shutdown.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Finished Exit the Session.
Nov 28 08:56:24 np0005538513.localdomain systemd[35694]: Reached target Exit the Session.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: user@1003.service: Consumed 5.344s CPU time, read 0B from disk, written 7.0K to disk.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: user-1003.slice: Consumed 7min 11.886s CPU time.
Nov 28 08:56:24 np0005538513.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 28 08:56:25 np0005538513.localdomain sudo[98979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:56:25 np0005538513.localdomain sudo[98979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:56:25 np0005538513.localdomain sudo[98979]: pam_unix(sudo:session): session closed for user root
Nov 28 08:56:25 np0005538513.localdomain sudo[98994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:56:25 np0005538513.localdomain sudo[98994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:56:25 np0005538513.localdomain sudo[98994]: pam_unix(sudo:session): session closed for user root
Nov 28 08:56:26 np0005538513.localdomain sudo[99041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:56:26 np0005538513.localdomain sudo[99041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:56:26 np0005538513.localdomain sudo[99041]: pam_unix(sudo:session): session closed for user root
Nov 28 08:56:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:56:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:56:31 np0005538513.localdomain podman[99057]: 2025-11-28 08:56:31.847546193 +0000 UTC m=+0.086070947 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, architecture=x86_64)
Nov 28 08:56:31 np0005538513.localdomain podman[99057]: 2025-11-28 08:56:31.869346646 +0000 UTC m=+0.107871370 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:56:31 np0005538513.localdomain podman[99057]: unhealthy
Nov 28 08:56:31 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:56:31 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:56:31 np0005538513.localdomain podman[99056]: 2025-11-28 08:56:31.953987386 +0000 UTC m=+0.193767229 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:56:31 np0005538513.localdomain podman[99056]: 2025-11-28 08:56:31.997472607 +0000 UTC m=+0.237252370 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:56:32 np0005538513.localdomain podman[99056]: unhealthy
Nov 28 08:56:32 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:56:32 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:56:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:56:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:56:47 np0005538513.localdomain podman[99097]: 2025-11-28 08:56:47.846362677 +0000 UTC m=+0.082984620 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, distribution-scope=public, 
config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Nov 28 08:56:47 np0005538513.localdomain podman[99097]: 2025-11-28 08:56:47.86052928 +0000 UTC m=+0.097151223 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:56:47 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:56:47 np0005538513.localdomain podman[99098]: 2025-11-28 08:56:47.947807693 +0000 UTC m=+0.181970659 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 28 08:56:47 np0005538513.localdomain podman[99098]: 2025-11-28 08:56:47.960646836 +0000 UTC m=+0.194809792 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044)
Nov 28 08:56:47 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:56:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:56:49 np0005538513.localdomain podman[99134]: 2025-11-28 08:56:49.846777122 +0000 UTC m=+0.080585815 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:56:50 np0005538513.localdomain podman[99134]: 2025-11-28 08:56:50.07245332 +0000 UTC m=+0.306261973 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, tcib_managed=true, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:56:50 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:56:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:56:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:56:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:56:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:56:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:56:54 np0005538513.localdomain podman[99166]: 2025-11-28 08:56:54.858752336 +0000 UTC m=+0.092889160 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team)
Nov 28 08:56:54 np0005538513.localdomain systemd[1]: tmp-crun.FSz5V7.mount: Deactivated successfully.
Nov 28 08:56:54 np0005538513.localdomain podman[99166]: 2025-11-28 08:56:54.922385359 +0000 UTC m=+0.156522223 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=logrotate_crond, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron)
Nov 28 08:56:54 np0005538513.localdomain podman[99168]: 2025-11-28 08:56:54.922583525 +0000 UTC m=+0.145688843 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:56:54 np0005538513.localdomain podman[99168]: 2025-11-28 08:56:54.953450692 +0000 UTC m=+0.176555970 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:56:54 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:56:55 np0005538513.localdomain podman[99167]: 2025-11-28 08:56:55.013377708 +0000 UTC m=+0.243004950 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:56:55 np0005538513.localdomain podman[99165]: 2025-11-28 08:56:54.974790579 +0000 UTC m=+0.208590442 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public)
Nov 28 08:56:55 np0005538513.localdomain podman[99167]: 2025-11-28 08:56:55.070691613 +0000 UTC m=+0.300318845 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git)
Nov 28 08:56:55 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:56:55 np0005538513.localdomain podman[99164]: 2025-11-28 08:56:55.073384388 +0000 UTC m=+0.308202183 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:56:55 np0005538513.localdomain podman[99164]: 2025-11-28 08:56:55.157393729 +0000 UTC m=+0.392211544 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:56:55 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:56:55 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:56:55 np0005538513.localdomain podman[99165]: 2025-11-28 08:56:55.342098923 +0000 UTC m=+0.575898866 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:56:55 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:56:55 np0005538513.localdomain systemd[1]: tmp-crun.P1RemA.mount: Deactivated successfully.
Nov 28 08:57:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:57:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:57:02 np0005538513.localdomain podman[99287]: 2025-11-28 08:57:02.852125782 +0000 UTC m=+0.084781686 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Nov 28 08:57:02 np0005538513.localdomain podman[99287]: 2025-11-28 08:57:02.898370811 +0000 UTC m=+0.131026715 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 08:57:02 np0005538513.localdomain podman[99287]: unhealthy
Nov 28 08:57:02 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:02 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:57:02 np0005538513.localdomain podman[99288]: 2025-11-28 08:57:02.902546661 +0000 UTC m=+0.132088767 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, vcs-type=git, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:57:02 np0005538513.localdomain podman[99288]: 2025-11-28 08:57:02.986974246 +0000 UTC m=+0.216516362 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64)
Nov 28 08:57:02 np0005538513.localdomain podman[99288]: unhealthy
Nov 28 08:57:02 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:02 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:57:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:57:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:57:18 np0005538513.localdomain systemd[1]: tmp-crun.aPNwdz.mount: Deactivated successfully.
Nov 28 08:57:18 np0005538513.localdomain podman[99325]: 2025-11-28 08:57:18.843236906 +0000 UTC m=+0.080662957 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, release=1761123044, architecture=x86_64, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:57:18 np0005538513.localdomain podman[99325]: 2025-11-28 08:57:18.884552119 +0000 UTC m=+0.121978250 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, 
build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Nov 28 08:57:18 np0005538513.localdomain systemd[1]: tmp-crun.Q05erx.mount: Deactivated successfully.
Nov 28 08:57:18 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:57:18 np0005538513.localdomain podman[99326]: 2025-11-28 08:57:18.912098882 +0000 UTC m=+0.147866212 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible)
Nov 28 08:57:18 np0005538513.localdomain podman[99326]: 2025-11-28 08:57:18.926444861 +0000 UTC m=+0.162212261 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:57:18 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:57:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:57:20 np0005538513.localdomain systemd[1]: tmp-crun.hHzywW.mount: Deactivated successfully.
Nov 28 08:57:20 np0005538513.localdomain podman[99363]: 2025-11-28 08:57:20.849734021 +0000 UTC m=+0.087975586 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:57:21 np0005538513.localdomain podman[99363]: 2025-11-28 08:57:21.04162928 +0000 UTC m=+0.279870865 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vcs-type=git, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Nov 28 08:57:21 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:57:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:57:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:57:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:57:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:57:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:57:25 np0005538513.localdomain systemd[1]: tmp-crun.uXWe86.mount: Deactivated successfully.
Nov 28 08:57:25 np0005538513.localdomain podman[99394]: 2025-11-28 08:57:25.866313191 +0000 UTC m=+0.097341150 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:57:25 np0005538513.localdomain systemd[1]: tmp-crun.VPghA6.mount: Deactivated successfully.
Nov 28 08:57:25 np0005538513.localdomain podman[99393]: 2025-11-28 08:57:25.957885848 +0000 UTC m=+0.193934474 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:57:25 np0005538513.localdomain podman[99395]: 2025-11-28 08:57:25.93110319 +0000 UTC m=+0.160802918 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, container_name=logrotate_crond)
Nov 28 08:57:25 np0005538513.localdomain podman[99393]: 2025-11-28 08:57:25.993498093 +0000 UTC m=+0.229546759 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:57:26 np0005538513.localdomain podman[99395]: 2025-11-28 08:57:26.010731093 +0000 UTC m=+0.240430781 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com)
Nov 28 08:57:26 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:57:26 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:57:26 np0005538513.localdomain podman[99402]: 2025-11-28 08:57:25.933459174 +0000 UTC m=+0.152748125 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git)
Nov 28 08:57:26 np0005538513.localdomain podman[99402]: 2025-11-28 08:57:26.063503966 +0000 UTC m=+0.282792917 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:57:26 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:57:26 np0005538513.localdomain podman[99401]: 2025-11-28 08:57:26.086233957 +0000 UTC m=+0.308040517 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:57:26 np0005538513.localdomain podman[99401]: 2025-11-28 08:57:26.143554253 +0000 UTC m=+0.365360813 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:57:26 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:57:26 np0005538513.localdomain podman[99394]: 2025-11-28 08:57:26.230468165 +0000 UTC m=+0.461496164 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:57:26 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:57:26 np0005538513.localdomain sudo[99511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:57:26 np0005538513.localdomain sudo[99511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:57:26 np0005538513.localdomain sudo[99511]: pam_unix(sudo:session): session closed for user root
Nov 28 08:57:26 np0005538513.localdomain sudo[99526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:57:26 np0005538513.localdomain sudo[99526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:57:27 np0005538513.localdomain sudo[99526]: pam_unix(sudo:session): session closed for user root
Nov 28 08:57:28 np0005538513.localdomain sudo[99574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:57:28 np0005538513.localdomain sudo[99574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:57:28 np0005538513.localdomain sudo[99574]: pam_unix(sudo:session): session closed for user root
Nov 28 08:57:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:57:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:57:33 np0005538513.localdomain podman[99589]: 2025-11-28 08:57:33.847278773 +0000 UTC m=+0.082663450 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:57:33 np0005538513.localdomain systemd[1]: tmp-crun.SwpesI.mount: Deactivated successfully.
Nov 28 08:57:33 np0005538513.localdomain podman[99589]: 2025-11-28 08:57:33.891493127 +0000 UTC m=+0.126877784 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:57:33 np0005538513.localdomain podman[99589]: unhealthy
Nov 28 08:57:33 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:33 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:57:33 np0005538513.localdomain podman[99590]: 2025-11-28 08:57:33.906383894 +0000 UTC m=+0.141313527 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:57:33 np0005538513.localdomain podman[99590]: 2025-11-28 08:57:33.988087932 +0000 UTC m=+0.223017495 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 28 08:57:33 np0005538513.localdomain podman[99590]: unhealthy
Nov 28 08:57:34 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:34 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:57:37 np0005538513.localdomain sshd[99628]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:57:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:57:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:57:49 np0005538513.localdomain systemd[1]: tmp-crun.T6NWc3.mount: Deactivated successfully.
Nov 28 08:57:49 np0005538513.localdomain podman[99630]: 2025-11-28 08:57:49.866827088 +0000 UTC m=+0.100043324 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid)
Nov 28 08:57:49 np0005538513.localdomain podman[99630]: 2025-11-28 08:57:49.906613404 +0000 UTC m=+0.139829700 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vcs-type=git, architecture=x86_64)
Nov 28 08:57:49 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:57:49 np0005538513.localdomain podman[99631]: 2025-11-28 08:57:49.951943973 +0000 UTC m=+0.184254610 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:57:49 np0005538513.localdomain podman[99631]: 2025-11-28 08:57:49.989461068 +0000 UTC m=+0.221771706 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-collectd)
Nov 28 08:57:50 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:57:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:57:51 np0005538513.localdomain podman[99670]: 2025-11-28 08:57:51.858844329 +0000 UTC m=+0.087183491 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.)
Nov 28 08:57:52 np0005538513.localdomain podman[99670]: 2025-11-28 08:57:52.084566898 +0000 UTC m=+0.312906040 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:57:52 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:57:55 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:57:55 np0005538513.localdomain recover_tripleo_nova_virtqemud[99700]: 61397
Nov 28 08:57:55 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:57:55 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:57:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:57:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:57:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:57:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:57:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:57:56 np0005538513.localdomain podman[99702]: 2025-11-28 08:57:56.860220361 +0000 UTC m=+0.091826656 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Nov 28 08:57:56 np0005538513.localdomain podman[99701]: 2025-11-28 08:57:56.925118724 +0000 UTC m=+0.156732029 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, url=https://www.redhat.com, 
tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:57:56 np0005538513.localdomain podman[99703]: 2025-11-28 08:57:56.986222187 +0000 UTC m=+0.215760048 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, tcib_managed=true, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:57:57 np0005538513.localdomain podman[99704]: 2025-11-28 08:57:57.025850219 +0000 UTC m=+0.248294237 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Nov 28 08:57:57 np0005538513.localdomain podman[99708]: 2025-11-28 08:57:57.079559471 +0000 UTC m=+0.300179392 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, vcs-type=git)
Nov 28 08:57:57 np0005538513.localdomain podman[99704]: 2025-11-28 08:57:57.089545263 +0000 UTC m=+0.311989331 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-nova-compute)
Nov 28 08:57:57 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:57:57 np0005538513.localdomain podman[99701]: 2025-11-28 08:57:57.108630761 +0000 UTC m=+0.340244076 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute)
Nov 28 08:57:57 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:57:57 np0005538513.localdomain podman[99703]: 2025-11-28 08:57:57.145705241 +0000 UTC m=+0.375243092 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:57:57 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:57:57 np0005538513.localdomain podman[99708]: 2025-11-28 08:57:57.162555469 +0000 UTC m=+0.383175390 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:57:57 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:57:57 np0005538513.localdomain podman[99702]: 2025-11-28 08:57:57.274385631 +0000 UTC m=+0.505991906 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:57:57 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:57:57 np0005538513.localdomain systemd[1]: tmp-crun.Lq8Ha0.mount: Deactivated successfully.
Nov 28 08:58:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:58:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:58:04 np0005538513.localdomain podman[99818]: 2025-11-28 08:58:04.860936319 +0000 UTC m=+0.070044905 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=)
Nov 28 08:58:04 np0005538513.localdomain systemd[1]: tmp-crun.a3fAnb.mount: Deactivated successfully.
Nov 28 08:58:04 np0005538513.localdomain podman[99819]: 2025-11-28 08:58:04.936254148 +0000 UTC m=+0.140142420 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:58:04 np0005538513.localdomain podman[99818]: 2025-11-28 08:58:04.953936192 +0000 UTC m=+0.163044818 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:58:04 np0005538513.localdomain podman[99818]: unhealthy
Nov 28 08:58:04 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:04 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:58:04 np0005538513.localdomain podman[99819]: 2025-11-28 08:58:04.982796616 +0000 UTC m=+0.186684958 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:58:04 np0005538513.localdomain podman[99819]: unhealthy
Nov 28 08:58:04 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:04 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:58:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:58:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:58:20 np0005538513.localdomain systemd[1]: tmp-crun.C0kunO.mount: Deactivated successfully.
Nov 28 08:58:20 np0005538513.localdomain podman[99859]: 2025-11-28 08:58:20.857949681 +0000 UTC m=+0.093779179 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, container_name=iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc.)
Nov 28 08:58:20 np0005538513.localdomain podman[99860]: 2025-11-28 08:58:20.906232123 +0000 UTC m=+0.139208941 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:58:20 np0005538513.localdomain podman[99860]: 2025-11-28 08:58:20.917423253 +0000 UTC m=+0.150400121 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z)
Nov 28 08:58:20 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:58:20 np0005538513.localdomain podman[99859]: 2025-11-28 08:58:20.969117491 +0000 UTC m=+0.204947009 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 28 08:58:20 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:58:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:58:22 np0005538513.localdomain systemd[1]: tmp-crun.bbyEaT.mount: Deactivated successfully.
Nov 28 08:58:22 np0005538513.localdomain podman[99899]: 2025-11-28 08:58:22.839596528 +0000 UTC m=+0.081402620 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, version=17.1.12, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:58:23 np0005538513.localdomain podman[99899]: 2025-11-28 08:58:23.043748591 +0000 UTC m=+0.285554643 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.4, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:58:23 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:58:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:58:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:58:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:58:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:58:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:58:27 np0005538513.localdomain systemd[1]: tmp-crun.tbk1DP.mount: Deactivated successfully.
Nov 28 08:58:27 np0005538513.localdomain podman[99931]: 2025-11-28 08:58:27.913321624 +0000 UTC m=+0.136974760 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team)
Nov 28 08:58:27 np0005538513.localdomain podman[99942]: 2025-11-28 08:58:27.923142262 +0000 UTC m=+0.139660145 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:58:27 np0005538513.localdomain podman[99929]: 2025-11-28 08:58:27.841899578 +0000 UTC m=+0.072658266 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:58:27 np0005538513.localdomain podman[99930]: 2025-11-28 08:58:27.964114075 +0000 UTC m=+0.187962537 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:58:27 np0005538513.localdomain podman[99928]: 2025-11-28 08:58:27.873110795 +0000 UTC m=+0.104216354 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:58:27 np0005538513.localdomain podman[99931]: 2025-11-28 08:58:27.997863862 +0000 UTC m=+0.221516968 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, tcib_managed=true, version=17.1.12, container_name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com)
Nov 28 08:58:28 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:58:28 np0005538513.localdomain podman[99930]: 2025-11-28 08:58:28.026836999 +0000 UTC m=+0.250685481 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64)
Nov 28 08:58:28 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:58:28 np0005538513.localdomain podman[99942]: 2025-11-28 08:58:28.049785448 +0000 UTC m=+0.266303361 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:58:28 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:58:28 np0005538513.localdomain podman[99928]: 2025-11-28 08:58:28.105534824 +0000 UTC m=+0.336640343 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Nov 28 08:58:28 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:58:28 np0005538513.localdomain podman[99929]: 2025-11-28 08:58:28.214157826 +0000 UTC m=+0.444916544 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 nova-compute, release=1761123044, container_name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:58:28 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:58:28 np0005538513.localdomain sudo[100040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:58:28 np0005538513.localdomain sudo[100040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:58:28 np0005538513.localdomain sudo[100040]: pam_unix(sudo:session): session closed for user root
Nov 28 08:58:28 np0005538513.localdomain sudo[100055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:58:28 np0005538513.localdomain sudo[100055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:58:29 np0005538513.localdomain sudo[100055]: pam_unix(sudo:session): session closed for user root
Nov 28 08:58:29 np0005538513.localdomain sudo[100102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:58:29 np0005538513.localdomain sudo[100102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:58:29 np0005538513.localdomain sudo[100102]: pam_unix(sudo:session): session closed for user root
Nov 28 08:58:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:58:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:58:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:58:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:58:35 np0005538513.localdomain podman[100117]: 2025-11-28 08:58:35.854275304 +0000 UTC m=+0.089398411 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack 
TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:58:35 np0005538513.localdomain podman[100117]: 2025-11-28 08:58:35.898663074 +0000 UTC m=+0.133786221 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc.)
Nov 28 08:58:35 np0005538513.localdomain podman[100117]: unhealthy
Nov 28 08:58:35 np0005538513.localdomain podman[100118]: 2025-11-28 08:58:35.911489165 +0000 UTC m=+0.143559946 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:58:35 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:35 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:58:35 np0005538513.localdomain podman[100118]: 2025-11-28 08:58:35.956547016 +0000 UTC m=+0.188617767 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:58:35 np0005538513.localdomain podman[100118]: unhealthy
Nov 28 08:58:35 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:35 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:58:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:58:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.3 total, 600.0 interval
                                                          Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:58:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:58:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:58:51 np0005538513.localdomain podman[100155]: 2025-11-28 08:58:51.852176664 +0000 UTC m=+0.080956837 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, container_name=iscsid, release=1761123044)
Nov 28 08:58:51 np0005538513.localdomain systemd[1]: tmp-crun.eEJEy5.mount: Deactivated successfully.
Nov 28 08:58:51 np0005538513.localdomain podman[100156]: 2025-11-28 08:58:51.915260599 +0000 UTC m=+0.141178802 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, container_name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:58:51 np0005538513.localdomain podman[100155]: 2025-11-28 08:58:51.940138988 +0000 UTC m=+0.168919211 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, version=17.1.12, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, 
com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:58:51 np0005538513.localdomain podman[100156]: 2025-11-28 08:58:51.94946301 +0000 UTC m=+0.175381223 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:58:51 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:58:51 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:58:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:58:53 np0005538513.localdomain systemd[1]: tmp-crun.CMv7WA.mount: Deactivated successfully.
Nov 28 08:58:53 np0005538513.localdomain podman[100192]: 2025-11-28 08:58:53.85489133 +0000 UTC m=+0.088929685 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:58:54 np0005538513.localdomain podman[100192]: 2025-11-28 08:58:54.078559155 +0000 UTC m=+0.312597460 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:58:54 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: tmp-crun.Mg0VU7.mount: Deactivated successfully.
Nov 28 08:58:58 np0005538513.localdomain podman[100221]: 2025-11-28 08:58:58.868086584 +0000 UTC m=+0.102675457 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1)
Nov 28 08:58:58 np0005538513.localdomain podman[100231]: 2025-11-28 08:58:58.919604527 +0000 UTC m=+0.140823381 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12)
Nov 28 08:58:58 np0005538513.localdomain podman[100221]: 2025-11-28 08:58:58.924429148 +0000 UTC m=+0.159018041 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:58:58 np0005538513.localdomain podman[100231]: 2025-11-28 08:58:58.945109476 +0000 UTC m=+0.166328310 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:58:58 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:58:59 np0005538513.localdomain podman[100222]: 2025-11-28 08:58:59.014170538 +0000 UTC m=+0.245363014 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 28 08:58:59 np0005538513.localdomain podman[100224]: 2025-11-28 08:58:59.070246274 +0000 UTC m=+0.295792223 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_id=tripleo_step5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, name=rhosp17/openstack-nova-compute)
Nov 28 08:58:59 np0005538513.localdomain podman[100223]: 2025-11-28 08:58:58.878441528 +0000 UTC m=+0.104692530 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, name=rhosp17/openstack-cron, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:58:59 np0005538513.localdomain podman[100223]: 2025-11-28 08:58:59.118627259 +0000 UTC m=+0.344878231 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=logrotate_crond, release=1761123044, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:58:59 np0005538513.localdomain podman[100224]: 2025-11-28 08:58:59.128004594 +0000 UTC m=+0.353550603 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, container_name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.buildah.version=1.41.4)
Nov 28 08:58:59 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:58:59 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:58:59 np0005538513.localdomain podman[100222]: 2025-11-28 08:58:59.388495211 +0000 UTC m=+0.619687727 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4)
Nov 28 08:58:59 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:59:06 np0005538513.localdomain sshd[99628]: Connection closed by 62.60.131.18 port 43854 [preauth]
Nov 28 08:59:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:59:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:59:06 np0005538513.localdomain podman[100339]: 2025-11-28 08:59:06.190112036 +0000 UTC m=+0.082475714 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:59:06 np0005538513.localdomain podman[100339]: 2025-11-28 08:59:06.204896169 +0000 UTC m=+0.097259897 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:59:06 np0005538513.localdomain podman[100339]: unhealthy
Nov 28 08:59:06 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:06 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:59:06 np0005538513.localdomain systemd[1]: tmp-crun.TLwlNh.mount: Deactivated successfully.
Nov 28 08:59:06 np0005538513.localdomain podman[100340]: 2025-11-28 08:59:06.302246848 +0000 UTC m=+0.192737687 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Nov 28 08:59:06 np0005538513.localdomain podman[100340]: 2025-11-28 08:59:06.319435756 +0000 UTC m=+0.209926595 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public)
Nov 28 08:59:06 np0005538513.localdomain podman[100340]: unhealthy
Nov 28 08:59:06 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:06 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:59:06 np0005538513.localdomain sshd[100378]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:59:07 np0005538513.localdomain sshd[100378]: Invalid user  from 194.187.176.48 port 63772
Nov 28 08:59:07 np0005538513.localdomain sshd[100378]: Connection closed by invalid user  194.187.176.48 port 63772 [preauth]
Nov 28 08:59:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:59:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:59:22 np0005538513.localdomain podman[100381]: 2025-11-28 08:59:22.865448775 +0000 UTC m=+0.099148985 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 28 08:59:22 np0005538513.localdomain systemd[1]: tmp-crun.i0xML3.mount: Deactivated successfully.
Nov 28 08:59:22 np0005538513.localdomain podman[100380]: 2025-11-28 08:59:22.933371002 +0000 UTC m=+0.169104296 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, container_name=iscsid, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:59:22 np0005538513.localdomain podman[100380]: 2025-11-28 08:59:22.943439278 +0000 UTC m=+0.179172562 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 
17.1 iscsid, architecture=x86_64, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 28 08:59:22 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:59:22 np0005538513.localdomain podman[100381]: 2025-11-28 08:59:22.9565832 +0000 UTC m=+0.190283440 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Nov 28 08:59:22 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:59:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:59:24 np0005538513.localdomain podman[100417]: 2025-11-28 08:59:24.841400136 +0000 UTC m=+0.082938118 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:59:25 np0005538513.localdomain podman[100417]: 2025-11-28 08:59:25.030360123 +0000 UTC m=+0.271898035 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:59:25 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 08:59:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 08:59:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 08:59:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 08:59:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 08:59:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 08:59:29 np0005538513.localdomain podman[100456]: 2025-11-28 08:59:29.878070823 +0000 UTC m=+0.104574436 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:59:29 np0005538513.localdomain podman[100456]: 2025-11-28 08:59:29.89937468 +0000 UTC m=+0.125878293 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Nov 28 08:59:29 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 08:59:29 np0005538513.localdomain systemd[1]: tmp-crun.mONzuW.mount: Deactivated successfully.
Nov 28 08:59:29 np0005538513.localdomain podman[100448]: 2025-11-28 08:59:29.966233254 +0000 UTC m=+0.200096667 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:59:30 np0005538513.localdomain podman[100449]: 2025-11-28 08:59:30.014563877 +0000 UTC m=+0.246743878 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, io.openshift.expose-services=, com.redhat.component=openstack-cron-container)
Nov 28 08:59:30 np0005538513.localdomain podman[100449]: 2025-11-28 08:59:30.021922058 +0000 UTC m=+0.254102069 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:59:30 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 08:59:30 np0005538513.localdomain podman[100450]: 2025-11-28 08:59:30.062149987 +0000 UTC m=+0.291419397 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, container_name=nova_compute, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:59:30 np0005538513.localdomain podman[100450]: 2025-11-28 08:59:30.094915964 +0000 UTC m=+0.324185384 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:59:30 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 08:59:30 np0005538513.localdomain podman[100447]: 2025-11-28 08:59:30.116317084 +0000 UTC m=+0.354486042 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z)
Nov 28 08:59:30 np0005538513.localdomain podman[100447]: 2025-11-28 08:59:30.146481288 +0000 UTC m=+0.384650276 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:59:30 np0005538513.localdomain sudo[100548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:59:30 np0005538513.localdomain sudo[100548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:59:30 np0005538513.localdomain sudo[100548]: pam_unix(sudo:session): session closed for user root
Nov 28 08:59:30 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 08:59:30 np0005538513.localdomain sudo[100585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:59:30 np0005538513.localdomain sudo[100585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:59:30 np0005538513.localdomain podman[100448]: 2025-11-28 08:59:30.328596411 +0000 UTC m=+0.562459844 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:59:30 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 08:59:30 np0005538513.localdomain systemd[1]: tmp-crun.x6r8D3.mount: Deactivated successfully.
Nov 28 08:59:30 np0005538513.localdomain sudo[100585]: pam_unix(sudo:session): session closed for user root
Nov 28 08:59:33 np0005538513.localdomain sudo[100633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:59:33 np0005538513.localdomain sudo[100633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:59:33 np0005538513.localdomain sudo[100633]: pam_unix(sudo:session): session closed for user root
Nov 28 08:59:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 08:59:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 08:59:36 np0005538513.localdomain podman[100649]: 2025-11-28 08:59:36.846972859 +0000 UTC m=+0.085065964 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, version=17.1.12, release=1761123044, vcs-type=git, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 28 08:59:36 np0005538513.localdomain podman[100649]: 2025-11-28 08:59:36.892500025 +0000 UTC m=+0.130593150 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, container_name=ovn_controller)
Nov 28 08:59:36 np0005538513.localdomain podman[100649]: unhealthy
Nov 28 08:59:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 08:59:36 np0005538513.localdomain podman[100648]: 2025-11-28 08:59:36.894993903 +0000 UTC m=+0.134425011 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:59:36 np0005538513.localdomain podman[100648]: 2025-11-28 08:59:36.978415696 +0000 UTC m=+0.217846794 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:59:36 np0005538513.localdomain podman[100648]: unhealthy
Nov 28 08:59:36 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:36 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:59:53 np0005538513.localdomain recover_tripleo_nova_virtqemud[100701]: 61397
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: tmp-crun.ukCqXz.mount: Deactivated successfully.
Nov 28 08:59:53 np0005538513.localdomain podman[100688]: 2025-11-28 08:59:53.863588218 +0000 UTC m=+0.095676917 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: tmp-crun.kKjlcp.mount: Deactivated successfully.
Nov 28 08:59:53 np0005538513.localdomain podman[100689]: 2025-11-28 08:59:53.919083536 +0000 UTC m=+0.148849032 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, distribution-scope=public, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12)
Nov 28 08:59:53 np0005538513.localdomain podman[100688]: 2025-11-28 08:59:53.930164833 +0000 UTC m=+0.162253562 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 08:59:53 np0005538513.localdomain podman[100689]: 2025-11-28 08:59:53.98529452 +0000 UTC m=+0.215060006 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, container_name=collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:59:53 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 08:59:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 08:59:55 np0005538513.localdomain podman[100729]: 2025-11-28 08:59:55.848337423 +0000 UTC m=+0.085743767 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step1, container_name=metrics_qdr, vcs-type=git)
Nov 28 08:59:56 np0005538513.localdomain podman[100729]: 2025-11-28 08:59:56.084099246 +0000 UTC m=+0.321505610 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git)
Nov 28 08:59:56 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:00:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:00:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:00:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:00:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:00:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:00:00 np0005538513.localdomain systemd[1]: tmp-crun.3c693F.mount: Deactivated successfully.
Nov 28 09:00:00 np0005538513.localdomain podman[100767]: 2025-11-28 09:00:00.927293483 +0000 UTC m=+0.142954927 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, release=1761123044, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:00:00 np0005538513.localdomain podman[100759]: 2025-11-28 09:00:00.96710707 +0000 UTC m=+0.193256203 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:00:00 np0005538513.localdomain podman[100767]: 2025-11-28 09:00:00.986469226 +0000 UTC m=+0.202130620 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:00:00 np0005538513.localdomain podman[100758]: 2025-11-28 09:00:00.892644569 +0000 UTC m=+0.116861641 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 09:00:01 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 09:00:01 np0005538513.localdomain podman[100758]: 2025-11-28 09:00:01.026533161 +0000 UTC m=+0.250750273 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, architecture=x86_64, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:00:01 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 09:00:01 np0005538513.localdomain podman[100761]: 2025-11-28 09:00:01.05332176 +0000 UTC m=+0.264092161 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Nov 28 09:00:01 np0005538513.localdomain podman[100761]: 2025-11-28 09:00:01.082526044 +0000 UTC m=+0.293296385 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:00:01 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 09:00:01 np0005538513.localdomain podman[100760]: 2025-11-28 09:00:00.987751797 +0000 UTC m=+0.210362430 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-cron, distribution-scope=public, container_name=logrotate_crond, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team)
Nov 28 09:00:01 np0005538513.localdomain podman[100760]: 2025-11-28 09:00:01.171372247 +0000 UTC m=+0.393982830 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., tcib_managed=true)
Nov 28 09:00:01 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:00:01 np0005538513.localdomain podman[100759]: 2025-11-28 09:00:01.387416592 +0000 UTC m=+0.613565715 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute)
Nov 28 09:00:01 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:00:01 np0005538513.localdomain CROND[100880]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Nov 28 09:00:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:00:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:00:07 np0005538513.localdomain podman[100883]: 2025-11-28 09:00:07.837472902 +0000 UTC m=+0.078752727 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team)
Nov 28 09:00:07 np0005538513.localdomain podman[100884]: 2025-11-28 09:00:07.878495426 +0000 UTC m=+0.117199121 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4)
Nov 28 09:00:07 np0005538513.localdomain podman[100883]: 2025-11-28 09:00:07.882042618 +0000 UTC m=+0.123322443 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Nov 28 09:00:07 np0005538513.localdomain podman[100883]: unhealthy
Nov 28 09:00:07 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:07 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:00:07 np0005538513.localdomain podman[100884]: 2025-11-28 09:00:07.89615749 +0000 UTC m=+0.134861235 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 09:00:07 np0005538513.localdomain podman[100884]: unhealthy
Nov 28 09:00:07 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:07 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:00:21 np0005538513.localdomain CROND[100879]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Nov 28 09:00:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:00:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:00:24 np0005538513.localdomain podman[100926]: 2025-11-28 09:00:24.829351964 +0000 UTC m=+0.073560344 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 28 09:00:24 np0005538513.localdomain podman[100926]: 2025-11-28 09:00:24.838515361 +0000 UTC m=+0.082723701 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, release=1761123044, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git)
Nov 28 09:00:24 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:00:24 np0005538513.localdomain podman[100927]: 2025-11-28 09:00:24.906719258 +0000 UTC m=+0.144494657 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-collectd, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Nov 28 09:00:24 np0005538513.localdomain podman[100927]: 2025-11-28 09:00:24.918659421 +0000 UTC m=+0.156434770 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:00:24 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:00:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:00:26 np0005538513.localdomain podman[100964]: 2025-11-28 09:00:26.839665268 +0000 UTC m=+0.080846283 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:00:27 np0005538513.localdomain podman[100964]: 2025-11-28 09:00:27.032454906 +0000 UTC m=+0.273635901 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 09:00:27 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:00:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:00:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:00:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:00:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:00:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:00:31 np0005538513.localdomain systemd[1]: tmp-crun.4tmTOo.mount: Deactivated successfully.
Nov 28 09:00:31 np0005538513.localdomain systemd[1]: tmp-crun.JxaPRF.mount: Deactivated successfully.
Nov 28 09:00:31 np0005538513.localdomain podman[100992]: 2025-11-28 09:00:31.911894168 +0000 UTC m=+0.149325797 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64)
Nov 28 09:00:31 np0005538513.localdomain podman[100995]: 2025-11-28 09:00:31.876091207 +0000 UTC m=+0.104987689 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, container_name=nova_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 28 09:00:32 np0005538513.localdomain podman[100994]: 2025-11-28 09:00:31.956446993 +0000 UTC m=+0.187070519 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:00:32 np0005538513.localdomain podman[101001]: 2025-11-28 09:00:32.011784956 +0000 UTC m=+0.236498007 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi)
Nov 28 09:00:32 np0005538513.localdomain podman[100995]: 2025-11-28 09:00:32.012393505 +0000 UTC m=+0.241289957 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:00:32 np0005538513.localdomain podman[100993]: 2025-11-28 09:00:31.979863127 +0000 UTC m=+0.214736316 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:00:32 np0005538513.localdomain podman[100994]: 2025-11-28 09:00:32.039316708 +0000 UTC m=+0.269940214 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 28 09:00:32 np0005538513.localdomain podman[100992]: 2025-11-28 09:00:32.046937907 +0000 UTC m=+0.284369486 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute)
Nov 28 09:00:32 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:00:32 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 09:00:32 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 09:00:32 np0005538513.localdomain podman[101001]: 2025-11-28 09:00:32.114603866 +0000 UTC m=+0.339316907 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Nov 28 09:00:32 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 09:00:32 np0005538513.localdomain podman[100993]: 2025-11-28 09:00:32.341418669 +0000 UTC m=+0.576291848 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Nov 28 09:00:32 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:00:34 np0005538513.localdomain sudo[101109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:00:35 np0005538513.localdomain sudo[101109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:35 np0005538513.localdomain sudo[101109]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:35 np0005538513.localdomain sudo[101124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:00:35 np0005538513.localdomain sudo[101124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:35 np0005538513.localdomain sudo[101124]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:36 np0005538513.localdomain sudo[101157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:00:36 np0005538513.localdomain sudo[101157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:36 np0005538513.localdomain sudo[101157]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:36 np0005538513.localdomain sudo[101172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:00:36 np0005538513.localdomain sudo[101172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:36 np0005538513.localdomain sudo[101172]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:37 np0005538513.localdomain sudo[101218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:00:37 np0005538513.localdomain sudo[101218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:37 np0005538513.localdomain sudo[101218]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:00:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:00:38 np0005538513.localdomain systemd[1]: tmp-crun.RnYGOO.mount: Deactivated successfully.
Nov 28 09:00:38 np0005538513.localdomain podman[101233]: 2025-11-28 09:00:38.865518338 +0000 UTC m=+0.095083909 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Nov 28 09:00:38 np0005538513.localdomain podman[101234]: 2025-11-28 09:00:38.912767417 +0000 UTC m=+0.142006758 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=)
Nov 28 09:00:38 np0005538513.localdomain podman[101233]: 2025-11-28 09:00:38.919166158 +0000 UTC m=+0.148731739 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 09:00:38 np0005538513.localdomain podman[101233]: unhealthy
Nov 28 09:00:38 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:38 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:00:38 np0005538513.localdomain podman[101234]: 2025-11-28 09:00:38.996472878 +0000 UTC m=+0.225712159 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:00:39 np0005538513.localdomain podman[101234]: unhealthy
Nov 28 09:00:39 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:39 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:00:47 np0005538513.localdomain sshd[101274]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:00:48 np0005538513.localdomain sshd[101274]: Received disconnect from 193.46.255.99 port 39552:11:  [preauth]
Nov 28 09:00:48 np0005538513.localdomain sshd[101274]: Disconnected from authenticating user root 193.46.255.99 port 39552 [preauth]
Nov 28 09:00:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:00:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:00:55 np0005538513.localdomain podman[101276]: 2025-11-28 09:00:55.853872632 +0000 UTC m=+0.090955540 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 09:00:55 np0005538513.localdomain podman[101276]: 2025-11-28 09:00:55.893490192 +0000 UTC m=+0.130573070 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 28 09:00:55 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:00:55 np0005538513.localdomain podman[101277]: 2025-11-28 09:00:55.910475334 +0000 UTC m=+0.145736385 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 09:00:55 np0005538513.localdomain podman[101277]: 2025-11-28 09:00:55.925333179 +0000 UTC m=+0.160594210 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 28 09:00:55 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:00:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:00:57 np0005538513.localdomain podman[101315]: 2025-11-28 09:00:57.847148923 +0000 UTC m=+0.085908811 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step1, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, 
vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 09:00:58 np0005538513.localdomain podman[101315]: 2025-11-28 09:00:58.038466505 +0000 UTC m=+0.277226373 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container)
Nov 28 09:00:58 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:01:01 np0005538513.localdomain CROND[101345]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 09:01:01 np0005538513.localdomain run-parts[101348]: (/etc/cron.hourly) starting 0anacron
Nov 28 09:01:01 np0005538513.localdomain run-parts[101354]: (/etc/cron.hourly) finished 0anacron
Nov 28 09:01:01 np0005538513.localdomain CROND[101344]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 09:01:01 np0005538513.localdomain CROND[101356]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 09:01:01 np0005538513.localdomain run-parts[101359]: (/etc/cron.hourly) starting 0anacron
Nov 28 09:01:01 np0005538513.localdomain anacron[101367]: Anacron started on 2025-11-28
Nov 28 09:01:01 np0005538513.localdomain anacron[101367]: Will run job `cron.daily' in 44 min.
Nov 28 09:01:01 np0005538513.localdomain anacron[101367]: Will run job `cron.weekly' in 64 min.
Nov 28 09:01:01 np0005538513.localdomain anacron[101367]: Will run job `cron.monthly' in 84 min.
Nov 28 09:01:01 np0005538513.localdomain anacron[101367]: Jobs will be executed sequentially
Nov 28 09:01:01 np0005538513.localdomain run-parts[101369]: (/etc/cron.hourly) finished 0anacron
Nov 28 09:01:01 np0005538513.localdomain CROND[101355]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 09:01:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:01:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:01:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:01:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:01:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:01:02 np0005538513.localdomain podman[101370]: 2025-11-28 09:01:02.884808251 +0000 UTC m=+0.116637584 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 09:01:02 np0005538513.localdomain podman[101383]: 2025-11-28 09:01:02.920512718 +0000 UTC m=+0.141618325 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 09:01:02 np0005538513.localdomain podman[101373]: 2025-11-28 09:01:02.966332724 +0000 UTC m=+0.189361281 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 09:01:02 np0005538513.localdomain podman[101370]: 2025-11-28 09:01:02.986896628 +0000 UTC m=+0.218726051 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044)
Nov 28 09:01:02 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 09:01:03 np0005538513.localdomain podman[101383]: 2025-11-28 09:01:03.007154342 +0000 UTC m=+0.228259979 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi)
Nov 28 09:01:03 np0005538513.localdomain podman[101371]: 2025-11-28 09:01:02.852108997 +0000 UTC m=+0.083497496 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.expose-services=)
Nov 28 09:01:03 np0005538513.localdomain podman[101373]: 2025-11-28 09:01:03.019413976 +0000 UTC m=+0.242442503 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute)
Nov 28 09:01:03 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 09:01:03 np0005538513.localdomain podman[101372]: 2025-11-28 09:01:02.988090386 +0000 UTC m=+0.214123057 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, release=1761123044)
Nov 28 09:01:03 np0005538513.localdomain podman[101372]: 2025-11-28 09:01:03.068275186 +0000 UTC m=+0.294307937 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
container_name=logrotate_crond, tcib_managed=true, build-date=2025-11-18T22:49:32Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Nov 28 09:01:03 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 09:01:03 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:01:03 np0005538513.localdomain podman[101371]: 2025-11-28 09:01:03.214417082 +0000 UTC m=+0.445805581 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, vcs-type=git, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:01:03 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:01:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:01:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:01:09 np0005538513.localdomain systemd[1]: tmp-crun.FH8VwZ.mount: Deactivated successfully.
Nov 28 09:01:09 np0005538513.localdomain podman[101491]: 2025-11-28 09:01:09.887423032 +0000 UTC m=+0.116972203 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible)
Nov 28 09:01:09 np0005538513.localdomain podman[101490]: 2025-11-28 09:01:09.903777525 +0000 UTC m=+0.136042691 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 09:01:09 np0005538513.localdomain podman[101491]: 2025-11-28 09:01:09.931446891 +0000 UTC m=+0.160996062 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vcs-type=git, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 09:01:09 np0005538513.localdomain podman[101491]: unhealthy
Nov 28 09:01:09 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:09 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:01:09 np0005538513.localdomain podman[101490]: 2025-11-28 09:01:09.944512041 +0000 UTC m=+0.176777227 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 09:01:09 np0005538513.localdomain podman[101490]: unhealthy
Nov 28 09:01:09 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:09 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:01:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:01:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:01:26 np0005538513.localdomain podman[101531]: 2025-11-28 09:01:26.838083896 +0000 UTC m=+0.073537783 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z)
Nov 28 09:01:26 np0005538513.localdomain podman[101531]: 2025-11-28 09:01:26.849446922 +0000 UTC m=+0.084900799 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, version=17.1.12, container_name=collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 28 09:01:26 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:01:26 np0005538513.localdomain podman[101530]: 2025-11-28 09:01:26.902185874 +0000 UTC m=+0.140423969 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, 
name=rhosp17/openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z)
Nov 28 09:01:26 np0005538513.localdomain podman[101530]: 2025-11-28 09:01:26.935384734 +0000 UTC m=+0.173622849 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:01:26 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:01:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:01:28 np0005538513.localdomain podman[101569]: 2025-11-28 09:01:28.840289168 +0000 UTC m=+0.079423549 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, distribution-scope=public)
Nov 28 09:01:29 np0005538513.localdomain podman[101569]: 2025-11-28 09:01:29.000491434 +0000 UTC m=+0.239625875 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 09:01:29 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:01:34 np0005538513.localdomain recover_tripleo_nova_virtqemud[101628]: 61397
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: tmp-crun.ZPme0I.mount: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain podman[101598]: 2025-11-28 09:01:34.212357557 +0000 UTC m=+0.106929329 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: tmp-crun.PKuXzr.mount: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain podman[101599]: 2025-11-28 09:01:34.255164778 +0000 UTC m=+0.145305881 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, config_id=tripleo_step4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:01:34 np0005538513.localdomain podman[101598]: 2025-11-28 09:01:34.262962383 +0000 UTC m=+0.157534135 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com)
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain podman[101600]: 2025-11-28 09:01:34.309994825 +0000 UTC m=+0.199407015 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1)
Nov 28 09:01:34 np0005538513.localdomain podman[101600]: 2025-11-28 09:01:34.322345812 +0000 UTC m=+0.211757992 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, container_name=logrotate_crond, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain podman[101604]: 2025-11-28 09:01:34.364403389 +0000 UTC m=+0.243178026 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com)
Nov 28 09:01:34 np0005538513.localdomain podman[101604]: 2025-11-28 09:01:34.392414017 +0000 UTC m=+0.271188714 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:01:34 np0005538513.localdomain podman[101607]: 2025-11-28 09:01:34.429173058 +0000 UTC m=+0.308932906 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, 
tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain podman[101607]: 2025-11-28 09:01:34.465397602 +0000 UTC m=+0.345157440 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 09:01:34 np0005538513.localdomain podman[101599]: 2025-11-28 09:01:34.616336389 +0000 UTC m=+0.506477502 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible)
Nov 28 09:01:34 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:01:37 np0005538513.localdomain sudo[101716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:01:37 np0005538513.localdomain sudo[101716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:01:37 np0005538513.localdomain sudo[101716]: pam_unix(sudo:session): session closed for user root
Nov 28 09:01:37 np0005538513.localdomain sudo[101731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:01:37 np0005538513.localdomain sudo[101731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:01:38 np0005538513.localdomain sudo[101731]: pam_unix(sudo:session): session closed for user root
Nov 28 09:01:38 np0005538513.localdomain sudo[101778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:01:38 np0005538513.localdomain sudo[101778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:01:38 np0005538513.localdomain sudo[101778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:01:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:01:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:01:40 np0005538513.localdomain podman[101793]: 2025-11-28 09:01:40.850691602 +0000 UTC m=+0.089395990 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4)
Nov 28 09:01:40 np0005538513.localdomain podman[101793]: 2025-11-28 09:01:40.890722586 +0000 UTC m=+0.129426924 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 09:01:40 np0005538513.localdomain podman[101793]: unhealthy
Nov 28 09:01:40 np0005538513.localdomain systemd[1]: tmp-crun.Hq0JBF.mount: Deactivated successfully.
Nov 28 09:01:40 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:40 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:01:40 np0005538513.localdomain podman[101794]: 2025-11-28 09:01:40.913346004 +0000 UTC m=+0.151059151 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true)
Nov 28 09:01:40 np0005538513.localdomain podman[101794]: 2025-11-28 09:01:40.927322602 +0000 UTC m=+0.165035749 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 28 09:01:40 np0005538513.localdomain podman[101794]: unhealthy
Nov 28 09:01:40 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:40 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:01:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:01:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:01:57 np0005538513.localdomain podman[101836]: 2025-11-28 09:01:57.84489035 +0000 UTC m=+0.079429398 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc.)
Nov 28 09:01:57 np0005538513.localdomain podman[101836]: 2025-11-28 09:01:57.857567948 +0000 UTC m=+0.092107006 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team)
Nov 28 09:01:57 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:01:57 np0005538513.localdomain podman[101835]: 2025-11-28 09:01:57.909489004 +0000 UTC m=+0.144596190 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, container_name=iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 09:01:57 np0005538513.localdomain podman[101835]: 2025-11-28 09:01:57.921476889 +0000 UTC m=+0.156584085 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4)
Nov 28 09:01:57 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:01:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:01:59 np0005538513.localdomain podman[101873]: 2025-11-28 09:01:59.846279006 +0000 UTC m=+0.080610126 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Nov 28 09:02:00 np0005538513.localdomain podman[101873]: 2025-11-28 09:02:00.036282256 +0000 UTC m=+0.270613356 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=)
Nov 28 09:02:00 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:02:04 np0005538513.localdomain podman[101905]: 2025-11-28 09:02:04.873928681 +0000 UTC m=+0.106134975 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:02:04 np0005538513.localdomain podman[101903]: 2025-11-28 09:02:04.853230642 +0000 UTC m=+0.091698913 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 09:02:04 np0005538513.localdomain podman[101911]: 2025-11-28 09:02:04.919199658 +0000 UTC m=+0.146008743 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: tmp-crun.rQLpyQ.mount: Deactivated successfully.
Nov 28 09:02:04 np0005538513.localdomain podman[101905]: 2025-11-28 09:02:04.939808003 +0000 UTC m=+0.172014307 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container)
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:02:04 np0005538513.localdomain podman[101911]: 2025-11-28 09:02:04.979449545 +0000 UTC m=+0.206258650 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, container_name=nova_compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step5)
Nov 28 09:02:04 np0005538513.localdomain podman[101903]: 2025-11-28 09:02:04.987801336 +0000 UTC m=+0.226269607 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, container_name=ceilometer_agent_compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 09:02:04 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 09:02:05 np0005538513.localdomain podman[101913]: 2025-11-28 09:02:05.073308094 +0000 UTC m=+0.296726823 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 09:02:05 np0005538513.localdomain podman[101904]: 2025-11-28 09:02:05.119980575 +0000 UTC m=+0.352368665 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:02:05 np0005538513.localdomain podman[101913]: 2025-11-28 09:02:05.128452391 +0000 UTC m=+0.351871080 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 09:02:05 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 09:02:05 np0005538513.localdomain podman[101904]: 2025-11-28 09:02:05.50138742 +0000 UTC m=+0.733775500 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, container_name=nova_migration_target, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 28 09:02:05 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:02:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:02:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:02:11 np0005538513.localdomain systemd[1]: tmp-crun.X7kq12.mount: Deactivated successfully.
Nov 28 09:02:11 np0005538513.localdomain podman[102021]: 2025-11-28 09:02:11.856847676 +0000 UTC m=+0.093954802 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 09:02:11 np0005538513.localdomain podman[102022]: 2025-11-28 09:02:11.900876515 +0000 UTC m=+0.134705119 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 
ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., version=17.1.12)
Nov 28 09:02:11 np0005538513.localdomain podman[102021]: 2025-11-28 09:02:11.904408226 +0000 UTC m=+0.141515332 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 09:02:11 np0005538513.localdomain podman[102021]: unhealthy
Nov 28 09:02:11 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:11 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:02:11 np0005538513.localdomain podman[102022]: 2025-11-28 09:02:11.920428678 +0000 UTC m=+0.154257302 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:02:11 np0005538513.localdomain podman[102022]: unhealthy
Nov 28 09:02:11 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:11 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:02:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:02:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:02:28 np0005538513.localdomain systemd[1]: tmp-crun.ympjBx.mount: Deactivated successfully.
Nov 28 09:02:28 np0005538513.localdomain podman[102062]: 2025-11-28 09:02:28.848921767 +0000 UTC m=+0.088801692 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:02:28 np0005538513.localdomain podman[102062]: 2025-11-28 09:02:28.858396834 +0000 UTC m=+0.098276699 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 
17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, build-date=2025-11-18T23:44:13Z, container_name=iscsid, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 09:02:28 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:02:28 np0005538513.localdomain podman[102063]: 2025-11-28 09:02:28.925938759 +0000 UTC m=+0.167954731 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:02:28 np0005538513.localdomain podman[102063]: 2025-11-28 09:02:28.933666151 +0000 UTC m=+0.175682193 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z)
Nov 28 09:02:28 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:02:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:02:30 np0005538513.localdomain podman[102101]: 2025-11-28 09:02:30.84165449 +0000 UTC m=+0.078895668 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible)
Nov 28 09:02:31 np0005538513.localdomain podman[102101]: 2025-11-28 09:02:31.049840685 +0000 UTC m=+0.287081793 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git)
Nov 28 09:02:31 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:02:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:02:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:02:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:02:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:02:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:02:35 np0005538513.localdomain podman[102132]: 2025-11-28 09:02:35.852841714 +0000 UTC m=+0.085571975 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 09:02:35 np0005538513.localdomain podman[102132]: 2025-11-28 09:02:35.890510922 +0000 UTC m=+0.123241183 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Nov 28 09:02:35 np0005538513.localdomain systemd[1]: tmp-crun.96rFJ1.mount: Deactivated successfully.
Nov 28 09:02:35 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:02:35 np0005538513.localdomain podman[102131]: 2025-11-28 09:02:35.909664896 +0000 UTC m=+0.145741140 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, container_name=nova_migration_target, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:02:35 np0005538513.localdomain podman[102134]: 2025-11-28 09:02:35.972966759 +0000 UTC m=+0.201962073 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:02:36 np0005538513.localdomain podman[102142]: 2025-11-28 09:02:36.022634929 +0000 UTC m=+0.248365392 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 09:02:36 np0005538513.localdomain podman[102134]: 2025-11-28 09:02:36.031232286 +0000 UTC m=+0.260227570 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:02:36 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 09:02:36 np0005538513.localdomain podman[102142]: 2025-11-28 09:02:36.062924438 +0000 UTC m=+0.288654871 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Nov 28 09:02:36 np0005538513.localdomain podman[102142]: unhealthy
Nov 28 09:02:36 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:36 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 09:02:36 np0005538513.localdomain podman[102130]: 2025-11-28 09:02:36.117694596 +0000 UTC m=+0.357351621 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:02:36 np0005538513.localdomain podman[102130]: 2025-11-28 09:02:36.132901518 +0000 UTC m=+0.372558583 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:02:36 np0005538513.localdomain podman[102130]: unhealthy
Nov 28 09:02:36 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:36 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'.
Nov 28 09:02:36 np0005538513.localdomain podman[102131]: 2025-11-28 09:02:36.288255345 +0000 UTC m=+0.524331549 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:02:36 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:02:39 np0005538513.localdomain sudo[102235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:02:39 np0005538513.localdomain sudo[102235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:39 np0005538513.localdomain sudo[102235]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:39 np0005538513.localdomain sudo[102250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:02:39 np0005538513.localdomain sudo[102250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:39 np0005538513.localdomain podman[102333]: 2025-11-28 09:02:39.934775964 +0000 UTC m=+0.093559642 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, release=553, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container)
Nov 28 09:02:40 np0005538513.localdomain podman[102333]: 2025-11-28 09:02:40.028263782 +0000 UTC m=+0.187047450 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, release=553, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Nov 28 09:02:40 np0005538513.localdomain sudo[102250]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:40 np0005538513.localdomain sudo[102398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:02:40 np0005538513.localdomain sudo[102398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:40 np0005538513.localdomain sudo[102398]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:40 np0005538513.localdomain sudo[102413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:02:40 np0005538513.localdomain sudo[102413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:41 np0005538513.localdomain sudo[102413]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:41 np0005538513.localdomain sudo[102461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:02:41 np0005538513.localdomain sudo[102461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:41 np0005538513.localdomain sudo[102461]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:02:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:02:42 np0005538513.localdomain podman[102477]: 2025-11-28 09:02:42.853788055 +0000 UTC m=+0.090275780 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO 
Team, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Nov 28 09:02:42 np0005538513.localdomain podman[102476]: 2025-11-28 09:02:42.902118563 +0000 UTC m=+0.138713182 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com)
Nov 28 09:02:42 np0005538513.localdomain podman[102477]: 2025-11-28 09:02:42.919676778 +0000 UTC m=+0.156164493 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.buildah.version=1.41.4)
Nov 28 09:02:42 np0005538513.localdomain podman[102477]: unhealthy
Nov 28 09:02:42 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:42 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:02:42 np0005538513.localdomain podman[102476]: 2025-11-28 09:02:42.944575399 +0000 UTC m=+0.181169998 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent)
Nov 28 09:02:42 np0005538513.localdomain podman[102476]: unhealthy
Nov 28 09:02:42 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:42 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:02:59 np0005538513.localdomain recover_tripleo_nova_virtqemud[102526]: 61397
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:02:59 np0005538513.localdomain podman[102516]: 2025-11-28 09:02:59.864511111 +0000 UTC m=+0.100748055 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, container_name=iscsid, version=17.1.12)
Nov 28 09:02:59 np0005538513.localdomain podman[102516]: 2025-11-28 09:02:59.8783707 +0000 UTC m=+0.114607684 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4)
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: tmp-crun.mkdMCz.mount: Deactivated successfully.
Nov 28 09:02:59 np0005538513.localdomain podman[102517]: 2025-11-28 09:02:59.922153768 +0000 UTC m=+0.154986606 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3)
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:02:59 np0005538513.localdomain podman[102517]: 2025-11-28 09:02:59.987390401 +0000 UTC m=+0.220223249 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 28 09:02:59 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:03:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:03:01 np0005538513.localdomain systemd[1]: tmp-crun.zGFk5F.mount: Deactivated successfully.
Nov 28 09:03:01 np0005538513.localdomain podman[102557]: 2025-11-28 09:03:01.856899419 +0000 UTC m=+0.095786741 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Nov 28 09:03:02 np0005538513.localdomain podman[102557]: 2025-11-28 09:03:02.087737367 +0000 UTC m=+0.326624649 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12)
Nov 28 09:03:02 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:03:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:03:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:03:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:03:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:03:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:03:06 np0005538513.localdomain podman[102587]: 2025-11-28 09:03:06.869173807 +0000 UTC m=+0.105364858 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute)
Nov 28 09:03:06 np0005538513.localdomain systemd[1]: tmp-crun.YHXcrU.mount: Deactivated successfully.
Nov 28 09:03:06 np0005538513.localdomain podman[102588]: 2025-11-28 09:03:06.967791845 +0000 UTC m=+0.199583650 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:03:06 np0005538513.localdomain podman[102588]: 2025-11-28 09:03:06.973558364 +0000 UTC m=+0.205350149 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container)
Nov 28 09:03:06 np0005538513.localdomain podman[102586]: 2025-11-28 09:03:06.930906821 +0000 UTC m=+0.161656964 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, 
com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Nov 28 09:03:06 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:03:07 np0005538513.localdomain podman[102586]: 2025-11-28 09:03:07.010832939 +0000 UTC m=+0.241583042 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 09:03:07 np0005538513.localdomain podman[102586]: unhealthy
Nov 28 09:03:07 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:07 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'.
Nov 28 09:03:07 np0005538513.localdomain podman[102589]: 2025-11-28 09:03:07.079489528 +0000 UTC m=+0.307808325 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:03:07 np0005538513.localdomain podman[102589]: 2025-11-28 09:03:07.101485221 +0000 UTC m=+0.329803998 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:03:07 np0005538513.localdomain podman[102589]: unhealthy
Nov 28 09:03:07 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:07 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 09:03:07 np0005538513.localdomain podman[102598]: 2025-11-28 09:03:07.168583981 +0000 UTC m=+0.392171821 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team)
Nov 28 09:03:07 np0005538513.localdomain podman[102598]: 2025-11-28 09:03:07.211602945 +0000 UTC m=+0.435190815 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Nov 28 09:03:07 np0005538513.localdomain podman[102598]: unhealthy
Nov 28 09:03:07 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:07 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 09:03:07 np0005538513.localdomain podman[102587]: 2025-11-28 09:03:07.223636638 +0000 UTC m=+0.459827709 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:03:07 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:03:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:03:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:03:13 np0005538513.localdomain podman[102686]: 2025-11-28 09:03:13.85312249 +0000 UTC m=+0.088916398 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Nov 28 09:03:13 np0005538513.localdomain podman[102686]: 2025-11-28 09:03:13.896275288 +0000 UTC m=+0.132069166 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Nov 28 09:03:13 np0005538513.localdomain podman[102686]: unhealthy
Nov 28 09:03:13 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:13 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:03:13 np0005538513.localdomain podman[102687]: 2025-11-28 09:03:13.903570054 +0000 UTC m=+0.136275977 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container)
Nov 28 09:03:13 np0005538513.localdomain podman[102687]: 2025-11-28 09:03:13.984650839 +0000 UTC m=+0.217356702 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 09:03:13 np0005538513.localdomain podman[102687]: unhealthy
Nov 28 09:03:13 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:13 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:03:18 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26986 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0285EA0000000001030307) 
Nov 28 09:03:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26987 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB028A020000000001030307) 
Nov 28 09:03:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26988 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0292020000000001030307) 
Nov 28 09:03:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26989 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02A1C20000000001030307) 
Nov 28 09:03:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60509 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02ACD90000000001030307) 
Nov 28 09:03:29 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60510 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02B0C20000000001030307) 
Nov 28 09:03:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:03:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:03:30 np0005538513.localdomain systemd[1]: tmp-crun.j44F37.mount: Deactivated successfully.
Nov 28 09:03:30 np0005538513.localdomain podman[102725]: 2025-11-28 09:03:30.848754729 +0000 UTC m=+0.086445662 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vcs-type=git)
Nov 28 09:03:30 np0005538513.localdomain podman[102725]: 2025-11-28 09:03:30.85783106 +0000 UTC m=+0.095521943 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 09:03:30 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:03:30 np0005538513.localdomain podman[102726]: 2025-11-28 09:03:30.9497336 +0000 UTC m=+0.183342746 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20251118.1)
Nov 28 09:03:30 np0005538513.localdomain podman[102726]: 2025-11-28 09:03:30.988477531 +0000 UTC m=+0.222086627 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1)
Nov 28 09:03:31 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:03:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60511 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02B8C20000000001030307) 
Nov 28 09:03:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32505 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02BDC40000000001030307) 
Nov 28 09:03:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:03:32 np0005538513.localdomain podman[102765]: 2025-11-28 09:03:32.839793145 +0000 UTC m=+0.078438834 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-type=git)
Nov 28 09:03:33 np0005538513.localdomain podman[102765]: 2025-11-28 09:03:33.045833254 +0000 UTC m=+0.284478953 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:03:33 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:03:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26990 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C1820000000001030307) 
Nov 28 09:03:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32506 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C1C30000000001030307) 
Nov 28 09:03:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60512 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C8830000000001030307) 
Nov 28 09:03:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29953 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C9500000000001030307) 
Nov 28 09:03:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32507 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C9C30000000001030307) 
Nov 28 09:03:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29954 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02CD430000000001030307) 
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: tmp-crun.qXqCU2.mount: Deactivated successfully.
Nov 28 09:03:37 np0005538513.localdomain podman[102795]: 2025-11-28 09:03:37.851135133 +0000 UTC m=+0.082204870 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: tmp-crun.ztu0md.mount: Deactivated successfully.
Nov 28 09:03:37 np0005538513.localdomain podman[102794]: 2025-11-28 09:03:37.882534486 +0000 UTC m=+0.114704488 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Nov 28 09:03:37 np0005538513.localdomain podman[102794]: 2025-11-28 09:03:37.921571837 +0000 UTC m=+0.153741869 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 09:03:37 np0005538513.localdomain podman[102794]: unhealthy
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:37 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'.
Nov 28 09:03:37 np0005538513.localdomain podman[102809]: 2025-11-28 09:03:37.89457738 +0000 UTC m=+0.110372144 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044)
Nov 28 09:03:37 np0005538513.localdomain podman[102796]: 2025-11-28 09:03:37.922569558 +0000 UTC m=+0.148271208 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Nov 28 09:03:38 np0005538513.localdomain podman[102796]: 2025-11-28 09:03:38.000886567 +0000 UTC m=+0.226588247 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z)
Nov 28 09:03:38 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:03:38 np0005538513.localdomain podman[102809]: 2025-11-28 09:03:38.023628052 +0000 UTC m=+0.239422876 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 09:03:38 np0005538513.localdomain podman[102809]: unhealthy
Nov 28 09:03:38 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:38 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 09:03:38 np0005538513.localdomain podman[102797]: 2025-11-28 09:03:38.080510285 +0000 UTC m=+0.302012736 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:03:38 np0005538513.localdomain podman[102797]: 2025-11-28 09:03:38.098515513 +0000 UTC m=+0.320017974 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:03:38 np0005538513.localdomain podman[102797]: unhealthy
Nov 28 09:03:38 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:38 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 09:03:38 np0005538513.localdomain podman[102795]: 2025-11-28 09:03:38.228114322 +0000 UTC m=+0.459184029 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 28 09:03:38 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:03:38 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29955 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02D5420000000001030307) 
Nov 28 09:03:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32508 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02D9820000000001030307) 
Nov 28 09:03:39 np0005538513.localdomain sshd[102893]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:03:40 np0005538513.localdomain sshd[102893]: Accepted publickey for zuul from 192.168.122.31 port 41902 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:03:40 np0005538513.localdomain systemd-logind[764]: New session 36 of user zuul.
Nov 28 09:03:40 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5554 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02DBD60000000001030307) 
Nov 28 09:03:40 np0005538513.localdomain systemd[1]: Started Session 36 of User zuul.
Nov 28 09:03:40 np0005538513.localdomain sshd[102893]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:03:40 np0005538513.localdomain sudo[102986]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nblweivttqcmejczszgfahxibqyrmxth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320620.195172-26-59804325733847/AnsiballZ_stat.py
Nov 28 09:03:40 np0005538513.localdomain sudo[102986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:40 np0005538513.localdomain python3.9[102988]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:03:40 np0005538513.localdomain sudo[102986]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:41 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5555 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02DFC20000000001030307) 
Nov 28 09:03:41 np0005538513.localdomain sudo[103080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewkxmcubcdzsfkaiufdyoefsdlpizegd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320621.12354-62-165046921684868/AnsiballZ_command.py
Nov 28 09:03:41 np0005538513.localdomain sudo[103080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:41 np0005538513.localdomain python3.9[103082]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:03:41 np0005538513.localdomain sudo[103080]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:42 np0005538513.localdomain sudo[103111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:03:42 np0005538513.localdomain sudo[103111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:03:42 np0005538513.localdomain sudo[103111]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:42 np0005538513.localdomain sudo[103146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:03:42 np0005538513.localdomain sudo[103146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:03:42 np0005538513.localdomain sudo[103203]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weyhwnyybnsyxyvegfcioxdchzjemgyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320621.9658813-86-196214495921084/AnsiballZ_stat.py
Nov 28 09:03:42 np0005538513.localdomain sudo[103203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29956 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02E5030000000001030307) 
Nov 28 09:03:42 np0005538513.localdomain python3.9[103205]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:03:42 np0005538513.localdomain sudo[103203]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:42 np0005538513.localdomain sudo[103146]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:42 np0005538513.localdomain sudo[103328]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onxyatbivcsadcunnvprhtoprnesnfzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320622.6626248-110-12614750603953/AnsiballZ_command.py
Nov 28 09:03:42 np0005538513.localdomain sudo[103328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:43 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5556 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02E7C20000000001030307) 
Nov 28 09:03:43 np0005538513.localdomain python3.9[103330]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:03:43 np0005538513.localdomain sudo[103328]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:43 np0005538513.localdomain sudo[103346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:03:43 np0005538513.localdomain sudo[103346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:03:43 np0005538513.localdomain sudo[103346]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:43 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60513 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02E9820000000001030307) 
Nov 28 09:03:43 np0005538513.localdomain sudo[103436]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bidkllunzdvaeohpgddjdmmnjvvpkjef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320623.4914544-137-72024585699169/AnsiballZ_command.py
Nov 28 09:03:43 np0005538513.localdomain sudo[103436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:43 np0005538513.localdomain python3.9[103438]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:03:43 np0005538513.localdomain sudo[103436]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:03:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:03:44 np0005538513.localdomain python3.9[103529]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 09:03:44 np0005538513.localdomain podman[103531]: 2025-11-28 09:03:44.839621287 +0000 UTC m=+0.072280322 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:03:44 np0005538513.localdomain systemd[1]: tmp-crun.ZnnGSl.mount: Deactivated successfully.
Nov 28 09:03:44 np0005538513.localdomain podman[103530]: 2025-11-28 09:03:44.858569495 +0000 UTC m=+0.090062105 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 28 09:03:44 np0005538513.localdomain podman[103531]: 2025-11-28 09:03:44.8887471 +0000 UTC m=+0.121406165 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Nov 28 09:03:44 np0005538513.localdomain podman[103531]: unhealthy
Nov 28 09:03:44 np0005538513.localdomain podman[103530]: 2025-11-28 09:03:44.899685609 +0000 UTC m=+0.131178239 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 09:03:44 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:44 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:03:44 np0005538513.localdomain podman[103530]: unhealthy
Nov 28 09:03:44 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:44 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:03:46 np0005538513.localdomain python3.9[103658]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:03:46 np0005538513.localdomain python3.9[103750]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 09:03:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5557 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02F7820000000001030307) 
Nov 28 09:03:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32509 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02F9820000000001030307) 
Nov 28 09:03:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5398 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02FB1A0000000001030307) 
Nov 28 09:03:48 np0005538513.localdomain python3.9[103840]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:03:48 np0005538513.localdomain python3.9[103888]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:03:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5399 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02FF430000000001030307) 
Nov 28 09:03:49 np0005538513.localdomain sshd[102893]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:03:49 np0005538513.localdomain systemd-logind[764]: Session 36 logged out. Waiting for processes to exit.
Nov 28 09:03:49 np0005538513.localdomain systemd[1]: session-36.scope: Deactivated successfully.
Nov 28 09:03:49 np0005538513.localdomain systemd[1]: session-36.scope: Consumed 5.017s CPU time.
Nov 28 09:03:49 np0005538513.localdomain systemd-logind[764]: Removed session 36.
Nov 28 09:03:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5400 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0307420000000001030307) 
Nov 28 09:03:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5401 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0317020000000001030307) 
Nov 28 09:03:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40253 DF PROTO=TCP SPT=38946 DPT=9101 SEQ=3847402935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0322090000000001030307) 
Nov 28 09:04:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40255 DF PROTO=TCP SPT=38946 DPT=9101 SEQ=3847402935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB032E030000000001030307) 
Nov 28 09:04:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:04:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:04:01 np0005538513.localdomain systemd[1]: tmp-crun.Ho0RdK.mount: Deactivated successfully.
Nov 28 09:04:01 np0005538513.localdomain podman[103905]: 2025-11-28 09:04:01.867468475 +0000 UTC m=+0.096903755 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12)
Nov 28 09:04:01 np0005538513.localdomain podman[103905]: 2025-11-28 09:04:01.881311124 +0000 UTC m=+0.110746424 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:04:01 np0005538513.localdomain podman[103904]: 2025-11-28 09:04:01.910134687 +0000 UTC m=+0.140046333 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 09:04:01 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:04:01 np0005538513.localdomain podman[103904]: 2025-11-28 09:04:01.948365243 +0000 UTC m=+0.178276849 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible)
Nov 28 09:04:01 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:04:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7653 DF PROTO=TCP SPT=51962 DPT=9102 SEQ=3559992270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0337020000000001030307) 
Nov 28 09:04:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:04:03 np0005538513.localdomain podman[103942]: 2025-11-28 09:04:03.830094031 +0000 UTC m=+0.071613502 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z)
Nov 28 09:04:04 np0005538513.localdomain podman[103942]: 2025-11-28 09:04:04.034310843 +0000 UTC m=+0.275830224 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Nov 28 09:04:04 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:04:04 np0005538513.localdomain sshd[103971]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:04:04 np0005538513.localdomain sshd[103971]: Accepted publickey for zuul from 192.168.122.30 port 36580 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:04:04 np0005538513.localdomain systemd-logind[764]: New session 37 of user zuul.
Nov 28 09:04:04 np0005538513.localdomain systemd[1]: Started Session 37 of User zuul.
Nov 28 09:04:04 np0005538513.localdomain sshd[103971]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:04:05 np0005538513.localdomain sudo[104064]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqcteklywtuqrtbmulnewqpwahhlgise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320644.909837-23-212902688191958/AnsiballZ_systemd_service.py
Nov 28 09:04:05 np0005538513.localdomain sudo[104064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:04:05 np0005538513.localdomain python3.9[104066]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:04:05 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:04:05 np0005538513.localdomain systemd-rc-local-generator[104092]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:04:05 np0005538513.localdomain systemd-sysv-generator[104097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:04:06 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:06 np0005538513.localdomain sudo[104064]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1354 DF PROTO=TCP SPT=60910 DPT=9100 SEQ=1665886645 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0342820000000001030307) 
Nov 28 09:04:07 np0005538513.localdomain python3.9[104193]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:04:07 np0005538513.localdomain network[104210]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:04:07 np0005538513.localdomain network[104211]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:04:07 np0005538513.localdomain network[104212]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:04:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:04:08 np0005538513.localdomain podman[104234]: 2025-11-28 09:04:08.059241965 +0000 UTC m=+0.092530390 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:04:08 np0005538513.localdomain podman[104234]: 2025-11-28 09:04:08.103033463 +0000 UTC m=+0.136321908 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4)
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: tmp-crun.t1xnbp.mount: Deactivated successfully.
Nov 28 09:04:08 np0005538513.localdomain podman[104234]: unhealthy
Nov 28 09:04:08 np0005538513.localdomain podman[104251]: 2025-11-28 09:04:08.171369862 +0000 UTC m=+0.121457907 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'.
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:04:08 np0005538513.localdomain podman[104286]: 2025-11-28 09:04:08.264087678 +0000 UTC m=+0.124146651 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, vcs-type=git, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z)
Nov 28 09:04:08 np0005538513.localdomain podman[104269]: 2025-11-28 09:04:08.225064638 +0000 UTC m=+0.143406169 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, url=https://www.redhat.com)
Nov 28 09:04:08 np0005538513.localdomain podman[104251]: 2025-11-28 09:04:08.283069416 +0000 UTC m=+0.233157451 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, 
com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:04:08 np0005538513.localdomain podman[104269]: 2025-11-28 09:04:08.308155484 +0000 UTC m=+0.226496955 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Nov 28 09:04:08 np0005538513.localdomain podman[104269]: unhealthy
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 09:04:08 np0005538513.localdomain podman[104286]: 2025-11-28 09:04:08.359539257 +0000 UTC m=+0.219598260 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:04:08 np0005538513.localdomain podman[104286]: unhealthy
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 09:04:08 np0005538513.localdomain podman[104320]: 2025-11-28 09:04:08.454719059 +0000 UTC m=+0.185265967 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, 
com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 09:04:08 np0005538513.localdomain podman[104320]: 2025-11-28 09:04:08.849407736 +0000 UTC m=+0.579954604 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 
nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:04:08 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:04:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7655 DF PROTO=TCP SPT=51962 DPT=9102 SEQ=3559992270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB034EC20000000001030307) 
Nov 28 09:04:11 np0005538513.localdomain python3.9[104510]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:04:11 np0005538513.localdomain network[104527]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:04:11 np0005538513.localdomain network[104528]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:04:11 np0005538513.localdomain network[104529]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:04:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1356 DF PROTO=TCP SPT=60910 DPT=9100 SEQ=1665886645 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB035A420000000001030307) 
Nov 28 09:04:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:15 np0005538513.localdomain sudo[104726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpeyhakzzgcnlynybqfbqpqeajzlzkss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320654.856202-113-51793428359402/AnsiballZ_systemd_service.py
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:04:15 np0005538513.localdomain sudo[104726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: tmp-crun.vnfl5d.mount: Deactivated successfully.
Nov 28 09:04:15 np0005538513.localdomain podman[104728]: 2025-11-28 09:04:15.237099152 +0000 UTC m=+0.101224010 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64)
Nov 28 09:04:15 np0005538513.localdomain podman[104728]: 2025-11-28 09:04:15.252364045 +0000 UTC m=+0.116488913 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Nov 28 09:04:15 np0005538513.localdomain podman[104728]: unhealthy
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:04:15 np0005538513.localdomain podman[104729]: 2025-11-28 09:04:15.325238454 +0000 UTC m=+0.182115928 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 28 09:04:15 np0005538513.localdomain podman[104729]: 2025-11-28 09:04:15.343613294 +0000 UTC m=+0.200490768 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044)
Nov 28 09:04:15 np0005538513.localdomain podman[104729]: unhealthy
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:04:15 np0005538513.localdomain python3.9[104730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:04:15 np0005538513.localdomain systemd-sysv-generator[104799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:04:15 np0005538513.localdomain systemd-rc-local-generator[104795]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Nov 28 09:04:15 np0005538513.localdomain recover_tripleo_nova_virtqemud[104812]: 61397
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:04:15 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:04:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4591 DF PROTO=TCP SPT=36590 DPT=9882 SEQ=2705474875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB036CC20000000001030307) 
Nov 28 09:04:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17534 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0374420000000001030307) 
Nov 28 09:04:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17535 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB037C420000000001030307) 
Nov 28 09:04:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17536 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB038C020000000001030307) 
Nov 28 09:04:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20513 DF PROTO=TCP SPT=37358 DPT=9101 SEQ=38800111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0397390000000001030307) 
Nov 28 09:04:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20515 DF PROTO=TCP SPT=37358 DPT=9101 SEQ=38800111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03A3420000000001030307) 
Nov 28 09:04:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:04:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:04:32 np0005538513.localdomain podman[104828]: 2025-11-28 09:04:32.108793136 +0000 UTC m=+0.090414944 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 09:04:32 np0005538513.localdomain podman[104827]: 2025-11-28 09:04:32.15307166 +0000 UTC m=+0.134687838 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container)
Nov 28 09:04:32 np0005538513.localdomain podman[104827]: 2025-11-28 09:04:32.161592974 +0000 UTC m=+0.143209102 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:04:32 np0005538513.localdomain podman[104828]: 2025-11-28 09:04:32.175532166 +0000 UTC m=+0.157154034 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20251118.1)
Nov 28 09:04:32 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:04:32 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:04:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17537 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03AB820000000001030307) 
Nov 28 09:04:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:04:34 np0005538513.localdomain systemd[1]: tmp-crun.VB3gP4.mount: Deactivated successfully.
Nov 28 09:04:34 np0005538513.localdomain podman[104867]: 2025-11-28 09:04:34.459523396 +0000 UTC m=+0.092306503 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Nov 28 09:04:34 np0005538513.localdomain podman[104867]: 2025-11-28 09:04:34.654296186 +0000 UTC m=+0.287079213 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public)
Nov 28 09:04:34 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:04:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32511 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03B7830000000001030307) 
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:04:38 np0005538513.localdomain podman[104898]: 2025-11-28 09:04:38.603612414 +0000 UTC m=+0.085331647 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, name=rhosp17/openstack-nova-compute)
Nov 28 09:04:38 np0005538513.localdomain podman[104896]: Error: container 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 is not running
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'.
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: tmp-crun.lQDgVk.mount: Deactivated successfully.
Nov 28 09:04:38 np0005538513.localdomain podman[104897]: 2025-11-28 09:04:38.716801314 +0000 UTC m=+0.202505760 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, version=17.1.12, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-cron-container)
Nov 28 09:04:38 np0005538513.localdomain podman[104898]: 2025-11-28 09:04:38.728770184 +0000 UTC m=+0.210489397 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:04:38 np0005538513.localdomain podman[104898]: unhealthy
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 09:04:38 np0005538513.localdomain podman[104897]: 2025-11-28 09:04:38.749312982 +0000 UTC m=+0.235017388 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true)
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:04:38 np0005538513.localdomain podman[104904]: 2025-11-28 09:04:38.814242655 +0000 UTC m=+0.291956073 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi)
Nov 28 09:04:38 np0005538513.localdomain podman[104904]: 2025-11-28 09:04:38.853477942 +0000 UTC m=+0.331191380 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, config_id=tripleo_step4)
Nov 28 09:04:38 np0005538513.localdomain podman[104904]: unhealthy
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 09:04:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:04:38 np0005538513.localdomain podman[104968]: 2025-11-28 09:04:38.973316268 +0000 UTC m=+0.083333125 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 09:04:39 np0005538513.localdomain podman[104968]: 2025-11-28 09:04:39.335819588 +0000 UTC m=+0.445836435 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 09:04:39 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:04:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29959 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03C3820000000001030307) 
Nov 28 09:04:39 np0005538513.localdomain systemd[1]: tmp-crun.jGx6ku.mount: Deactivated successfully.
Nov 28 09:04:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17093 DF PROTO=TCP SPT=47480 DPT=9100 SEQ=2077246492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03CF820000000001030307) 
Nov 28 09:04:43 np0005538513.localdomain sudo[104990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:04:43 np0005538513.localdomain sudo[104990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:04:43 np0005538513.localdomain sudo[104990]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:43 np0005538513.localdomain sudo[105005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:04:43 np0005538513.localdomain sudo[105005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:04:44 np0005538513.localdomain sudo[105005]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:45 np0005538513.localdomain sudo[105052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:04:45 np0005538513.localdomain sudo[105052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:04:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:04:45 np0005538513.localdomain sudo[105052]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:04:45 np0005538513.localdomain systemd[1]: tmp-crun.gB2nSj.mount: Deactivated successfully.
Nov 28 09:04:45 np0005538513.localdomain podman[105068]: 2025-11-28 09:04:45.463446879 +0000 UTC m=+0.086983368 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 09:04:45 np0005538513.localdomain podman[105067]: 2025-11-28 09:04:45.483522452 +0000 UTC m=+0.106387590 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public)
Nov 28 09:04:45 np0005538513.localdomain podman[105068]: 2025-11-28 09:04:45.512596813 +0000 UTC m=+0.136133282 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:04:45 np0005538513.localdomain podman[105068]: unhealthy
Nov 28 09:04:45 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:45 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:04:45 np0005538513.localdomain podman[105067]: 2025-11-28 09:04:45.526529985 +0000 UTC m=+0.149395103 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Nov 28 09:04:45 np0005538513.localdomain podman[105067]: unhealthy
Nov 28 09:04:45 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:45 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:04:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3368 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1245516862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03E2020000000001030307) 
Nov 28 09:04:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44877 DF PROTO=TCP SPT=35524 DPT=9105 SEQ=2507938630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03E9820000000001030307) 
Nov 28 09:04:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44878 DF PROTO=TCP SPT=35524 DPT=9105 SEQ=2507938630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03F1830000000001030307) 
Nov 28 09:04:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44879 DF PROTO=TCP SPT=35524 DPT=9105 SEQ=2507938630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0401420000000001030307) 
Nov 28 09:04:58 np0005538513.localdomain podman[104814]: time="2025-11-28T09:04:58Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: libpod-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope: Deactivated successfully.
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: libpod-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope: Consumed 6.001s CPU time.
Nov 28 09:04:58 np0005538513.localdomain podman[104814]: 2025-11-28 09:04:58.027835346 +0000 UTC m=+42.083970291 container stop 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 28 09:04:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31348 DF PROTO=TCP SPT=41632 DPT=9101 SEQ=2572632617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB040C6A0000000001030307) 
Nov 28 09:04:58 np0005538513.localdomain podman[104814]: 2025-11-28 09:04:58.064284946 +0000 UTC m=+42.120419891 container died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: Deactivated successfully.
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: No such file or directory
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5-userdata-shm.mount: Deactivated successfully.
Nov 28 09:04:58 np0005538513.localdomain podman[104814]: 2025-11-28 09:04:58.124273727 +0000 UTC m=+42.180408632 container cleanup 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:04:58 np0005538513.localdomain podman[104814]: ceilometer_agent_compute
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: No such file or directory
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: No such file or directory
Nov 28 09:04:58 np0005538513.localdomain podman[105110]: 2025-11-28 09:04:58.177056904 +0000 UTC m=+0.126910307 container cleanup 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12)
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: libpod-conmon-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope: Deactivated successfully.
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: No such file or directory
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: No such file or directory
Nov 28 09:04:58 np0005538513.localdomain podman[105128]: 2025-11-28 09:04:58.287824318 +0000 UTC m=+0.070391154 container cleanup 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute)
Nov 28 09:04:58 np0005538513.localdomain podman[105128]: ceilometer_agent_compute
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Nov 28 09:04:58 np0005538513.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.098s CPU time, no IO.
Nov 28 09:04:58 np0005538513.localdomain sudo[104726]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:58 np0005538513.localdomain sudo[105230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twtkczrobyuleyfxooqunqkiwhebnhim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320698.4538002-113-135338067873816/AnsiballZ_systemd_service.py
Nov 28 09:04:58 np0005538513.localdomain sudo[105230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:04:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-9f3450f8dbd8f52854977f74bd961373e3aeac1471ae57db291ae89b64fa40dd-merged.mount: Deactivated successfully.
Nov 28 09:04:59 np0005538513.localdomain python3.9[105232]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:04:59 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:04:59 np0005538513.localdomain systemd-rc-local-generator[105255]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:04:59 np0005538513.localdomain systemd-sysv-generator[105262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:04:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:59 np0005538513.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Nov 28 09:05:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31350 DF PROTO=TCP SPT=41632 DPT=9101 SEQ=2572632617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0418820000000001030307) 
Nov 28 09:05:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:05:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:05:02 np0005538513.localdomain systemd[1]: tmp-crun.Ro8dQo.mount: Deactivated successfully.
Nov 28 09:05:02 np0005538513.localdomain podman[105289]: 2025-11-28 09:05:02.356190896 +0000 UTC m=+0.090388673 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:05:02 np0005538513.localdomain podman[105288]: 2025-11-28 09:05:02.40630592 +0000 UTC m=+0.142452937 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Nov 28 09:05:02 np0005538513.localdomain podman[105288]: 2025-11-28 09:05:02.414857655 +0000 UTC m=+0.151004723 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 28 09:05:02 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:05:02 np0005538513.localdomain podman[105289]: 2025-11-28 09:05:02.468279283 +0000 UTC m=+0.202477030 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 28 09:05:02 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:05:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48331 DF PROTO=TCP SPT=33600 DPT=9102 SEQ=2680091707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0421420000000001030307) 
Nov 28 09:05:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:05:04 np0005538513.localdomain systemd[1]: tmp-crun.fdLolF.mount: Deactivated successfully.
Nov 28 09:05:04 np0005538513.localdomain podman[105325]: 2025-11-28 09:05:04.862035846 +0000 UTC m=+0.095198103 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vcs-type=git)
Nov 28 09:05:05 np0005538513.localdomain podman[105325]: 2025-11-28 09:05:05.081553942 +0000 UTC m=+0.314716209 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044)
Nov 28 09:05:05 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:05:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21990 DF PROTO=TCP SPT=56088 DPT=9100 SEQ=1367658504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB042D020000000001030307) 
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: tmp-crun.IPytoW.mount: Deactivated successfully.
Nov 28 09:05:09 np0005538513.localdomain podman[105355]: 2025-11-28 09:05:09.109871641 +0000 UTC m=+0.088378491 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:05:09 np0005538513.localdomain podman[105355]: 2025-11-28 09:05:09.134379161 +0000 UTC m=+0.112886051 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: tmp-crun.Dk2P6H.mount: Deactivated successfully.
Nov 28 09:05:09 np0005538513.localdomain podman[105354]: 2025-11-28 09:05:09.164487955 +0000 UTC m=+0.144970497 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron)
Nov 28 09:05:09 np0005538513.localdomain podman[105356]: Error: container f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b is not running
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 09:05:09 np0005538513.localdomain podman[105354]: 2025-11-28 09:05:09.200870433 +0000 UTC m=+0.181352945 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:05:09 np0005538513.localdomain podman[105355]: unhealthy
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 09:05:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48333 DF PROTO=TCP SPT=33600 DPT=9102 SEQ=2680091707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0439020000000001030307) 
Nov 28 09:05:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:05:09 np0005538513.localdomain podman[105410]: 2025-11-28 09:05:09.840900778 +0000 UTC m=+0.076545584 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 28 09:05:10 np0005538513.localdomain podman[105410]: 2025-11-28 09:05:10.252383567 +0000 UTC m=+0.488028373 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_migration_target, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO 
Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:05:10 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:05:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21992 DF PROTO=TCP SPT=56088 DPT=9100 SEQ=1367658504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0444C20000000001030307) 
Nov 28 09:05:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:05:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:05:15 np0005538513.localdomain podman[105433]: 2025-11-28 09:05:15.845629879 +0000 UTC m=+0.082724547 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:05:15 np0005538513.localdomain podman[105433]: 2025-11-28 09:05:15.889530369 +0000 UTC m=+0.126624977 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:05:15 np0005538513.localdomain podman[105433]: unhealthy
Nov 28 09:05:15 np0005538513.localdomain systemd[1]: tmp-crun.A9Xm4C.mount: Deactivated successfully.
Nov 28 09:05:15 np0005538513.localdomain podman[105434]: 2025-11-28 09:05:15.900353505 +0000 UTC m=+0.133926984 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller)
Nov 28 09:05:15 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:15 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:05:15 np0005538513.localdomain podman[105434]: 2025-11-28 09:05:15.944387211 +0000 UTC m=+0.177960700 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, distribution-scope=public, 
tcib_managed=true, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:05:15 np0005538513.localdomain podman[105434]: unhealthy
Nov 28 09:05:15 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:15 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:05:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20651 DF PROTO=TCP SPT=44250 DPT=9882 SEQ=949298037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0457420000000001030307) 
Nov 28 09:05:18 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40133 DF PROTO=TCP SPT=49662 DPT=9105 SEQ=1209768136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB045AAB0000000001030307) 
Nov 28 09:05:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40135 DF PROTO=TCP SPT=49662 DPT=9105 SEQ=1209768136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0466C20000000001030307) 
Nov 28 09:05:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40136 DF PROTO=TCP SPT=49662 DPT=9105 SEQ=1209768136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0476820000000001030307) 
Nov 28 09:05:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53947 DF PROTO=TCP SPT=44546 DPT=9101 SEQ=2264730479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04819C0000000001030307) 
Nov 28 09:05:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53949 DF PROTO=TCP SPT=44546 DPT=9101 SEQ=2264730479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB048DC30000000001030307) 
Nov 28 09:05:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:05:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:05:32 np0005538513.localdomain systemd[1]: tmp-crun.vW94hE.mount: Deactivated successfully.
Nov 28 09:05:32 np0005538513.localdomain podman[105472]: 2025-11-28 09:05:32.600291064 +0000 UTC m=+0.081983123 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true)
Nov 28 09:05:32 np0005538513.localdomain podman[105472]: 2025-11-28 09:05:32.637798687 +0000 UTC m=+0.119490736 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 09:05:32 np0005538513.localdomain podman[105473]: 2025-11-28 09:05:32.652256985 +0000 UTC m=+0.129841327 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 09:05:32 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 09:05:32 np0005538513.localdomain podman[105473]: 2025-11-28 09:05:32.688353575 +0000 UTC m=+0.165937877 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:05:32 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 09:05:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54987 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=2920543412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0496820000000001030307) 
Nov 28 09:05:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:05:35 np0005538513.localdomain podman[105511]: 2025-11-28 09:05:35.343741131 +0000 UTC m=+0.085018487 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Nov 28 09:05:35 np0005538513.localdomain podman[105511]: 2025-11-28 09:05:35.559499361 +0000 UTC m=+0.300776737 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 09:05:35 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 09:05:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19474 DF PROTO=TCP SPT=47204 DPT=9102 SEQ=200258602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04A1830000000001030307) 
Nov 28 09:05:37 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:05:37 np0005538513.localdomain recover_tripleo_nova_virtqemud[105541]: 61397
Nov 28 09:05:37 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:05:37 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:05:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17096 DF PROTO=TCP SPT=47480 DPT=9100 SEQ=2077246492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04AD820000000001030307) 
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: tmp-crun.UNz2Wb.mount: Deactivated successfully.
Nov 28 09:05:39 np0005538513.localdomain podman[105544]: Error: container f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b is not running
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'.
Nov 28 09:05:39 np0005538513.localdomain podman[105542]: 2025-11-28 09:05:39.853114824 +0000 UTC m=+0.084173621 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, 
com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044)
Nov 28 09:05:39 np0005538513.localdomain podman[105543]: 2025-11-28 09:05:39.918694487 +0000 UTC m=+0.147096543 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com)
Nov 28 09:05:39 np0005538513.localdomain podman[105542]: 2025-11-28 09:05:39.936340514 +0000 UTC m=+0.167399311 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, name=rhosp17/openstack-cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully.
Nov 28 09:05:39 np0005538513.localdomain podman[105543]: 2025-11-28 09:05:39.967423887 +0000 UTC m=+0.195825933 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12)
Nov 28 09:05:39 np0005538513.localdomain podman[105543]: unhealthy
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:39 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 09:05:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:05:40 np0005538513.localdomain podman[105596]: 2025-11-28 09:05:40.848218069 +0000 UTC m=+0.086485683 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:05:41 np0005538513.localdomain podman[105596]: 2025-11-28 09:05:41.24845927 +0000 UTC m=+0.486726834 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:05:41 np0005538513.localdomain podman[105273]: time="2025-11-28T09:05:41Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: libpod-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope: Deactivated successfully.
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: libpod-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope: Consumed 6.086s CPU time.
Nov 28 09:05:41 np0005538513.localdomain podman[105273]: 2025-11-28 09:05:41.621836307 +0000 UTC m=+42.092958188 container died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=)
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: Deactivated successfully.
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: No such file or directory
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:41 np0005538513.localdomain podman[105273]: 2025-11-28 09:05:41.671712454 +0000 UTC m=+42.142834315 container cleanup f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, url=https://www.redhat.com, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Nov 28 09:05:41 np0005538513.localdomain podman[105273]: ceilometer_agent_ipmi
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: No such file or directory
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: No such file or directory
Nov 28 09:05:41 np0005538513.localdomain podman[105620]: 2025-11-28 09:05:41.709629909 +0000 UTC m=+0.078262478 container cleanup f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: libpod-conmon-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope: Deactivated successfully.
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: No such file or directory
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: No such file or directory
Nov 28 09:05:41 np0005538513.localdomain podman[105633]: 2025-11-28 09:05:41.810461535 +0000 UTC m=+0.068831194 container cleanup f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:12:45Z, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 09:05:41 np0005538513.localdomain podman[105633]: ceilometer_agent_ipmi
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Nov 28 09:05:41 np0005538513.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Nov 28 09:05:41 np0005538513.localdomain sudo[105230]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:42 np0005538513.localdomain sudo[105735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dngfpxrbbgjbwhdguliowvyfznhhtzaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320742.0061426-113-89060883737624/AnsiballZ_systemd_service.py
Nov 28 09:05:42 np0005538513.localdomain sudo[105735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54619 DF PROTO=TCP SPT=49072 DPT=9100 SEQ=2495149367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04B9C20000000001030307) 
Nov 28 09:05:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b9124600c137d24812fa12ae9a3723386ce2301b24a72f8d26d11f6206571e1d-merged.mount: Deactivated successfully.
Nov 28 09:05:42 np0005538513.localdomain python3.9[105737]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:42 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:05:42 np0005538513.localdomain systemd-rc-local-generator[105766]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:42 np0005538513.localdomain systemd-sysv-generator[105770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: Stopping collectd container...
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: tmp-crun.UVRp2z.mount: Deactivated successfully.
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: libpod-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope: Deactivated successfully.
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: libpod-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope: Consumed 2.099s CPU time.
Nov 28 09:05:43 np0005538513.localdomain podman[105778]: 2025-11-28 09:05:43.20446602 +0000 UTC m=+0.162030055 container died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: Deactivated successfully.
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: No such file or directory
Nov 28 09:05:43 np0005538513.localdomain podman[105778]: 2025-11-28 09:05:43.292946733 +0000 UTC m=+0.250510718 container cleanup 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 09:05:43 np0005538513.localdomain podman[105778]: collectd
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: No such file or directory
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: No such file or directory
Nov 28 09:05:43 np0005538513.localdomain podman[105792]: 2025-11-28 09:05:43.314394118 +0000 UTC m=+0.096141951 container cleanup 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, config_id=tripleo_step3)
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: libpod-conmon-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope: Deactivated successfully.
Nov 28 09:05:43 np0005538513.localdomain podman[105818]: error opening file `/run/crun/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c/status`: No such file or directory
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: No such file or directory
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: No such file or directory
Nov 28 09:05:43 np0005538513.localdomain podman[105807]: 2025-11-28 09:05:43.419193068 +0000 UTC m=+0.073293124 container cleanup 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:05:43 np0005538513.localdomain podman[105807]: collectd
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: Stopped collectd container.
Nov 28 09:05:43 np0005538513.localdomain sudo[105735]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266-merged.mount: Deactivated successfully.
Nov 28 09:05:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:43 np0005538513.localdomain sudo[105912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sorninhoeqfhhktqwlxnkuzcycjuupjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320743.5893538-113-245928845323180/AnsiballZ_systemd_service.py
Nov 28 09:05:43 np0005538513.localdomain sudo[105912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:44 np0005538513.localdomain python3.9[105914]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:05:44 np0005538513.localdomain systemd-rc-local-generator[105944]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:44 np0005538513.localdomain systemd-sysv-generator[105947]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: Stopping iscsid container...
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: tmp-crun.SDRykm.mount: Deactivated successfully.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: libpod-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope: Deactivated successfully.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: libpod-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope: Consumed 1.103s CPU time.
Nov 28 09:05:44 np0005538513.localdomain podman[105955]: 2025-11-28 09:05:44.72798901 +0000 UTC m=+0.075358297 container died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com)
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: Deactivated successfully.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: No such file or directory
Nov 28 09:05:44 np0005538513.localdomain podman[105955]: 2025-11-28 09:05:44.821074247 +0000 UTC m=+0.168443524 container cleanup 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible)
Nov 28 09:05:44 np0005538513.localdomain podman[105955]: iscsid
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: No such file or directory
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: No such file or directory
Nov 28 09:05:44 np0005538513.localdomain podman[105967]: 2025-11-28 09:05:44.831876982 +0000 UTC m=+0.103354166 container cleanup 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: libpod-conmon-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope: Deactivated successfully.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: No such file or directory
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: No such file or directory
Nov 28 09:05:44 np0005538513.localdomain podman[105984]: 2025-11-28 09:05:44.931870942 +0000 UTC m=+0.066742440 container cleanup 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO 
Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible)
Nov 28 09:05:44 np0005538513.localdomain podman[105984]: iscsid
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Nov 28 09:05:44 np0005538513.localdomain systemd[1]: Stopped iscsid container.
Nov 28 09:05:44 np0005538513.localdomain sudo[105912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:45 np0005538513.localdomain sudo[106086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymskpchljdbebqjljfgwkczhfajdoeso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320745.1240249-113-245569098122719/AnsiballZ_systemd_service.py
Nov 28 09:05:45 np0005538513.localdomain sudo[106086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:45 np0005538513.localdomain sudo[106089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:05:45 np0005538513.localdomain sudo[106089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:05:45 np0005538513.localdomain sudo[106089]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:45 np0005538513.localdomain python3.9[106088]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:45 np0005538513.localdomain sudo[106104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:05:45 np0005538513.localdomain sudo[106104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:05:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed-merged.mount: Deactivated successfully.
Nov 28 09:05:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:45 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:05:45 np0005538513.localdomain systemd-rc-local-generator[106144]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:45 np0005538513.localdomain systemd-sysv-generator[106149]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:45 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: Stopping logrotate_crond container...
Nov 28 09:05:46 np0005538513.localdomain podman[106173]: 2025-11-28 09:05:46.168421774 +0000 UTC m=+0.087552365 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, release=1761123044, build-date=2025-11-18T23:34:05Z)
Nov 28 09:05:46 np0005538513.localdomain podman[106172]: 2025-11-28 09:05:46.223403199 +0000 UTC m=+0.140526549 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 09:05:46 np0005538513.localdomain podman[106173]: 2025-11-28 09:05:46.236714111 +0000 UTC m=+0.155844732 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1)
Nov 28 09:05:46 np0005538513.localdomain podman[106173]: unhealthy
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:05:46 np0005538513.localdomain crond[70064]: (CRON) INFO (Shutting down)
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: libpod-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope: Deactivated successfully.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: libpod-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope: Consumed 1.023s CPU time.
Nov 28 09:05:46 np0005538513.localdomain podman[106175]: 2025-11-28 09:05:46.319780187 +0000 UTC m=+0.233280494 container died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: Deactivated successfully.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: No such file or directory
Nov 28 09:05:46 np0005538513.localdomain podman[106172]: 2025-11-28 09:05:46.344292828 +0000 UTC m=+0.261416108 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, distribution-scope=public, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 28 09:05:46 np0005538513.localdomain podman[106172]: unhealthy
Nov 28 09:05:46 np0005538513.localdomain sudo[106104]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:05:46 np0005538513.localdomain podman[106175]: 2025-11-28 09:05:46.41241244 +0000 UTC m=+0.325912747 container cleanup bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 28 09:05:46 np0005538513.localdomain podman[106175]: logrotate_crond
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: No such file or directory
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: No such file or directory
Nov 28 09:05:46 np0005538513.localdomain podman[106243]: 2025-11-28 09:05:46.433285766 +0000 UTC m=+0.107830904 container cleanup bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: libpod-conmon-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope: Deactivated successfully.
Nov 28 09:05:46 np0005538513.localdomain podman[106269]: error opening file `/run/crun/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3/status`: No such file or directory
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: No such file or directory
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: No such file or directory
Nov 28 09:05:46 np0005538513.localdomain podman[106258]: 2025-11-28 09:05:46.540478781 +0000 UTC m=+0.072638074 container cleanup bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack 
Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64)
Nov 28 09:05:46 np0005538513.localdomain podman[106258]: logrotate_crond
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: Stopped logrotate_crond container.
Nov 28 09:05:46 np0005538513.localdomain sudo[106086]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1-merged.mount: Deactivated successfully.
Nov 28 09:05:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:46 np0005538513.localdomain sudo[106344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:05:46 np0005538513.localdomain sudo[106344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:05:46 np0005538513.localdomain sudo[106344]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:46 np0005538513.localdomain sudo[106374]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntdnmanrravgkkgagzgxcgyuofvtwzgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320746.693609-113-47685089193806/AnsiballZ_systemd_service.py
Nov 28 09:05:46 np0005538513.localdomain sudo[106374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62197 DF PROTO=TCP SPT=54478 DPT=9882 SEQ=1118034570 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04CC420000000001030307) 
Nov 28 09:05:47 np0005538513.localdomain python3.9[106377]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:05:47 np0005538513.localdomain systemd-rc-local-generator[106402]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:47 np0005538513.localdomain systemd-sysv-generator[106407]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: Stopping metrics_qdr container...
Nov 28 09:05:47 np0005538513.localdomain kernel: qdrouterd[54084]: segfault at 0 ip 00007f65b00807cb sp 00007ffcc2077190 error 4 in libc.so.6[7f65b001d000+175000]
Nov 28 09:05:47 np0005538513.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: Started Process Core Dump (PID 106430/UID 0).
Nov 28 09:05:47 np0005538513.localdomain systemd-coredump[106431]: Resource limits disable core dumping for process 54084 (qdrouterd).
Nov 28 09:05:47 np0005538513.localdomain systemd-coredump[106431]: Process 54084 (qdrouterd) of user 42465 dumped core.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: systemd-coredump@0-106430-0.service: Deactivated successfully.
Nov 28 09:05:47 np0005538513.localdomain podman[106418]: 2025-11-28 09:05:47.894162765 +0000 UTC m=+0.233530362 container died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public)
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: libpod-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope: Deactivated successfully.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: libpod-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope: Consumed 27.231s CPU time.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: Deactivated successfully.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: No such file or directory
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9-merged.mount: Deactivated successfully.
Nov 28 09:05:47 np0005538513.localdomain podman[106418]: 2025-11-28 09:05:47.953991459 +0000 UTC m=+0.293359056 container cleanup 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr)
Nov 28 09:05:47 np0005538513.localdomain podman[106418]: metrics_qdr
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: No such file or directory
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: No such file or directory
Nov 28 09:05:47 np0005538513.localdomain podman[106435]: 2025-11-28 09:05:47.980845412 +0000 UTC m=+0.077873525 container cleanup 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, architecture=x86_64, release=1761123044, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container)
Nov 28 09:05:47 np0005538513.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Nov 28 09:05:48 np0005538513.localdomain systemd[1]: libpod-conmon-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope: Deactivated successfully.
Nov 28 09:05:48 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: No such file or directory
Nov 28 09:05:48 np0005538513.localdomain systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: No such file or directory
Nov 28 09:05:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23218 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04CFDB0000000001030307) 
Nov 28 09:05:48 np0005538513.localdomain podman[106452]: 2025-11-28 09:05:48.09655256 +0000 UTC m=+0.077783783 container cleanup 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_id=tripleo_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 09:05:48 np0005538513.localdomain podman[106452]: metrics_qdr
Nov 28 09:05:48 np0005538513.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Nov 28 09:05:48 np0005538513.localdomain systemd[1]: Stopped metrics_qdr container.
Nov 28 09:05:48 np0005538513.localdomain sudo[106374]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:48 np0005538513.localdomain sudo[106553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbijkzsoqjvgupprwsqqoqogsrjdqcnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320748.2971172-113-171551666900011/AnsiballZ_systemd_service.py
Nov 28 09:05:48 np0005538513.localdomain sudo[106553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:48 np0005538513.localdomain python3.9[106555]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:48 np0005538513.localdomain sudo[106553]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:49 np0005538513.localdomain sudo[106646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mulksxgyvhljnwuhanawbhcqcbartfrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320748.989231-113-124145724847222/AnsiballZ_systemd_service.py
Nov 28 09:05:49 np0005538513.localdomain sudo[106646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:49 np0005538513.localdomain python3.9[106648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:49 np0005538513.localdomain sudo[106646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:49 np0005538513.localdomain sudo[106739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfguhusgtrmyzjuqlfcqsrmmovbvnwsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320749.7025187-113-117931304304423/AnsiballZ_systemd_service.py
Nov 28 09:05:49 np0005538513.localdomain sudo[106739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:50 np0005538513.localdomain python3.9[106741]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:50 np0005538513.localdomain sudo[106739]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:50 np0005538513.localdomain sudo[106832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttsangcmchwvmqfzyblwwuxlzecxearp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320750.4180026-113-89176573250449/AnsiballZ_systemd_service.py
Nov 28 09:05:50 np0005538513.localdomain sudo[106832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:50 np0005538513.localdomain python3.9[106834]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:51 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:05:51 np0005538513.localdomain systemd-rc-local-generator[106860]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:51 np0005538513.localdomain systemd-sysv-generator[106866]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23220 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04DC020000000001030307) 
Nov 28 09:05:51 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:51 np0005538513.localdomain systemd[1]: Stopping nova_compute container...
Nov 28 09:05:51 np0005538513.localdomain systemd[1]: tmp-crun.1e26mL.mount: Deactivated successfully.
Nov 28 09:05:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23221 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04EBC20000000001030307) 
Nov 28 09:05:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59531 DF PROTO=TCP SPT=45700 DPT=9101 SEQ=242776502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04F6CA0000000001030307) 
Nov 28 09:06:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59533 DF PROTO=TCP SPT=45700 DPT=9101 SEQ=242776502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0502C20000000001030307) 
Nov 28 09:06:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23222 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB050B820000000001030307) 
Nov 28 09:06:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26901 DF PROTO=TCP SPT=41424 DPT=9100 SEQ=1474215830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0517420000000001030307) 
Nov 28 09:06:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21995 DF PROTO=TCP SPT=56088 DPT=9100 SEQ=1367658504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0523830000000001030307) 
Nov 28 09:06:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:06:10 np0005538513.localdomain podman[106886]: Error: container c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 is not running
Nov 28 09:06:10 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:06:10 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'.
Nov 28 09:06:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:06:11 np0005538513.localdomain podman[106897]: 2025-11-28 09:06:11.84900679 +0000 UTC m=+0.084008116 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Nov 28 09:06:12 np0005538513.localdomain podman[106897]: 2025-11-28 09:06:12.230478528 +0000 UTC m=+0.465479874 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, 
build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 09:06:12 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 09:06:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26903 DF PROTO=TCP SPT=41424 DPT=9100 SEQ=1474215830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB052F020000000001030307) 
Nov 28 09:06:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:06:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:06:16 np0005538513.localdomain podman[106921]: 2025-11-28 09:06:16.604672109 +0000 UTC m=+0.084964446 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 09:06:16 np0005538513.localdomain systemd[1]: tmp-crun.EDC0Is.mount: Deactivated successfully.
Nov 28 09:06:16 np0005538513.localdomain podman[106920]: 2025-11-28 09:06:16.65987356 +0000 UTC m=+0.142720276 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:06:16 np0005538513.localdomain podman[106921]: 2025-11-28 09:06:16.676740653 +0000 UTC m=+0.157032980 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller)
Nov 28 09:06:16 np0005538513.localdomain podman[106920]: 2025-11-28 09:06:16.679394356 +0000 UTC m=+0.162241092 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 09:06:16 np0005538513.localdomain podman[106920]: unhealthy
Nov 28 09:06:16 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:16 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:06:16 np0005538513.localdomain podman[106921]: unhealthy
Nov 28 09:06:16 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:16 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:06:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18806 DF PROTO=TCP SPT=58632 DPT=9882 SEQ=3487650905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0541830000000001030307) 
Nov 28 09:06:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59540 DF PROTO=TCP SPT=43388 DPT=9105 SEQ=2886964641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0549020000000001030307) 
Nov 28 09:06:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59541 DF PROTO=TCP SPT=43388 DPT=9105 SEQ=2886964641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0551020000000001030307) 
Nov 28 09:06:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59542 DF PROTO=TCP SPT=43388 DPT=9105 SEQ=2886964641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0560C20000000001030307) 
Nov 28 09:06:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49276 DF PROTO=TCP SPT=58512 DPT=9101 SEQ=2552739480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB056BFA0000000001030307) 
Nov 28 09:06:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49278 DF PROTO=TCP SPT=58512 DPT=9101 SEQ=2552739480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0578020000000001030307) 
Nov 28 09:06:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39596 DF PROTO=TCP SPT=37864 DPT=9102 SEQ=3041625844 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0581020000000001030307) 
Nov 28 09:06:33 np0005538513.localdomain podman[106874]: time="2025-11-28T09:06:33Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: libpod-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope: Deactivated successfully.
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: libpod-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope: Consumed 34.339s CPU time.
Nov 28 09:06:33 np0005538513.localdomain podman[106874]: 2025-11-28 09:06:33.56624692 +0000 UTC m=+42.146618142 container died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: Deactivated successfully.
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: No such file or directory
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91-merged.mount: Deactivated successfully.
Nov 28 09:06:33 np0005538513.localdomain podman[106874]: 2025-11-28 09:06:33.634836148 +0000 UTC m=+42.215207340 container cleanup c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Nov 28 09:06:33 np0005538513.localdomain podman[106874]: nova_compute
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: No such file or directory
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: No such file or directory
Nov 28 09:06:33 np0005538513.localdomain podman[106964]: 2025-11-28 09:06:33.67266095 +0000 UTC m=+0.135357068 container cleanup c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute)
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: libpod-conmon-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope: Deactivated successfully.
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: No such file or directory
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: No such file or directory
Nov 28 09:06:33 np0005538513.localdomain podman[106980]: 2025-11-28 09:06:33.785558291 +0000 UTC m=+0.070828848 container cleanup c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 09:06:33 np0005538513.localdomain podman[106980]: nova_compute
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: Stopped nova_compute container.
Nov 28 09:06:33 np0005538513.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.140s CPU time, no IO.
Nov 28 09:06:33 np0005538513.localdomain sudo[106832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:34 np0005538513.localdomain sudo[107082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogjodhfldusdcjyupwqwqurgjknrghdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320793.9638865-113-220713104538969/AnsiballZ_systemd_service.py
Nov 28 09:06:34 np0005538513.localdomain sudo[107082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:06:35 np0005538513.localdomain python3.9[107084]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:06:35 np0005538513.localdomain systemd-sysv-generator[107115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:06:35 np0005538513.localdomain systemd-rc-local-generator[107110]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: Stopping nova_migration_target container...
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: tmp-crun.Ve2Ng0.mount: Deactivated successfully.
Nov 28 09:06:35 np0005538513.localdomain sshd[70379]: Received signal 15; terminating.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: libpod-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope: Deactivated successfully.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: libpod-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope: Consumed 32.848s CPU time.
Nov 28 09:06:35 np0005538513.localdomain podman[107124]: 2025-11-28 09:06:35.695468552 +0000 UTC m=+0.084894233 container died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z)
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: Deactivated successfully.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: No such file or directory
Nov 28 09:06:35 np0005538513.localdomain podman[107124]: 2025-11-28 09:06:35.752257183 +0000 UTC m=+0.141682864 container cleanup 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible)
Nov 28 09:06:35 np0005538513.localdomain podman[107124]: nova_migration_target
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: No such file or directory
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: No such file or directory
Nov 28 09:06:35 np0005538513.localdomain podman[107138]: 2025-11-28 09:06:35.784638257 +0000 UTC m=+0.082860100 container cleanup 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: libpod-conmon-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope: Deactivated successfully.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: No such file or directory
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: No such file or directory
Nov 28 09:06:35 np0005538513.localdomain podman[107150]: 2025-11-28 09:06:35.876444514 +0000 UTC m=+0.057217365 container cleanup 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Nov 28 09:06:35 np0005538513.localdomain podman[107150]: nova_migration_target
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Nov 28 09:06:35 np0005538513.localdomain systemd[1]: Stopped nova_migration_target container.
Nov 28 09:06:35 np0005538513.localdomain sudo[107082]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:36 np0005538513.localdomain sudo[107250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gawhptgwtmhrujcqloboabjgclxihbly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320796.0659425-113-262460263134574/AnsiballZ_systemd_service.py
Nov 28 09:06:36 np0005538513.localdomain sudo[107250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:06:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36989 DF PROTO=TCP SPT=50824 DPT=9100 SEQ=1753447102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB058C820000000001030307) 
Nov 28 09:06:36 np0005538513.localdomain python3.9[107252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:06:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4f1ef42e70cf0e1a0a8a92baa1944c6e077a6987018321471d4c334fae1280ec-merged.mount: Deactivated successfully.
Nov 28 09:06:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019-userdata-shm.mount: Deactivated successfully.
Nov 28 09:06:36 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:06:36 np0005538513.localdomain systemd-sysv-generator[107281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:06:36 np0005538513.localdomain systemd-rc-local-generator[107276]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:06:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:06:37 np0005538513.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Nov 28 09:06:37 np0005538513.localdomain systemd[1]: libpod-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be.scope: Deactivated successfully.
Nov 28 09:06:37 np0005538513.localdomain podman[107292]: 2025-11-28 09:06:37.156973459 +0000 UTC m=+0.071115906 container died 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtlogd_wrapper, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:06:37 np0005538513.localdomain podman[107292]: 2025-11-28 09:06:37.19278036 +0000 UTC m=+0.106922777 container cleanup 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 09:06:37 np0005538513.localdomain podman[107292]: nova_virtlogd_wrapper
Nov 28 09:06:37 np0005538513.localdomain podman[107306]: 2025-11-28 09:06:37.220591652 +0000 UTC m=+0.059976151 container cleanup 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtlogd_wrapper, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1)
Nov 28 09:06:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183-merged.mount: Deactivated successfully.
Nov 28 09:06:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be-userdata-shm.mount: Deactivated successfully.
Nov 28 09:06:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54622 DF PROTO=TCP SPT=49072 DPT=9100 SEQ=2495149367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0597830000000001030307) 
Nov 28 09:06:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36991 DF PROTO=TCP SPT=50824 DPT=9100 SEQ=1753447102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05A4420000000001030307) 
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Activating special unit Exit the Session...
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Removed slice User Background Tasks Slice.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Stopped target Main User Target.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Stopped target Basic System.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Stopped target Paths.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Stopped target Sockets.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Stopped target Timers.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Closed D-Bus User Message Bus Socket.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Removed slice User Application Slice.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Reached target Shutdown.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Finished Exit the Session.
Nov 28 09:06:43 np0005538513.localdomain systemd[83313]: Reached target Exit the Session.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: user@0.service: Consumed 4.523s CPU time, no IO.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 09:06:43 np0005538513.localdomain systemd[1]: user-0.slice: Consumed 5.476s CPU time.
Nov 28 09:06:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:06:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:06:46 np0005538513.localdomain podman[107323]: 2025-11-28 09:06:46.849755065 +0000 UTC m=+0.083713397 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4)
Nov 28 09:06:46 np0005538513.localdomain podman[107323]: 2025-11-28 09:06:46.890359295 +0000 UTC m=+0.124317617 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, version=17.1.12)
Nov 28 09:06:46 np0005538513.localdomain podman[107323]: unhealthy
Nov 28 09:06:46 np0005538513.localdomain podman[107322]: 2025-11-28 09:06:46.903088179 +0000 UTC m=+0.138626989 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:06:46 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:46 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:06:46 np0005538513.localdomain podman[107322]: 2025-11-28 09:06:46.919434806 +0000 UTC m=+0.154973616 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible)
Nov 28 09:06:46 np0005538513.localdomain podman[107322]: unhealthy
Nov 28 09:06:46 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:46 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:06:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11542 DF PROTO=TCP SPT=55520 DPT=9882 SEQ=1934118378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05B6C30000000001030307) 
Nov 28 09:06:47 np0005538513.localdomain sudo[107364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:06:47 np0005538513.localdomain sudo[107364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:06:47 np0005538513.localdomain sudo[107364]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:47 np0005538513.localdomain sudo[107379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:06:47 np0005538513.localdomain sudo[107379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:06:47 np0005538513.localdomain sudo[107379]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:48 np0005538513.localdomain sudo[107425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:06:48 np0005538513.localdomain sudo[107425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:06:48 np0005538513.localdomain sudo[107425]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42345 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05BE420000000001030307) 
Nov 28 09:06:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42346 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05C6430000000001030307) 
Nov 28 09:06:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42347 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05D6020000000001030307) 
Nov 28 09:06:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10202 DF PROTO=TCP SPT=49960 DPT=9101 SEQ=3082867192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05E12B0000000001030307) 
Nov 28 09:07:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10204 DF PROTO=TCP SPT=49960 DPT=9101 SEQ=3082867192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05ED420000000001030307) 
Nov 28 09:07:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42348 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05F5820000000001030307) 
Nov 28 09:07:05 np0005538513.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:07:05 np0005538513.localdomain recover_tripleo_nova_virtqemud[107441]: 61397
Nov 28 09:07:05 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:07:05 np0005538513.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:07:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47026 DF PROTO=TCP SPT=46070 DPT=9102 SEQ=2352280691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0601830000000001030307) 
Nov 28 09:07:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26906 DF PROTO=TCP SPT=41424 DPT=9100 SEQ=1474215830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB060D830000000001030307) 
Nov 28 09:07:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39019 DF PROTO=TCP SPT=43748 DPT=9100 SEQ=68847106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0619820000000001030307) 
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: tmp-crun.xdByW0.mount: Deactivated successfully.
Nov 28 09:07:17 np0005538513.localdomain podman[107442]: 2025-11-28 09:07:17.097408443 +0000 UTC m=+0.085246184 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, tcib_managed=true)
Nov 28 09:07:17 np0005538513.localdomain podman[107442]: 2025-11-28 09:07:17.109898221 +0000 UTC m=+0.097735992 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 09:07:17 np0005538513.localdomain podman[107442]: unhealthy
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: tmp-crun.KowIQv.mount: Deactivated successfully.
Nov 28 09:07:17 np0005538513.localdomain podman[107443]: 2025-11-28 09:07:17.149790297 +0000 UTC m=+0.133752308 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, architecture=x86_64, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:07:17 np0005538513.localdomain podman[107443]: 2025-11-28 09:07:17.163216534 +0000 UTC m=+0.147178545 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, container_name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 09:07:17 np0005538513.localdomain podman[107443]: unhealthy
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:17 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:07:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52135 DF PROTO=TCP SPT=33378 DPT=9882 SEQ=2005616798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB062C020000000001030307) 
Nov 28 09:07:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9007 DF PROTO=TCP SPT=45510 DPT=9105 SEQ=2131438876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0633830000000001030307) 
Nov 28 09:07:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9008 DF PROTO=TCP SPT=45510 DPT=9105 SEQ=2131438876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB063B820000000001030307) 
Nov 28 09:07:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9009 DF PROTO=TCP SPT=45510 DPT=9105 SEQ=2131438876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB064B430000000001030307) 
Nov 28 09:07:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50369 DF PROTO=TCP SPT=56354 DPT=9101 SEQ=3433932624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0656590000000001030307) 
Nov 28 09:07:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50371 DF PROTO=TCP SPT=56354 DPT=9101 SEQ=3433932624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0662420000000001030307) 
Nov 28 09:07:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6743 DF PROTO=TCP SPT=33562 DPT=9102 SEQ=583773517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB066B420000000001030307) 
Nov 28 09:07:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6552 DF PROTO=TCP SPT=36284 DPT=9100 SEQ=4102555724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0676C30000000001030307) 
Nov 28 09:07:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6745 DF PROTO=TCP SPT=33562 DPT=9102 SEQ=583773517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0683020000000001030307) 
Nov 28 09:07:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6554 DF PROTO=TCP SPT=36284 DPT=9100 SEQ=4102555724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB068E820000000001030307) 
Nov 28 09:07:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=992 DF PROTO=TCP SPT=60428 DPT=9882 SEQ=2180038041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06A1020000000001030307) 
Nov 28 09:07:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:07:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:07:47 np0005538513.localdomain podman[107480]: 2025-11-28 09:07:47.327688304 +0000 UTC m=+0.067709341 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 09:07:47 np0005538513.localdomain podman[107480]: 2025-11-28 09:07:47.33854888 +0000 UTC m=+0.078569847 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:07:47 np0005538513.localdomain podman[107480]: unhealthy
Nov 28 09:07:47 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:47 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:07:47 np0005538513.localdomain podman[107481]: 2025-11-28 09:07:47.379038516 +0000 UTC m=+0.110582851 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 09:07:47 np0005538513.localdomain podman[107481]: 2025-11-28 09:07:47.394355591 +0000 UTC m=+0.125899986 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 09:07:47 np0005538513.localdomain podman[107481]: unhealthy
Nov 28 09:07:47 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:47 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:07:48 np0005538513.localdomain sudo[107520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:07:48 np0005538513.localdomain sudo[107520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:07:48 np0005538513.localdomain sudo[107520]: pam_unix(sudo:session): session closed for user root
Nov 28 09:07:48 np0005538513.localdomain sudo[107535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:07:48 np0005538513.localdomain sudo[107535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:07:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5958 DF PROTO=TCP SPT=50594 DPT=9105 SEQ=961399672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06A8C20000000001030307) 
Nov 28 09:07:49 np0005538513.localdomain sudo[107535]: pam_unix(sudo:session): session closed for user root
Nov 28 09:07:50 np0005538513.localdomain sudo[107580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:07:50 np0005538513.localdomain sudo[107580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:07:50 np0005538513.localdomain sudo[107580]: pam_unix(sudo:session): session closed for user root
Nov 28 09:07:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5959 DF PROTO=TCP SPT=50594 DPT=9105 SEQ=961399672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06B0C20000000001030307) 
Nov 28 09:07:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5960 DF PROTO=TCP SPT=50594 DPT=9105 SEQ=961399672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06C0830000000001030307) 
Nov 28 09:07:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2201 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=2584186812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06CB8A0000000001030307) 
Nov 28 09:08:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2203 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=2584186812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06D7830000000001030307) 
Nov 28 09:08:01 np0005538513.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Nov 28 09:08:01 np0005538513.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 60620 (conmon) with signal SIGKILL.
Nov 28 09:08:01 np0005538513.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Nov 28 09:08:01 np0005538513.localdomain systemd[1]: libpod-conmon-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be.scope: Deactivated successfully.
Nov 28 09:08:01 np0005538513.localdomain podman[107606]: error opening file `/run/crun/8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be/status`: No such file or directory
Nov 28 09:08:01 np0005538513.localdomain podman[107595]: 2025-11-28 09:08:01.325474377 +0000 UTC m=+0.065406069 container cleanup 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:08:01 np0005538513.localdomain podman[107595]: nova_virtlogd_wrapper
Nov 28 09:08:01 np0005538513.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Nov 28 09:08:01 np0005538513.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Nov 28 09:08:01 np0005538513.localdomain sudo[107250]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:01 np0005538513.localdomain sudo[107697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndgwksljaqdamkhlqebrpsipnmjhnirj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320881.4822037-113-2930218503724/AnsiballZ_systemd_service.py
Nov 28 09:08:01 np0005538513.localdomain sudo[107697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:02 np0005538513.localdomain python3.9[107699]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:08:02 np0005538513.localdomain systemd-sysv-generator[107727]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:02 np0005538513.localdomain systemd-rc-local-generator[107723]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: libpod-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope: Deactivated successfully.
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: libpod-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope: Consumed 1.354s CPU time.
Nov 28 09:08:02 np0005538513.localdomain podman[107740]: 2025-11-28 09:08:02.54270545 +0000 UTC m=+0.076584976 container died 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, container_name=nova_virtnodedevd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: tmp-crun.TlXCti.mount: Deactivated successfully.
Nov 28 09:08:02 np0005538513.localdomain podman[107740]: 2025-11-28 09:08:02.59108006 +0000 UTC m=+0.124959556 container cleanup 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtnodedevd, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true)
Nov 28 09:08:02 np0005538513.localdomain podman[107740]: nova_virtnodedevd
Nov 28 09:08:02 np0005538513.localdomain podman[107754]: 2025-11-28 09:08:02.624345001 +0000 UTC m=+0.070472136 container cleanup 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: libpod-conmon-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope: Deactivated successfully.
Nov 28 09:08:02 np0005538513.localdomain podman[107782]: error opening file `/run/crun/6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265/status`: No such file or directory
Nov 28 09:08:02 np0005538513.localdomain podman[107770]: 2025-11-28 09:08:02.712766452 +0000 UTC m=+0.058351039 container cleanup 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, release=1761123044, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:02 np0005538513.localdomain podman[107770]: nova_virtnodedevd
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Nov 28 09:08:02 np0005538513.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Nov 28 09:08:02 np0005538513.localdomain sudo[107697]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:03 np0005538513.localdomain sudo[107873]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvoevldhrlqeumiwmpzhrrucdoknurfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320882.8690097-113-105622563525273/AnsiballZ_systemd_service.py
Nov 28 09:08:03 np0005538513.localdomain sudo[107873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2651 DF PROTO=TCP SPT=58140 DPT=9102 SEQ=2884902030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06E0820000000001030307) 
Nov 28 09:08:03 np0005538513.localdomain python3.9[107875]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:03 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:08:03 np0005538513.localdomain systemd-rc-local-generator[107900]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:03 np0005538513.localdomain systemd-sysv-generator[107903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe-merged.mount: Deactivated successfully.
Nov 28 09:08:03 np0005538513.localdomain systemd[1]: Stopping nova_virtproxyd container...
Nov 28 09:08:03 np0005538513.localdomain systemd[1]: libpod-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50.scope: Deactivated successfully.
Nov 28 09:08:03 np0005538513.localdomain podman[107916]: 2025-11-28 09:08:03.908106867 +0000 UTC m=+0.078760153 container died 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:03 np0005538513.localdomain podman[107916]: 2025-11-28 09:08:03.996180488 +0000 UTC m=+0.166833774 container cleanup 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, release=1761123044, container_name=nova_virtproxyd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 
'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git)
Nov 28 09:08:03 np0005538513.localdomain podman[107916]: nova_virtproxyd
Nov 28 09:08:04 np0005538513.localdomain podman[107931]: 2025-11-28 09:08:04.005745945 +0000 UTC m=+0.090100485 container cleanup 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, tcib_managed=true, config_id=tripleo_step3, container_name=nova_virtproxyd, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git)
Nov 28 09:08:04 np0005538513.localdomain systemd[1]: libpod-conmon-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50.scope: Deactivated successfully.
Nov 28 09:08:04 np0005538513.localdomain podman[107960]: error opening file `/run/crun/76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50/status`: No such file or directory
Nov 28 09:08:04 np0005538513.localdomain podman[107948]: 2025-11-28 09:08:04.084110304 +0000 UTC m=+0.044941444 container cleanup 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt)
Nov 28 09:08:04 np0005538513.localdomain podman[107948]: nova_virtproxyd
Nov 28 09:08:04 np0005538513.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Nov 28 09:08:04 np0005538513.localdomain systemd[1]: Stopped nova_virtproxyd container.
Nov 28 09:08:04 np0005538513.localdomain sudo[107873]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:04 np0005538513.localdomain sudo[108052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdwypknmdhhrczuwulqqmwycokbgegdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320884.191257-113-129795454679985/AnsiballZ_systemd_service.py
Nov 28 09:08:04 np0005538513.localdomain sudo[108052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:04 np0005538513.localdomain python3.9[108054]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976-merged.mount: Deactivated successfully.
Nov 28 09:08:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:05 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:08:05 np0005538513.localdomain systemd-rc-local-generator[108078]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:05 np0005538513.localdomain systemd-sysv-generator[108083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:06 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Nov 28 09:08:06 np0005538513.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Nov 28 09:08:06 np0005538513.localdomain systemd[1]: Stopping nova_virtqemud container...
Nov 28 09:08:06 np0005538513.localdomain systemd[1]: tmp-crun.LESlEN.mount: Deactivated successfully.
Nov 28 09:08:06 np0005538513.localdomain systemd[1]: libpod-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope: Deactivated successfully.
Nov 28 09:08:06 np0005538513.localdomain systemd[1]: libpod-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope: Consumed 2.603s CPU time.
Nov 28 09:08:06 np0005538513.localdomain podman[108095]: 2025-11-28 09:08:06.173984315 +0000 UTC m=+0.071921421 container died 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4)
Nov 28 09:08:06 np0005538513.localdomain podman[108095]: 2025-11-28 09:08:06.204677677 +0000 UTC m=+0.102614753 container cleanup 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, release=1761123044, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1)
Nov 28 09:08:06 np0005538513.localdomain podman[108095]: nova_virtqemud
Nov 28 09:08:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30837 DF PROTO=TCP SPT=47286 DPT=9102 SEQ=2784489092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06EB830000000001030307) 
Nov 28 09:08:06 np0005538513.localdomain podman[108110]: 2025-11-28 09:08:06.249849758 +0000 UTC m=+0.066309808 container cleanup 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, container_name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 28 09:08:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7-merged.mount: Deactivated successfully.
Nov 28 09:08:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39022 DF PROTO=TCP SPT=43748 DPT=9100 SEQ=68847106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06F7820000000001030307) 
Nov 28 09:08:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17415 DF PROTO=TCP SPT=33454 DPT=9100 SEQ=3195106899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0703C30000000001030307) 
Nov 28 09:08:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51068 DF PROTO=TCP SPT=58596 DPT=9882 SEQ=1402631918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0716420000000001030307) 
Nov 28 09:08:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:08:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:08:17 np0005538513.localdomain systemd[1]: tmp-crun.6PTELK.mount: Deactivated successfully.
Nov 28 09:08:17 np0005538513.localdomain podman[108127]: 2025-11-28 09:08:17.835820367 +0000 UTC m=+0.075604805 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 09:08:17 np0005538513.localdomain podman[108128]: 2025-11-28 09:08:17.888942924 +0000 UTC m=+0.125872544 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:08:17 np0005538513.localdomain podman[108127]: 2025-11-28 09:08:17.907414296 +0000 UTC m=+0.147198734 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:08:17 np0005538513.localdomain podman[108127]: unhealthy
Nov 28 09:08:17 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:08:17 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:08:17 np0005538513.localdomain podman[108128]: 2025-11-28 09:08:17.932438372 +0000 UTC m=+0.169367962 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Nov 28 09:08:17 np0005538513.localdomain podman[108128]: unhealthy
Nov 28 09:08:17 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:08:17 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:08:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43567 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB071DC20000000001030307) 
Nov 28 09:08:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43568 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0725C20000000001030307) 
Nov 28 09:08:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43569 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0735820000000001030307) 
Nov 28 09:08:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56153 DF PROTO=TCP SPT=49374 DPT=9101 SEQ=444321622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0740BA0000000001030307) 
Nov 28 09:08:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56155 DF PROTO=TCP SPT=49374 DPT=9101 SEQ=444321622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB074CC20000000001030307) 
Nov 28 09:08:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43570 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0755820000000001030307) 
Nov 28 09:08:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:08:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:08:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40582 DF PROTO=TCP SPT=57634 DPT=9100 SEQ=456275993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0761420000000001030307) 
Nov 28 09:08:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7902 DF PROTO=TCP SPT=40818 DPT=9102 SEQ=535589742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB076D830000000001030307) 
Nov 28 09:08:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:08:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.3 total, 600.0 interval
                                                          Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:08:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40584 DF PROTO=TCP SPT=57634 DPT=9100 SEQ=456275993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0779020000000001030307) 
Nov 28 09:08:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19333 DF PROTO=TCP SPT=41712 DPT=9882 SEQ=3726534465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB078B820000000001030307) 
Nov 28 09:08:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:08:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:08:48 np0005538513.localdomain podman[108168]: 2025-11-28 09:08:48.096202951 +0000 UTC m=+0.078779624 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, 
vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:08:48 np0005538513.localdomain podman[108168]: 2025-11-28 09:08:48.10909167 +0000 UTC m=+0.091668353 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:08:48 np0005538513.localdomain podman[108168]: unhealthy
Nov 28 09:08:48 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:08:48 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:08:48 np0005538513.localdomain systemd[1]: tmp-crun.1uzbzQ.mount: Deactivated successfully.
Nov 28 09:08:48 np0005538513.localdomain podman[108169]: 2025-11-28 09:08:48.157234473 +0000 UTC m=+0.136603546 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:08:48 np0005538513.localdomain podman[108169]: 2025-11-28 09:08:48.175331404 +0000 UTC m=+0.154700467 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:08:48 np0005538513.localdomain podman[108169]: unhealthy
Nov 28 09:08:48 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:08:48 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:08:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35635 DF PROTO=TCP SPT=45508 DPT=9105 SEQ=1666481019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0793020000000001030307) 
Nov 28 09:08:50 np0005538513.localdomain sudo[108209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:08:50 np0005538513.localdomain sudo[108209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:08:50 np0005538513.localdomain sudo[108209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:50 np0005538513.localdomain sudo[108224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:08:50 np0005538513.localdomain sudo[108224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:08:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35636 DF PROTO=TCP SPT=45508 DPT=9105 SEQ=1666481019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB079B020000000001030307) 
Nov 28 09:08:51 np0005538513.localdomain sudo[108224]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:51 np0005538513.localdomain sudo[108270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:08:51 np0005538513.localdomain sudo[108270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:08:51 np0005538513.localdomain sudo[108270]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35637 DF PROTO=TCP SPT=45508 DPT=9105 SEQ=1666481019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07AAC20000000001030307) 
Nov 28 09:08:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34804 DF PROTO=TCP SPT=54386 DPT=9101 SEQ=368064243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07B5E90000000001030307) 
Nov 28 09:09:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34806 DF PROTO=TCP SPT=54386 DPT=9101 SEQ=368064243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07C2020000000001030307) 
Nov 28 09:09:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28986 DF PROTO=TCP SPT=38092 DPT=9102 SEQ=2847201981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07CAC20000000001030307) 
Nov 28 09:09:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14661 DF PROTO=TCP SPT=34100 DPT=9100 SEQ=1491282970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07D6820000000001030307) 
Nov 28 09:09:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17418 DF PROTO=TCP SPT=33454 DPT=9100 SEQ=3195106899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07E1820000000001030307) 
Nov 28 09:09:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14663 DF PROTO=TCP SPT=34100 DPT=9100 SEQ=1491282970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07EE420000000001030307) 
Nov 28 09:09:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65432 DF PROTO=TCP SPT=57836 DPT=9882 SEQ=2076022789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0800C20000000001030307) 
Nov 28 09:09:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:09:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:09:18 np0005538513.localdomain podman[108286]: 2025-11-28 09:09:18.348168525 +0000 UTC m=+0.084724119 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 09:09:18 np0005538513.localdomain systemd[1]: tmp-crun.BazvoZ.mount: Deactivated successfully.
Nov 28 09:09:18 np0005538513.localdomain podman[108285]: 2025-11-28 09:09:18.402983824 +0000 UTC m=+0.139221737 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 09:09:18 np0005538513.localdomain podman[108286]: 2025-11-28 09:09:18.41573732 +0000 UTC m=+0.152292914 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 09:09:18 np0005538513.localdomain podman[108285]: 2025-11-28 09:09:18.422580232 +0000 UTC m=+0.158818145 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Nov 28 09:09:18 np0005538513.localdomain podman[108285]: unhealthy
Nov 28 09:09:18 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:09:18 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 09:09:18 np0005538513.localdomain podman[108286]: unhealthy
Nov 28 09:09:18 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:09:18 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 09:09:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37750 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0808430000000001030307) 
Nov 28 09:09:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37751 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0810430000000001030307) 
Nov 28 09:09:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37752 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0820020000000001030307) 
Nov 28 09:09:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54525 DF PROTO=TCP SPT=60424 DPT=9101 SEQ=2486827943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB082B1A0000000001030307) 
Nov 28 09:09:28 np0005538513.localdomain sshd[108328]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:09:28 np0005538513.localdomain sshd[108328]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 09:09:28 np0005538513.localdomain sshd[108328]: Connection closed by 80.94.92.182 port 37700
Nov 28 09:09:30 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Nov 28 09:09:30 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud.service: Killing process 61393 (conmon) with signal SIGKILL.
Nov 28 09:09:30 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Nov 28 09:09:30 np0005538513.localdomain systemd[1]: libpod-conmon-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope: Deactivated successfully.
Nov 28 09:09:30 np0005538513.localdomain podman[108340]: error opening file `/run/crun/60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057/status`: No such file or directory
Nov 28 09:09:30 np0005538513.localdomain podman[108329]: 2025-11-28 09:09:30.330888475 +0000 UTC m=+0.063918543 container cleanup 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3)
Nov 28 09:09:30 np0005538513.localdomain podman[108329]: nova_virtqemud
Nov 28 09:09:30 np0005538513.localdomain systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Nov 28 09:09:30 np0005538513.localdomain systemd[1]: Stopped nova_virtqemud container.
Nov 28 09:09:30 np0005538513.localdomain sudo[108052]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:30 np0005538513.localdomain sudo[108431]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkoryixjpsohgipbmqtenjaakwkvjmwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320970.5101824-113-148939210468197/AnsiballZ_systemd_service.py
Nov 28 09:09:30 np0005538513.localdomain sudo[108431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:09:31 np0005538513.localdomain python3.9[108433]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:09:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54527 DF PROTO=TCP SPT=60424 DPT=9101 SEQ=2486827943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0837420000000001030307) 
Nov 28 09:09:32 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:09:32 np0005538513.localdomain systemd-rc-local-generator[108461]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:09:32 np0005538513.localdomain systemd-sysv-generator[108467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:09:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:09:32 np0005538513.localdomain sudo[108431]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:32 np0005538513.localdomain sudo[108562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdtulwrbipthouimlcwbjrudqivrqcci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320972.6233108-113-254351714095418/AnsiballZ_systemd_service.py
Nov 28 09:09:32 np0005538513.localdomain sudo[108562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:09:33 np0005538513.localdomain python3.9[108564]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:09:33 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:09:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37753 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB083F820000000001030307) 
Nov 28 09:09:33 np0005538513.localdomain systemd-rc-local-generator[108588]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:09:33 np0005538513.localdomain systemd-sysv-generator[108593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:09:33 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:09:33 np0005538513.localdomain systemd[1]: Stopping nova_virtsecretd container...
Nov 28 09:09:33 np0005538513.localdomain systemd[1]: libpod-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76.scope: Deactivated successfully.
Nov 28 09:09:33 np0005538513.localdomain podman[108604]: 2025-11-28 09:09:33.634088529 +0000 UTC m=+0.074453179 container died 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtsecretd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:09:33 np0005538513.localdomain podman[108604]: 2025-11-28 09:09:33.673130529 +0000 UTC m=+0.113495149 container cleanup 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team)
Nov 28 09:09:33 np0005538513.localdomain podman[108604]: nova_virtsecretd
Nov 28 09:09:33 np0005538513.localdomain podman[108617]: 2025-11-28 09:09:33.714370678 +0000 UTC m=+0.067633908 container cleanup 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_virtsecretd, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Nov 28 09:09:33 np0005538513.localdomain systemd[1]: libpod-conmon-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76.scope: Deactivated successfully.
Nov 28 09:09:33 np0005538513.localdomain podman[108645]: error opening file `/run/crun/2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76/status`: No such file or directory
Nov 28 09:09:33 np0005538513.localdomain podman[108634]: 2025-11-28 09:09:33.829673983 +0000 UTC m=+0.075336867 container cleanup 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:09:33 np0005538513.localdomain podman[108634]: nova_virtsecretd
Nov 28 09:09:33 np0005538513.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Nov 28 09:09:33 np0005538513.localdomain systemd[1]: Stopped nova_virtsecretd container.
Nov 28 09:09:33 np0005538513.localdomain sudo[108562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:34 np0005538513.localdomain sudo[108738]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hegomyylxdpokdndskfforgvpxvoqnrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320973.999669-113-17214900719052/AnsiballZ_systemd_service.py
Nov 28 09:09:34 np0005538513.localdomain sudo[108738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:09:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4-merged.mount: Deactivated successfully.
Nov 28 09:09:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76-userdata-shm.mount: Deactivated successfully.
Nov 28 09:09:34 np0005538513.localdomain python3.9[108740]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:09:34 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:09:34 np0005538513.localdomain systemd-rc-local-generator[108768]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:09:34 np0005538513.localdomain systemd-sysv-generator[108772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: Stopping nova_virtstoraged container...
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: libpod-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951.scope: Deactivated successfully.
Nov 28 09:09:35 np0005538513.localdomain podman[108781]: 2025-11-28 09:09:35.243776511 +0000 UTC m=+0.059770835 container died 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtstoraged, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12)
Nov 28 09:09:35 np0005538513.localdomain podman[108781]: 2025-11-28 09:09:35.289057685 +0000 UTC m=+0.105051999 container cleanup 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, container_name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:09:35 np0005538513.localdomain podman[108781]: nova_virtstoraged
Nov 28 09:09:35 np0005538513.localdomain podman[108795]: 2025-11-28 09:09:35.333577596 +0000 UTC m=+0.076770513 container cleanup 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z)
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: libpod-conmon-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951.scope: Deactivated successfully.
Nov 28 09:09:35 np0005538513.localdomain podman[108825]: error opening file `/run/crun/635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951/status`: No such file or directory
Nov 28 09:09:35 np0005538513.localdomain podman[108813]: 2025-11-28 09:09:35.431608675 +0000 UTC m=+0.065204652 container cleanup 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_virtstoraged)
Nov 28 09:09:35 np0005538513.localdomain podman[108813]: nova_virtstoraged
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: Stopped nova_virtstoraged container.
Nov 28 09:09:35 np0005538513.localdomain sudo[108738]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: tmp-crun.xpCcLo.mount: Deactivated successfully.
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4-merged.mount: Deactivated successfully.
Nov 28 09:09:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951-userdata-shm.mount: Deactivated successfully.
Nov 28 09:09:35 np0005538513.localdomain sudo[108916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elvmzrlcabzatvtzjsjapctthylneiil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320975.5872188-113-62956662816705/AnsiballZ_systemd_service.py
Nov 28 09:09:35 np0005538513.localdomain sudo[108916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:09:36 np0005538513.localdomain python3.9[108918]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:09:36 np0005538513.localdomain systemd-sysv-generator[108949]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:09:36 np0005538513.localdomain systemd-rc-local-generator[108942]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:09:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7905 DF PROTO=TCP SPT=40818 DPT=9102 SEQ=535589742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB084B820000000001030307) 
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: Stopping ovn_controller container...
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: libpod-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope: Deactivated successfully.
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: libpod-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope: Consumed 2.618s CPU time.
Nov 28 09:09:36 np0005538513.localdomain podman[108959]: 2025-11-28 09:09:36.628936991 +0000 UTC m=+0.076541424 container died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 
ovn-controller, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: Deactivated successfully.
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: No such file or directory
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e-userdata-shm.mount: Deactivated successfully.
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7-merged.mount: Deactivated successfully.
Nov 28 09:09:36 np0005538513.localdomain podman[108959]: 2025-11-28 09:09:36.674815924 +0000 UTC m=+0.122420337 container cleanup 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc.)
Nov 28 09:09:36 np0005538513.localdomain podman[108959]: ovn_controller
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: No such file or directory
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: No such file or directory
Nov 28 09:09:36 np0005538513.localdomain podman[108971]: 2025-11-28 09:09:36.724531735 +0000 UTC m=+0.080975392 container cleanup 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1761123044, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: libpod-conmon-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope: Deactivated successfully.
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: No such file or directory
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: No such file or directory
Nov 28 09:09:36 np0005538513.localdomain podman[108986]: 2025-11-28 09:09:36.819538821 +0000 UTC m=+0.065387999 container cleanup 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:09:36 np0005538513.localdomain podman[108986]: ovn_controller
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Nov 28 09:09:36 np0005538513.localdomain systemd[1]: Stopped ovn_controller container.
Nov 28 09:09:36 np0005538513.localdomain sudo[108916]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:37 np0005538513.localdomain sudo[109087]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wewpmpitejncazwdmkunnhpjwhxqglsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320976.9934945-113-3394327830110/AnsiballZ_systemd_service.py
Nov 28 09:09:37 np0005538513.localdomain sudo[109087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:09:37 np0005538513.localdomain python3.9[109089]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:09:37 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:09:37 np0005538513.localdomain systemd-rc-local-generator[109113]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:09:37 np0005538513.localdomain systemd-sysv-generator[109119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:09:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:09:37 np0005538513.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Nov 28 09:09:38 np0005538513.localdomain systemd[1]: libpod-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope: Deactivated successfully.
Nov 28 09:09:38 np0005538513.localdomain systemd[1]: libpod-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope: Consumed 10.934s CPU time.
Nov 28 09:09:38 np0005538513.localdomain podman[109130]: 2025-11-28 09:09:38.966098781 +0000 UTC m=+1.018096261 container died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, 
config_id=tripleo_step4, release=1761123044, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Nov 28 09:09:38 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: Deactivated successfully.
Nov 28 09:09:38 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 09:09:38 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: No such file or directory
Nov 28 09:09:39 np0005538513.localdomain systemd[1]: tmp-crun.7iLb7H.mount: Deactivated successfully.
Nov 28 09:09:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633-userdata-shm.mount: Deactivated successfully.
Nov 28 09:09:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d-merged.mount: Deactivated successfully.
Nov 28 09:09:39 np0005538513.localdomain podman[109130]: 2025-11-28 09:09:39.044996367 +0000 UTC m=+1.096993797 container cleanup 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:09:39 np0005538513.localdomain podman[109130]: ovn_metadata_agent
Nov 28 09:09:39 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: No such file or directory
Nov 28 09:09:39 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: No such file or directory
Nov 28 09:09:39 np0005538513.localdomain podman[109142]: 2025-11-28 09:09:39.067818924 +0000 UTC m=+0.093531231 container cleanup 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:09:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40587 DF PROTO=TCP SPT=57634 DPT=9100 SEQ=456275993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0857820000000001030307) 
Nov 28 09:09:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36018 DF PROTO=TCP SPT=57636 DPT=9100 SEQ=2153070501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0863420000000001030307) 
Nov 28 09:09:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64177 DF PROTO=TCP SPT=51402 DPT=9882 SEQ=2193097018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0875C20000000001030307) 
Nov 28 09:09:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55959 DF PROTO=TCP SPT=41712 DPT=9105 SEQ=4262444411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08795B0000000001030307) 
Nov 28 09:09:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55961 DF PROTO=TCP SPT=41712 DPT=9105 SEQ=4262444411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0885820000000001030307) 
Nov 28 09:09:52 np0005538513.localdomain sudo[109162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:09:52 np0005538513.localdomain sudo[109162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:09:52 np0005538513.localdomain sudo[109162]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:52 np0005538513.localdomain sudo[109177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:09:52 np0005538513.localdomain sudo[109177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:09:52 np0005538513.localdomain sudo[109177]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55962 DF PROTO=TCP SPT=41712 DPT=9105 SEQ=4262444411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0895430000000001030307) 
Nov 28 09:09:55 np0005538513.localdomain sudo[109224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:09:55 np0005538513.localdomain sudo[109224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:09:55 np0005538513.localdomain sudo[109224]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58888 DF PROTO=TCP SPT=52048 DPT=9101 SEQ=3011082735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08A04A0000000001030307) 
Nov 28 09:10:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58890 DF PROTO=TCP SPT=52048 DPT=9101 SEQ=3011082735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08AC420000000001030307) 
Nov 28 09:10:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23435 DF PROTO=TCP SPT=57294 DPT=9102 SEQ=2423972304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08B5420000000001030307) 
Nov 28 09:10:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47641 DF PROTO=TCP SPT=56056 DPT=9100 SEQ=274390103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08C0C20000000001030307) 
Nov 28 09:10:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23437 DF PROTO=TCP SPT=57294 DPT=9102 SEQ=2423972304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08CD020000000001030307) 
Nov 28 09:10:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47643 DF PROTO=TCP SPT=56056 DPT=9100 SEQ=274390103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08D8820000000001030307) 
Nov 28 09:10:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36413 DF PROTO=TCP SPT=35634 DPT=9882 SEQ=1664321515 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08EB020000000001030307) 
Nov 28 09:10:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55519 DF PROTO=TCP SPT=34028 DPT=9105 SEQ=454237546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08F2820000000001030307) 
Nov 28 09:10:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55520 DF PROTO=TCP SPT=34028 DPT=9105 SEQ=454237546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08FA820000000001030307) 
Nov 28 09:10:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55521 DF PROTO=TCP SPT=34028 DPT=9105 SEQ=454237546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB090A430000000001030307) 
Nov 28 09:10:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53353 DF PROTO=TCP SPT=43420 DPT=9101 SEQ=2773884380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09157A0000000001030307) 
Nov 28 09:10:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53355 DF PROTO=TCP SPT=43420 DPT=9101 SEQ=2773884380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0921820000000001030307) 
Nov 28 09:10:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9006 DF PROTO=TCP SPT=51120 DPT=9102 SEQ=239423527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB092A820000000001030307) 
Nov 28 09:10:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58283 DF PROTO=TCP SPT=59492 DPT=9102 SEQ=2294016223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0935820000000001030307) 
Nov 28 09:10:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36021 DF PROTO=TCP SPT=57636 DPT=9100 SEQ=2153070501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0941820000000001030307) 
Nov 28 09:10:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7253 DF PROTO=TCP SPT=49748 DPT=9100 SEQ=2026696104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB094DC30000000001030307) 
Nov 28 09:10:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47504 DF PROTO=TCP SPT=52916 DPT=9882 SEQ=35957365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0960420000000001030307) 
Nov 28 09:10:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48678 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0963BB0000000001030307) 
Nov 28 09:10:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48680 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB096FC20000000001030307) 
Nov 28 09:10:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48681 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB097F820000000001030307) 
Nov 28 09:10:55 np0005538513.localdomain sudo[109239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:10:55 np0005538513.localdomain sudo[109239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:55 np0005538513.localdomain sudo[109239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:55 np0005538513.localdomain sudo[109254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:10:55 np0005538513.localdomain sudo[109254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:56 np0005538513.localdomain sudo[109254]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:56 np0005538513.localdomain sudo[109289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:10:56 np0005538513.localdomain sudo[109289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:56 np0005538513.localdomain sudo[109289]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:56 np0005538513.localdomain sudo[109304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:10:56 np0005538513.localdomain sudo[109304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:57 np0005538513.localdomain sudo[109304]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:57 np0005538513.localdomain sudo[109353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:10:57 np0005538513.localdomain sudo[109353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:57 np0005538513.localdomain sudo[109353]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46832 DF PROTO=TCP SPT=59510 DPT=9101 SEQ=4033662674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB098AA90000000001030307) 
Nov 28 09:11:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46834 DF PROTO=TCP SPT=59510 DPT=9101 SEQ=4033662674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0996C20000000001030307) 
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 70684 (conmon) with signal SIGKILL.
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: libpod-conmon-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope: Deactivated successfully.
Nov 28 09:11:03 np0005538513.localdomain podman[109380]: error opening file `/run/crun/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633/status`: No such file or directory
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: No such file or directory
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: No such file or directory
Nov 28 09:11:03 np0005538513.localdomain podman[109368]: 2025-11-28 09:11:03.356661159 +0000 UTC m=+0.086092211 container cleanup 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vcs-type=git, distribution-scope=public, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:11:03 np0005538513.localdomain podman[109368]: ovn_metadata_agent
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Nov 28 09:11:03 np0005538513.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Nov 28 09:11:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48682 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB099F820000000001030307) 
Nov 28 09:11:03 np0005538513.localdomain sudo[109087]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:03 np0005538513.localdomain sudo[109471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-texpsntjpsmfrljqnmwodwjseerpjtvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321063.5278127-113-57121939277607/AnsiballZ_systemd_service.py
Nov 28 09:11:03 np0005538513.localdomain sudo[109471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:04 np0005538513.localdomain python3.9[109473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:11:05 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:11:05 np0005538513.localdomain systemd-rc-local-generator[109499]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:11:05 np0005538513.localdomain systemd-sysv-generator[109505]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:11:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:11:05 np0005538513.localdomain sudo[109471]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49224 DF PROTO=TCP SPT=44206 DPT=9100 SEQ=2623352065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09AB420000000001030307) 
Nov 28 09:11:06 np0005538513.localdomain sudo[109601]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rprysiwphganvqjooqneihfxipkvqzdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321066.5085013-563-265223612284460/AnsiballZ_file.py
Nov 28 09:11:06 np0005538513.localdomain sudo[109601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:07 np0005538513.localdomain python3.9[109603]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:07 np0005538513.localdomain sudo[109601]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:07 np0005538513.localdomain sudo[109693]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyynttlgtoiybmrgsodpipovkzpaeycd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321067.3270743-563-82895364284466/AnsiballZ_file.py
Nov 28 09:11:07 np0005538513.localdomain sudo[109693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:07 np0005538513.localdomain python3.9[109695]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:07 np0005538513.localdomain sudo[109693]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:08 np0005538513.localdomain sudo[109785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-varuxksgxlmnpjfsvsgjsafvtgcjpjyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321067.9674268-563-185444063203798/AnsiballZ_file.py
Nov 28 09:11:08 np0005538513.localdomain sudo[109785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:08 np0005538513.localdomain python3.9[109787]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:08 np0005538513.localdomain sudo[109785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:08 np0005538513.localdomain sudo[109877]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icuwaoshxxxnfgcdjbjvqltmppvrbrqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321068.6117797-563-53218354933023/AnsiballZ_file.py
Nov 28 09:11:08 np0005538513.localdomain sudo[109877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:09 np0005538513.localdomain python3.9[109879]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:09 np0005538513.localdomain sudo[109877]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:09 np0005538513.localdomain sudo[109969]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhuhfrywruauctlnphpvdshutjtpztpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321069.1805592-563-243301185221306/AnsiballZ_file.py
Nov 28 09:11:09 np0005538513.localdomain sudo[109969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31420 DF PROTO=TCP SPT=49396 DPT=9102 SEQ=1959724451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09B7420000000001030307) 
Nov 28 09:11:09 np0005538513.localdomain python3.9[109971]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:09 np0005538513.localdomain sudo[109969]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:10 np0005538513.localdomain sudo[110061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwiiwyhumuuxutgbfyxljfrbclhdvsuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321069.7847013-563-150120801529133/AnsiballZ_file.py
Nov 28 09:11:10 np0005538513.localdomain sudo[110061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:10 np0005538513.localdomain python3.9[110063]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:10 np0005538513.localdomain sudo[110061]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:10 np0005538513.localdomain sudo[110153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yymfwwslvihvcribqplfgydfglliigrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321070.3895683-563-49029110560216/AnsiballZ_file.py
Nov 28 09:11:10 np0005538513.localdomain sudo[110153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:10 np0005538513.localdomain python3.9[110155]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:10 np0005538513.localdomain sudo[110153]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:11 np0005538513.localdomain sudo[110245]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrhiiocixcgakrcpypkehbeotiqpqsig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321070.9865878-563-99921399086064/AnsiballZ_file.py
Nov 28 09:11:11 np0005538513.localdomain sudo[110245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:11 np0005538513.localdomain python3.9[110247]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:11 np0005538513.localdomain sudo[110245]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:11 np0005538513.localdomain sudo[110337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obdcztkhmupmflzyighboufnbheemivm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321071.5824513-563-103717495346816/AnsiballZ_file.py
Nov 28 09:11:11 np0005538513.localdomain sudo[110337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:12 np0005538513.localdomain python3.9[110339]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:12 np0005538513.localdomain sudo[110337]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47506 DF PROTO=TCP SPT=52916 DPT=9882 SEQ=35957365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09C1820000000001030307) 
Nov 28 09:11:12 np0005538513.localdomain sudo[110429]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyheetykafxzjsvwklobuhcyhruxcssk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321072.143893-563-108529198040035/AnsiballZ_file.py
Nov 28 09:11:12 np0005538513.localdomain sudo[110429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:12 np0005538513.localdomain python3.9[110431]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:12 np0005538513.localdomain sudo[110429]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:13 np0005538513.localdomain sudo[110521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yubbnxegetpretdtcdbxzocerjfosomr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321072.7541132-563-35811566540822/AnsiballZ_file.py
Nov 28 09:11:13 np0005538513.localdomain sudo[110521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:13 np0005538513.localdomain python3.9[110523]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:13 np0005538513.localdomain sudo[110521]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:13 np0005538513.localdomain sudo[110613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsdzokhyqmiwgpcnhifqbhclpdipbsia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321073.353809-563-153603013299610/AnsiballZ_file.py
Nov 28 09:11:13 np0005538513.localdomain sudo[110613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:13 np0005538513.localdomain python3.9[110615]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:13 np0005538513.localdomain sudo[110613]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:14 np0005538513.localdomain sudo[110705]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnrugdokajeymbjwwthtjfasqswxcede ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321073.974908-563-31314075183574/AnsiballZ_file.py
Nov 28 09:11:14 np0005538513.localdomain sudo[110705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:14 np0005538513.localdomain python3.9[110707]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:14 np0005538513.localdomain sudo[110705]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:14 np0005538513.localdomain sudo[110797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzlsvmtprhmopwwjezylrhpfqnmzdnzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321074.572427-563-19816021918336/AnsiballZ_file.py
Nov 28 09:11:14 np0005538513.localdomain sudo[110797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:15 np0005538513.localdomain python3.9[110799]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:15 np0005538513.localdomain sudo[110797]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:15 np0005538513.localdomain sudo[110889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyzzwdhssbzcrvvccfazvmcqrbuhzsic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321075.154718-563-216689515737730/AnsiballZ_file.py
Nov 28 09:11:15 np0005538513.localdomain sudo[110889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:15 np0005538513.localdomain python3.9[110891]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:15 np0005538513.localdomain sudo[110889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:15 np0005538513.localdomain sudo[110981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkjzliflhjhlcbbmuwweyqkeribkbisv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321075.7262957-563-36105169424701/AnsiballZ_file.py
Nov 28 09:11:15 np0005538513.localdomain sudo[110981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:16 np0005538513.localdomain python3.9[110983]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:16 np0005538513.localdomain sudo[110981]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:16 np0005538513.localdomain sudo[111073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fopbyvwqkqnphpfwdqldivmupdelhitb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321076.321268-563-111723656643634/AnsiballZ_file.py
Nov 28 09:11:16 np0005538513.localdomain sudo[111073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:16 np0005538513.localdomain python3.9[111075]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:16 np0005538513.localdomain sudo[111073]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:17 np0005538513.localdomain sudo[111165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syxjqaputpanufdcyjcfwmotfnhicndg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321076.9482172-563-17840463687969/AnsiballZ_file.py
Nov 28 09:11:17 np0005538513.localdomain sudo[111165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45121 DF PROTO=TCP SPT=51856 DPT=9882 SEQ=2081403544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09D5820000000001030307) 
Nov 28 09:11:17 np0005538513.localdomain python3.9[111167]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:17 np0005538513.localdomain sudo[111165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:17 np0005538513.localdomain sudo[111257]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-httfscpsfbntksiulacsbbzksvltjlpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321077.5211399-563-21953729934278/AnsiballZ_file.py
Nov 28 09:11:17 np0005538513.localdomain sudo[111257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:17 np0005538513.localdomain python3.9[111259]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:17 np0005538513.localdomain sudo[111257]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:18 np0005538513.localdomain sudo[111349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdgyjcpwzfvxsrkgsrhbxtuaasfvcmvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321078.1073055-563-243492800509182/AnsiballZ_file.py
Nov 28 09:11:18 np0005538513.localdomain sudo[111349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:18 np0005538513.localdomain python3.9[111351]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:18 np0005538513.localdomain sudo[111349]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:18 np0005538513.localdomain sudo[111441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdbrwyrfmfqhtnmappcobkaczxhvkgju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321078.6900537-563-80844089988805/AnsiballZ_file.py
Nov 28 09:11:18 np0005538513.localdomain sudo[111441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:19 np0005538513.localdomain python3.9[111443]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57945 DF PROTO=TCP SPT=41722 DPT=9105 SEQ=2884774350 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09DD020000000001030307) 
Nov 28 09:11:19 np0005538513.localdomain sudo[111441]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:20 np0005538513.localdomain sudo[111533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltokxqcnnbzjpmwpdecrwrqzvozocytm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321080.1240613-1013-237544454731786/AnsiballZ_file.py
Nov 28 09:11:20 np0005538513.localdomain sudo[111533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:20 np0005538513.localdomain python3.9[111535]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:20 np0005538513.localdomain sudo[111533]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:20 np0005538513.localdomain sudo[111625]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etozrdqlafknpmbzgaleeofnreylkblq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321080.7186303-1013-209163179792705/AnsiballZ_file.py
Nov 28 09:11:20 np0005538513.localdomain sudo[111625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:21 np0005538513.localdomain python3.9[111627]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:21 np0005538513.localdomain sudo[111625]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57946 DF PROTO=TCP SPT=41722 DPT=9105 SEQ=2884774350 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09E5020000000001030307) 
Nov 28 09:11:21 np0005538513.localdomain sudo[111717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qukxfjomeybzpvhmsxcogmajvftxukwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321081.267967-1013-214345052887253/AnsiballZ_file.py
Nov 28 09:11:21 np0005538513.localdomain sudo[111717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:21 np0005538513.localdomain python3.9[111719]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:21 np0005538513.localdomain sudo[111717]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:22 np0005538513.localdomain sudo[111809]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfgtvudllqlkivuptqnrrknqkmkhsoch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321081.8137596-1013-232959751813776/AnsiballZ_file.py
Nov 28 09:11:22 np0005538513.localdomain sudo[111809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:22 np0005538513.localdomain python3.9[111811]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:22 np0005538513.localdomain sudo[111809]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:22 np0005538513.localdomain sudo[111901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odgsphydluuibmaadremahavtmuifihe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321082.4048438-1013-15262833362428/AnsiballZ_file.py
Nov 28 09:11:22 np0005538513.localdomain sudo[111901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:22 np0005538513.localdomain python3.9[111903]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:22 np0005538513.localdomain sudo[111901]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:23 np0005538513.localdomain sudo[111993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trczcmwekxlhipaigvkmdxpdfvxlhlfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321082.9778771-1013-257655091500542/AnsiballZ_file.py
Nov 28 09:11:23 np0005538513.localdomain sudo[111993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:23 np0005538513.localdomain python3.9[111995]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:23 np0005538513.localdomain sudo[111993]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:23 np0005538513.localdomain sudo[112085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnrpwakikgkyijmbsszkbjqriuflmpyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321083.5845456-1013-74662679309435/AnsiballZ_file.py
Nov 28 09:11:23 np0005538513.localdomain sudo[112085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:24 np0005538513.localdomain python3.9[112087]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:24 np0005538513.localdomain sudo[112085]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:24 np0005538513.localdomain sudo[112177]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvncealynbbagjhjuufdikjxizwmewto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321084.1952307-1013-130412490788031/AnsiballZ_file.py
Nov 28 09:11:24 np0005538513.localdomain sudo[112177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:24 np0005538513.localdomain python3.9[112179]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:24 np0005538513.localdomain sudo[112177]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:25 np0005538513.localdomain sudo[112269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-papxkhfhumfjgktraztkkpuechoeiykq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321084.9537506-1013-217561968975882/AnsiballZ_file.py
Nov 28 09:11:25 np0005538513.localdomain sudo[112269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57947 DF PROTO=TCP SPT=41722 DPT=9105 SEQ=2884774350 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09F4C30000000001030307) 
Nov 28 09:11:25 np0005538513.localdomain python3.9[112271]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:25 np0005538513.localdomain sudo[112269]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:25 np0005538513.localdomain sudo[112361]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsekgbqolobmcddojxzsicbfawohrizy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321085.504641-1013-184869728549547/AnsiballZ_file.py
Nov 28 09:11:25 np0005538513.localdomain sudo[112361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:25 np0005538513.localdomain python3.9[112363]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:25 np0005538513.localdomain sudo[112361]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:26 np0005538513.localdomain sudo[112453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhdlablzzoswomqxmfpooqchcscoirwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321086.0726066-1013-156642116038949/AnsiballZ_file.py
Nov 28 09:11:26 np0005538513.localdomain sudo[112453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:26 np0005538513.localdomain python3.9[112455]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:26 np0005538513.localdomain sudo[112453]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:26 np0005538513.localdomain sudo[112545]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzgmkuaemwimvfdxduwouyhmnxkaltyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321086.6385334-1013-127690617225694/AnsiballZ_file.py
Nov 28 09:11:26 np0005538513.localdomain sudo[112545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:27 np0005538513.localdomain python3.9[112547]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:27 np0005538513.localdomain sudo[112545]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:27 np0005538513.localdomain sudo[112637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjdlzevpbumaxwotuoejuieoabwbgqmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321087.1953084-1013-229672626298175/AnsiballZ_file.py
Nov 28 09:11:27 np0005538513.localdomain sudo[112637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:27 np0005538513.localdomain python3.9[112639]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:27 np0005538513.localdomain sudo[112637]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:28 np0005538513.localdomain sudo[112729]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czpzvblapcqkiqasreawvnttemiuiiva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321087.7543318-1013-80923663609195/AnsiballZ_file.py
Nov 28 09:11:28 np0005538513.localdomain sudo[112729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31253 DF PROTO=TCP SPT=52284 DPT=9101 SEQ=1031790436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09FFDA0000000001030307) 
Nov 28 09:11:28 np0005538513.localdomain python3.9[112731]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:28 np0005538513.localdomain sudo[112729]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:28 np0005538513.localdomain sudo[112821]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cepqmqfztqjxavzbraknwhhcrzxbzlom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321088.3259516-1013-12532666721531/AnsiballZ_file.py
Nov 28 09:11:28 np0005538513.localdomain sudo[112821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:28 np0005538513.localdomain python3.9[112823]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:28 np0005538513.localdomain sudo[112821]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:29 np0005538513.localdomain sudo[112913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mghchdxvgrshkrgjqradbjvfuxxmcptw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321088.905519-1013-103537683696563/AnsiballZ_file.py
Nov 28 09:11:29 np0005538513.localdomain sudo[112913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:29 np0005538513.localdomain python3.9[112915]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:29 np0005538513.localdomain sudo[112913]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:29 np0005538513.localdomain sudo[113005]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqocehsgjvwygwcpkmxtrqvkduyoosum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321089.4893532-1013-208371462120666/AnsiballZ_file.py
Nov 28 09:11:29 np0005538513.localdomain sudo[113005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:29 np0005538513.localdomain python3.9[113007]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:29 np0005538513.localdomain sudo[113005]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:30 np0005538513.localdomain sudo[113097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvakuhzyabntnnljuecklnxwbrngawgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321090.0858698-1013-116172577228253/AnsiballZ_file.py
Nov 28 09:11:30 np0005538513.localdomain sudo[113097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:30 np0005538513.localdomain python3.9[113099]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:30 np0005538513.localdomain sudo[113097]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:31 np0005538513.localdomain sudo[113189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urnajlaomjcrcjzmrujeurwyabjamvah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321090.9059694-1013-21391298172884/AnsiballZ_file.py
Nov 28 09:11:31 np0005538513.localdomain sudo[113189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31255 DF PROTO=TCP SPT=52284 DPT=9101 SEQ=1031790436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A0C020000000001030307) 
Nov 28 09:11:31 np0005538513.localdomain python3.9[113191]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:31 np0005538513.localdomain sudo[113189]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:31 np0005538513.localdomain sudo[113281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubqfcfxyghztviadspmyubjmhvjwwfsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321091.5007772-1013-256777292749914/AnsiballZ_file.py
Nov 28 09:11:31 np0005538513.localdomain sudo[113281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:31 np0005538513.localdomain python3.9[113283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:31 np0005538513.localdomain sudo[113281]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:32 np0005538513.localdomain sudo[113373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmjaqbrqkpgxksqfnxwlfsgkwgmgvhsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321092.0963113-1013-181473301779678/AnsiballZ_file.py
Nov 28 09:11:32 np0005538513.localdomain sudo[113373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:32 np0005538513.localdomain python3.9[113375]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:32 np0005538513.localdomain sudo[113373]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12806 DF PROTO=TCP SPT=59416 DPT=9102 SEQ=1390093034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A14C30000000001030307) 
Nov 28 09:11:33 np0005538513.localdomain sudo[113465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jizwbmncnbdwtwaajzoocsrmkkybbahr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321093.5749393-1460-40812949602499/AnsiballZ_command.py
Nov 28 09:11:33 np0005538513.localdomain sudo[113465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:34 np0005538513.localdomain python3.9[113467]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:34 np0005538513.localdomain sudo[113465]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:35 np0005538513.localdomain python3.9[113559]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:11:35 np0005538513.localdomain sudo[113649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utjbcrjwpbclftwopcvwngjgralylbph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321095.3472204-1514-227486834680893/AnsiballZ_systemd_service.py
Nov 28 09:11:35 np0005538513.localdomain sudo[113649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:35 np0005538513.localdomain python3.9[113651]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:11:35 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:11:36 np0005538513.localdomain systemd-rc-local-generator[113677]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:11:36 np0005538513.localdomain systemd-sysv-generator[113682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:11:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:11:36 np0005538513.localdomain sudo[113649]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13450 DF PROTO=TCP SPT=43218 DPT=9100 SEQ=3957548637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A20420000000001030307) 
Nov 28 09:11:36 np0005538513.localdomain sudo[113777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbeqispqxmnzzovphksyttquqdxcyqyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321096.4323142-1538-60984529104307/AnsiballZ_command.py
Nov 28 09:11:36 np0005538513.localdomain sudo[113777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:36 np0005538513.localdomain python3.9[113779]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:36 np0005538513.localdomain sudo[113777]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:37 np0005538513.localdomain sudo[113870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxebtkjnohttrspowyxjhscekvsppkbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321097.0447917-1538-47285365930273/AnsiballZ_command.py
Nov 28 09:11:37 np0005538513.localdomain sudo[113870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:37 np0005538513.localdomain python3.9[113872]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:37 np0005538513.localdomain sudo[113870]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:38 np0005538513.localdomain sudo[113963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjdvtfplymxegjjtwcgpeyvqcsjfmpnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321097.7723758-1538-257976291273613/AnsiballZ_command.py
Nov 28 09:11:38 np0005538513.localdomain sudo[113963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:38 np0005538513.localdomain python3.9[113965]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:38 np0005538513.localdomain sudo[113963]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:38 np0005538513.localdomain sudo[114056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-noubvvrhqgqaednpcckleetfwbdfpeol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321098.3494997-1538-133696694346163/AnsiballZ_command.py
Nov 28 09:11:38 np0005538513.localdomain sudo[114056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:38 np0005538513.localdomain python3.9[114058]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:38 np0005538513.localdomain sudo[114056]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:39 np0005538513.localdomain sudo[114149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gegijzdkjnelxcfbnipdxurmurgdcucm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321098.9395168-1538-13817182502868/AnsiballZ_command.py
Nov 28 09:11:39 np0005538513.localdomain sudo[114149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7256 DF PROTO=TCP SPT=49748 DPT=9100 SEQ=2026696104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A2B820000000001030307) 
Nov 28 09:11:39 np0005538513.localdomain python3.9[114151]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:39 np0005538513.localdomain sudo[114149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:39 np0005538513.localdomain sudo[114242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhyfeykbpwajnqsodykedxjagxqmkaxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321099.5313227-1538-28383296237650/AnsiballZ_command.py
Nov 28 09:11:39 np0005538513.localdomain sudo[114242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:39 np0005538513.localdomain python3.9[114244]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:40 np0005538513.localdomain sudo[114242]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:40 np0005538513.localdomain sudo[114335]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqbwflpvagixaqqtzzrejlfakftezvlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321100.127769-1538-30770380074327/AnsiballZ_command.py
Nov 28 09:11:40 np0005538513.localdomain sudo[114335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:40 np0005538513.localdomain python3.9[114337]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:40 np0005538513.localdomain sudo[114335]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:41 np0005538513.localdomain sudo[114428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxyexyukqticnewvsmbkpghhvpgiqoka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321100.757972-1538-190076276911738/AnsiballZ_command.py
Nov 28 09:11:41 np0005538513.localdomain sudo[114428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:41 np0005538513.localdomain python3.9[114430]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:41 np0005538513.localdomain sudo[114428]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:41 np0005538513.localdomain sudo[114521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wodhprqrpqmzoyeeuatgfphkoxzjiznt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321101.342526-1538-155835366671321/AnsiballZ_command.py
Nov 28 09:11:41 np0005538513.localdomain sudo[114521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:41 np0005538513.localdomain python3.9[114523]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:41 np0005538513.localdomain sudo[114521]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:42 np0005538513.localdomain sudo[114614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwbrfwjzqwnpezfbmrkxknvzjqcjlfea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321101.924159-1538-192719814266423/AnsiballZ_command.py
Nov 28 09:11:42 np0005538513.localdomain sudo[114614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:42 np0005538513.localdomain python3.9[114616]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13452 DF PROTO=TCP SPT=43218 DPT=9100 SEQ=3957548637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A38020000000001030307) 
Nov 28 09:11:42 np0005538513.localdomain sudo[114614]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:42 np0005538513.localdomain sudo[114707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmostmiswjqyjifhmwjzfjbsvgegwssx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321102.5881772-1538-240104369440688/AnsiballZ_command.py
Nov 28 09:11:42 np0005538513.localdomain sudo[114707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:43 np0005538513.localdomain python3.9[114709]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:43 np0005538513.localdomain sudo[114707]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:43 np0005538513.localdomain sudo[114800]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcosfpnbsmasjiahpzoyuusatobjhrhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321103.3230336-1538-192430407739907/AnsiballZ_command.py
Nov 28 09:11:43 np0005538513.localdomain sudo[114800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:43 np0005538513.localdomain python3.9[114802]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:43 np0005538513.localdomain sudo[114800]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:44 np0005538513.localdomain sudo[114893]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjeidetseleranzyudyvaznhbxxbsvwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321103.8819294-1538-89724136016048/AnsiballZ_command.py
Nov 28 09:11:44 np0005538513.localdomain sudo[114893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:44 np0005538513.localdomain python3.9[114895]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:44 np0005538513.localdomain sudo[114893]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:44 np0005538513.localdomain sudo[114986]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzbrcekscpnemernqdjsayynbyvqcpwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321104.4597096-1538-267481589747001/AnsiballZ_command.py
Nov 28 09:11:44 np0005538513.localdomain sudo[114986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:44 np0005538513.localdomain python3.9[114988]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:44 np0005538513.localdomain sudo[114986]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:45 np0005538513.localdomain sudo[115079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdlpqpyeravcizkrmpbgulfsqibwphvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321105.0231543-1538-5596898712156/AnsiballZ_command.py
Nov 28 09:11:45 np0005538513.localdomain sudo[115079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:45 np0005538513.localdomain python3.9[115081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:45 np0005538513.localdomain sudo[115079]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:45 np0005538513.localdomain sudo[115172]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ispdcixmrolekneuhlqrogsrnojakpri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321105.5892208-1538-272671621947016/AnsiballZ_command.py
Nov 28 09:11:45 np0005538513.localdomain sudo[115172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:46 np0005538513.localdomain python3.9[115174]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:46 np0005538513.localdomain sudo[115172]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:46 np0005538513.localdomain sudo[115265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cixztdwoxkebzmbmkyixowuzjlbgrpfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321106.1557715-1538-261207526494072/AnsiballZ_command.py
Nov 28 09:11:46 np0005538513.localdomain sudo[115265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:46 np0005538513.localdomain python3.9[115267]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6490 DF PROTO=TCP SPT=43228 DPT=9882 SEQ=1533842803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A4A820000000001030307) 
Nov 28 09:11:47 np0005538513.localdomain sudo[115265]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:48 np0005538513.localdomain sudo[115358]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmbdclgpnifwuxinfalasvadjfkklggc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321107.9282887-1538-264369879338528/AnsiballZ_command.py
Nov 28 09:11:48 np0005538513.localdomain sudo[115358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:48 np0005538513.localdomain python3.9[115360]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:48 np0005538513.localdomain sudo[115358]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:48 np0005538513.localdomain sudo[115451]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adgxiyqwijlrrubehbmigvvcanefnesj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321108.5230036-1538-133976039031778/AnsiballZ_command.py
Nov 28 09:11:48 np0005538513.localdomain sudo[115451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:48 np0005538513.localdomain python3.9[115453]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:49 np0005538513.localdomain sudo[115451]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35531 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A52420000000001030307) 
Nov 28 09:11:49 np0005538513.localdomain sudo[115544]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maftwxqxrnopskaphortdfazmywmfwsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321109.1095102-1538-209519669987387/AnsiballZ_command.py
Nov 28 09:11:49 np0005538513.localdomain sudo[115544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:49 np0005538513.localdomain python3.9[115546]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:49 np0005538513.localdomain sudo[115544]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:49 np0005538513.localdomain sudo[115637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulrjjoumfwwjwqvngwzhizxlmorrigtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321109.6830034-1538-89131868931863/AnsiballZ_command.py
Nov 28 09:11:49 np0005538513.localdomain sudo[115637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:50 np0005538513.localdomain python3.9[115639]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:50 np0005538513.localdomain sudo[115637]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35532 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A5A420000000001030307) 
Nov 28 09:11:53 np0005538513.localdomain sshd[103971]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:11:53 np0005538513.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Nov 28 09:11:53 np0005538513.localdomain systemd[1]: session-37.scope: Consumed 47.937s CPU time.
Nov 28 09:11:53 np0005538513.localdomain systemd-logind[764]: Session 37 logged out. Waiting for processes to exit.
Nov 28 09:11:53 np0005538513.localdomain systemd-logind[764]: Removed session 37.
Nov 28 09:11:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35533 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A6A030000000001030307) 
Nov 28 09:11:57 np0005538513.localdomain sudo[115655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:11:57 np0005538513.localdomain sudo[115655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:11:57 np0005538513.localdomain sudo[115655]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:57 np0005538513.localdomain sudo[115670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:11:57 np0005538513.localdomain sudo[115670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:11:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22290 DF PROTO=TCP SPT=51966 DPT=9101 SEQ=3923722619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A75090000000001030307) 
Nov 28 09:11:58 np0005538513.localdomain sudo[115670]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:59 np0005538513.localdomain sudo[115717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:11:59 np0005538513.localdomain sudo[115717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:11:59 np0005538513.localdomain sudo[115717]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22292 DF PROTO=TCP SPT=51966 DPT=9101 SEQ=3923722619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A81020000000001030307) 
Nov 28 09:12:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35534 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A89830000000001030307) 
Nov 28 09:12:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13522 DF PROTO=TCP SPT=60118 DPT=9100 SEQ=1126112828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A95820000000001030307) 
Nov 28 09:12:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49229 DF PROTO=TCP SPT=44206 DPT=9100 SEQ=2623352065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AA1830000000001030307) 
Nov 28 09:12:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13524 DF PROTO=TCP SPT=60118 DPT=9100 SEQ=1126112828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AAD430000000001030307) 
Nov 28 09:12:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29832 DF PROTO=TCP SPT=50094 DPT=9882 SEQ=1137643269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ABFC30000000001030307) 
Nov 28 09:12:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38586 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=2931150111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AC7420000000001030307) 
Nov 28 09:12:20 np0005538513.localdomain sshd[115732]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:12:20 np0005538513.localdomain sshd[115732]: Accepted publickey for zuul from 192.168.122.31 port 56230 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:12:20 np0005538513.localdomain systemd-logind[764]: New session 38 of user zuul.
Nov 28 09:12:20 np0005538513.localdomain systemd[1]: Started Session 38 of User zuul.
Nov 28 09:12:20 np0005538513.localdomain sshd[115732]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:12:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38587 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=2931150111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ACF430000000001030307) 
Nov 28 09:12:21 np0005538513.localdomain python3.9[115825]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 28 09:12:22 np0005538513.localdomain python3.9[115929]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:12:23 np0005538513.localdomain sudo[116019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sagnwsbntgflxiqctdhgnikqxysqpcjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321142.6647704-93-46792553797439/AnsiballZ_command.py
Nov 28 09:12:23 np0005538513.localdomain sudo[116019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:23 np0005538513.localdomain python3.9[116021]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:12:23 np0005538513.localdomain sudo[116019]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:24 np0005538513.localdomain sudo[116112]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvyeqbmdxefgmzgeopdkfgdetnkxusdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321143.6461802-129-71613761074473/AnsiballZ_stat.py
Nov 28 09:12:24 np0005538513.localdomain sudo[116112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:24 np0005538513.localdomain python3.9[116114]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:12:24 np0005538513.localdomain sudo[116112]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:25 np0005538513.localdomain sudo[116204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scwuhzvyyzrzhtncjpurdstbnylrrnir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321144.5588317-153-220283240423371/AnsiballZ_file.py
Nov 28 09:12:25 np0005538513.localdomain sudo[116204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38588 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=2931150111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ADF020000000001030307) 
Nov 28 09:12:25 np0005538513.localdomain python3.9[116206]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:12:25 np0005538513.localdomain sudo[116204]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:25 np0005538513.localdomain sudo[116296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqmnnowqanaufjxerbwnrijbsqyjqxij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321145.4133835-177-143025092198065/AnsiballZ_stat.py
Nov 28 09:12:25 np0005538513.localdomain sudo[116296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:25 np0005538513.localdomain python3.9[116298]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:12:25 np0005538513.localdomain sudo[116296]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:26 np0005538513.localdomain sudo[116369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwjwyldveqmrjfbsmrdsdlrmdibpozgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321145.4133835-177-143025092198065/AnsiballZ_copy.py
Nov 28 09:12:26 np0005538513.localdomain sudo[116369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:26 np0005538513.localdomain python3.9[116371]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321145.4133835-177-143025092198065/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:12:26 np0005538513.localdomain sudo[116369]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:27 np0005538513.localdomain sudo[116461]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqkynzmvcamqazumsqhdadlmrdyaqhco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321146.909894-222-187081869280040/AnsiballZ_setup.py
Nov 28 09:12:27 np0005538513.localdomain sudo[116461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:27 np0005538513.localdomain python3.9[116463]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:12:27 np0005538513.localdomain sudo[116461]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27309 DF PROTO=TCP SPT=38220 DPT=9101 SEQ=842800084 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AEA390000000001030307) 
Nov 28 09:12:28 np0005538513.localdomain sudo[116557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbwgabsaozmdrkmtxtikgeebjbsufnpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321147.9811113-246-65882383831165/AnsiballZ_file.py
Nov 28 09:12:28 np0005538513.localdomain sudo[116557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:28 np0005538513.localdomain python3.9[116559]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:12:28 np0005538513.localdomain sudo[116557]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:28 np0005538513.localdomain sudo[116649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yezjkmiifhmmjrxghkebooghqsboblnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321148.6964803-273-49716187588457/AnsiballZ_file.py
Nov 28 09:12:28 np0005538513.localdomain sudo[116649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:29 np0005538513.localdomain python3.9[116651]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:12:29 np0005538513.localdomain sudo[116649]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:29 np0005538513.localdomain python3.9[116741]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:12:30 np0005538513.localdomain network[116758]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:12:30 np0005538513.localdomain network[116759]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:12:30 np0005538513.localdomain network[116760]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:12:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27311 DF PROTO=TCP SPT=38220 DPT=9101 SEQ=842800084 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AF6430000000001030307) 
Nov 28 09:12:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:12:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6898 DF PROTO=TCP SPT=56826 DPT=9102 SEQ=3374587745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AFF420000000001030307) 
Nov 28 09:12:35 np0005538513.localdomain python3.9[116957]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:12:36 np0005538513.localdomain python3.9[117047]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:12:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7407 DF PROTO=TCP SPT=45554 DPT=9100 SEQ=2959990013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B0AC20000000001030307) 
Nov 28 09:12:36 np0005538513.localdomain sudo[117141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvnxwsbqqutamzkkpsrltkjotsvufyoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321156.568714-375-172043096682745/AnsiballZ_command.py
Nov 28 09:12:36 np0005538513.localdomain sudo[117141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:37 np0005538513.localdomain python3.9[117143]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            # FIXME: perform dnf upgrade for other packages in EDPM ansible
                                                            # here we only ensuring that decontainerized libvirt can start
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:12:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13455 DF PROTO=TCP SPT=43218 DPT=9100 SEQ=3957548637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B15830000000001030307) 
Nov 28 09:12:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7409 DF PROTO=TCP SPT=45554 DPT=9100 SEQ=2959990013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B22820000000001030307) 
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 09:12:46 np0005538513.localdomain sshd[44969]: Received signal 15; terminating.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: sshd.service: Consumed 1.441s CPU time.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 09:12:46 np0005538513.localdomain sshd[117187]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:12:46 np0005538513.localdomain sshd[117187]: Server listening on 0.0.0.0 port 22.
Nov 28 09:12:46 np0005538513.localdomain sshd[117187]: Server listening on :: port 22.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 09:12:46 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:12:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11896 DF PROTO=TCP SPT=42266 DPT=9882 SEQ=2507771196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B35020000000001030307) 
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: run-r4557b5a9d04f48e79325a350f4e111bf.service: Deactivated successfully.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: run-rf4bf9086689948fcbad8d1e7ff44a82b.service: Deactivated successfully.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 09:12:47 np0005538513.localdomain sshd[117187]: Received signal 15; terminating.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 09:12:47 np0005538513.localdomain sshd[117359]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:12:47 np0005538513.localdomain sshd[117359]: Server listening on 0.0.0.0 port 22.
Nov 28 09:12:47 np0005538513.localdomain sshd[117359]: Server listening on :: port 22.
Nov 28 09:12:47 np0005538513.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 09:12:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2 DF PROTO=TCP SPT=35578 DPT=9105 SEQ=1878191914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B387B0000000001030307) 
Nov 28 09:12:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4 DF PROTO=TCP SPT=35578 DPT=9105 SEQ=1878191914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B44820000000001030307) 
Nov 28 09:12:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5 DF PROTO=TCP SPT=35578 DPT=9105 SEQ=1878191914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B54420000000001030307) 
Nov 28 09:12:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54495 DF PROTO=TCP SPT=52084 DPT=9101 SEQ=1576886192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B5F690000000001030307) 
Nov 28 09:12:59 np0005538513.localdomain sudo[117456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:12:59 np0005538513.localdomain sudo[117456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:12:59 np0005538513.localdomain sudo[117456]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:59 np0005538513.localdomain sudo[117471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:12:59 np0005538513.localdomain sudo[117471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:00 np0005538513.localdomain podman[117555]: 2025-11-28 09:13:00.492168491 +0000 UTC m=+0.092795288 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, release=553, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-09-24T08:57:55)
Nov 28 09:13:00 np0005538513.localdomain podman[117555]: 2025-11-28 09:13:00.595500935 +0000 UTC m=+0.196127692 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:13:00 np0005538513.localdomain sudo[117471]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:00 np0005538513.localdomain sudo[117624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:13:00 np0005538513.localdomain sudo[117624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:00 np0005538513.localdomain sudo[117624]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:01 np0005538513.localdomain sudo[117639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:13:01 np0005538513.localdomain sudo[117639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54497 DF PROTO=TCP SPT=52084 DPT=9101 SEQ=1576886192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B6B820000000001030307) 
Nov 28 09:13:01 np0005538513.localdomain sudo[117639]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:02 np0005538513.localdomain sudo[117689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:13:02 np0005538513.localdomain sudo[117689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:02 np0005538513.localdomain sudo[117689]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43740 DF PROTO=TCP SPT=39052 DPT=9102 SEQ=2134080702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B74420000000001030307) 
Nov 28 09:13:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32980 DF PROTO=TCP SPT=60448 DPT=9102 SEQ=2357390439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B7F820000000001030307) 
Nov 28 09:13:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13527 DF PROTO=TCP SPT=60118 DPT=9100 SEQ=1126112828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B8B820000000001030307) 
Nov 28 09:13:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53970 DF PROTO=TCP SPT=41942 DPT=9100 SEQ=800026680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B97C20000000001030307) 
Nov 28 09:13:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8233 DF PROTO=TCP SPT=36980 DPT=9882 SEQ=4151490587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BAA430000000001030307) 
Nov 28 09:13:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59422 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BB1C20000000001030307) 
Nov 28 09:13:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59423 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BB9C20000000001030307) 
Nov 28 09:13:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59424 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BC9830000000001030307) 
Nov 28 09:13:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64987 DF PROTO=TCP SPT=47716 DPT=9101 SEQ=549047946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BD4990000000001030307) 
Nov 28 09:13:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64989 DF PROTO=TCP SPT=47716 DPT=9101 SEQ=549047946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BE0820000000001030307) 
Nov 28 09:13:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59425 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BE9820000000001030307) 
Nov 28 09:13:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44952 DF PROTO=TCP SPT=35102 DPT=9100 SEQ=3646895248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BF5020000000001030307) 
Nov 28 09:13:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4950 DF PROTO=TCP SPT=40686 DPT=9102 SEQ=27247962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C01420000000001030307) 
Nov 28 09:13:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8235 DF PROTO=TCP SPT=36980 DPT=9882 SEQ=4151490587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C0B820000000001030307) 
Nov 28 09:13:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25418 DF PROTO=TCP SPT=48412 DPT=9882 SEQ=2403479743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C1F430000000001030307) 
Nov 28 09:13:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42779 DF PROTO=TCP SPT=34424 DPT=9105 SEQ=850738264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C27020000000001030307) 
Nov 28 09:13:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42780 DF PROTO=TCP SPT=34424 DPT=9105 SEQ=850738264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C2F030000000001030307) 
Nov 28 09:13:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42781 DF PROTO=TCP SPT=34424 DPT=9105 SEQ=850738264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C3EC20000000001030307) 
Nov 28 09:13:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38049 DF PROTO=TCP SPT=37430 DPT=9101 SEQ=2111172106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C49CB0000000001030307) 
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  Converting 2754 SID table entries...
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:13:59 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:14:00 np0005538513.localdomain sudo[117141]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38051 DF PROTO=TCP SPT=37430 DPT=9101 SEQ=2111172106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C55C20000000001030307) 
Nov 28 09:14:02 np0005538513.localdomain sudo[118088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:14:02 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Nov 28 09:14:02 np0005538513.localdomain sudo[118088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:14:02 np0005538513.localdomain sudo[118088]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:03 np0005538513.localdomain sudo[118103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:14:03 np0005538513.localdomain sudo[118103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:14:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51483 DF PROTO=TCP SPT=56590 DPT=9102 SEQ=1973360585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C5EC20000000001030307) 
Nov 28 09:14:03 np0005538513.localdomain sudo[118103]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:04 np0005538513.localdomain sudo[118149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:14:04 np0005538513.localdomain sudo[118149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:14:04 np0005538513.localdomain sudo[118149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43745 DF PROTO=TCP SPT=39052 DPT=9102 SEQ=2134080702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C69830000000001030307) 
Nov 28 09:14:06 np0005538513.localdomain sudo[118239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsvjrgvumfbaknpfktgsqjjfvxmlqcku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321245.9772208-402-78969108054805/AnsiballZ_file.py
Nov 28 09:14:06 np0005538513.localdomain sudo[118239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:06 np0005538513.localdomain python3.9[118241]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:06 np0005538513.localdomain sudo[118239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:06 np0005538513.localdomain sudo[118331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcwpucgwnpsbgdlledzmgnplnizviynb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321246.659117-426-172103223993847/AnsiballZ_stat.py
Nov 28 09:14:06 np0005538513.localdomain sudo[118331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:07 np0005538513.localdomain python3.9[118333]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:07 np0005538513.localdomain sudo[118331]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:07 np0005538513.localdomain sudo[118404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udbwiqtewdqnnatbdwhtotxkmqrivjkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321246.659117-426-172103223993847/AnsiballZ_copy.py
Nov 28 09:14:07 np0005538513.localdomain sudo[118404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:07 np0005538513.localdomain python3.9[118406]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321246.659117-426-172103223993847/.source.fact _original_basename=.5ew5x_ts follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:07 np0005538513.localdomain sudo[118404]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:08 np0005538513.localdomain python3.9[118496]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:14:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53973 DF PROTO=TCP SPT=41942 DPT=9100 SEQ=800026680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C75820000000001030307) 
Nov 28 09:14:09 np0005538513.localdomain sudo[118592]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdyqdmgoxqgprbcgrervqpvnbhzrqgow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321249.4119084-501-251273938605684/AnsiballZ_setup.py
Nov 28 09:14:09 np0005538513.localdomain sudo[118592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:09 np0005538513.localdomain python3.9[118594]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:14:10 np0005538513.localdomain sudo[118592]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:10 np0005538513.localdomain sudo[118646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykdffzllcizrgozdnedqxyhwxsxjfwks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321249.4119084-501-251273938605684/AnsiballZ_dnf.py
Nov 28 09:14:10 np0005538513.localdomain sudo[118646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:10 np0005538513.localdomain python3.9[118648]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:14:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41221 DF PROTO=TCP SPT=50396 DPT=9100 SEQ=701157145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C82030000000001030307) 
Nov 28 09:14:14 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:14:14 np0005538513.localdomain systemd-rc-local-generator[118681]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:14:14 np0005538513.localdomain systemd-sysv-generator[118687]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:14:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:14:14 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 09:14:15 np0005538513.localdomain sudo[118646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:16 np0005538513.localdomain sudo[118785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjlxctgxgryswjlixokhnistkhsupwnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321256.0232723-537-230063160167516/AnsiballZ_command.py
Nov 28 09:14:16 np0005538513.localdomain sudo[118785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:16 np0005538513.localdomain python3.9[118787]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:14:17 np0005538513.localdomain sudo[118785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58354 DF PROTO=TCP SPT=41250 DPT=9882 SEQ=1232032057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C94820000000001030307) 
Nov 28 09:14:18 np0005538513.localdomain sudo[119024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhixgwibivjgqkcmntdvjhdpxhohxgls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321257.608073-561-241699924054525/AnsiballZ_selinux.py
Nov 28 09:14:18 np0005538513.localdomain sudo[119024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:18 np0005538513.localdomain python3.9[119026]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 28 09:14:18 np0005538513.localdomain sudo[119024]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38687 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C9C030000000001030307) 
Nov 28 09:14:19 np0005538513.localdomain sudo[119116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxrexpalwxoxhwaqhjqqgtkgpkvzbrke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321258.9576495-594-62895666236014/AnsiballZ_command.py
Nov 28 09:14:19 np0005538513.localdomain sudo[119116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:19 np0005538513.localdomain python3.9[119118]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 28 09:14:19 np0005538513.localdomain sudo[119116]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:20 np0005538513.localdomain sudo[119209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvgsmbgetzqjttahmswglgjpdzljesam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321260.2503834-618-41635554352199/AnsiballZ_file.py
Nov 28 09:14:20 np0005538513.localdomain sudo[119209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:20 np0005538513.localdomain python3.9[119211]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:20 np0005538513.localdomain sudo[119209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38688 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CA4020000000001030307) 
Nov 28 09:14:21 np0005538513.localdomain sudo[119301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xltrtgkonpjrbtzrfsrvnyfixzabosex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321260.951151-642-165176893770590/AnsiballZ_mount.py
Nov 28 09:14:21 np0005538513.localdomain sudo[119301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:21 np0005538513.localdomain python3.9[119303]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 28 09:14:21 np0005538513.localdomain sudo[119301]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:22 np0005538513.localdomain sudo[119393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nysavenwbqrnilwtxqwqjybwrxuxwien ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321262.64539-726-234917804272057/AnsiballZ_file.py
Nov 28 09:14:22 np0005538513.localdomain sudo[119393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:23 np0005538513.localdomain python3.9[119395]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:23 np0005538513.localdomain sudo[119393]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:23 np0005538513.localdomain sudo[119485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdexcbdvrstgfiokqueuuargxiqwwndz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321263.4025307-750-234091474317493/AnsiballZ_stat.py
Nov 28 09:14:23 np0005538513.localdomain sudo[119485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:23 np0005538513.localdomain python3.9[119487]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:23 np0005538513.localdomain sudo[119485]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:24 np0005538513.localdomain sudo[119558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iunjozcljjhxsvbtbmyyjmoalmyrywsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321263.4025307-750-234091474317493/AnsiballZ_copy.py
Nov 28 09:14:24 np0005538513.localdomain sudo[119558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:24 np0005538513.localdomain python3.9[119560]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321263.4025307-750-234091474317493/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:24 np0005538513.localdomain sudo[119558]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38689 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CB3C30000000001030307) 
Nov 28 09:14:25 np0005538513.localdomain sudo[119650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irxtojkmgfimcnrdjocgencnhykolsza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321265.5042274-822-234585076299518/AnsiballZ_stat.py
Nov 28 09:14:25 np0005538513.localdomain sudo[119650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:25 np0005538513.localdomain python3.9[119652]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:14:25 np0005538513.localdomain sudo[119650]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:27 np0005538513.localdomain sudo[119744]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvilpldrvlldjkmxregxkqjjphnbndly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321266.6213722-861-100680398916145/AnsiballZ_getent.py
Nov 28 09:14:27 np0005538513.localdomain sudo[119744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:27 np0005538513.localdomain python3.9[119746]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 28 09:14:27 np0005538513.localdomain sudo[119744]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:27 np0005538513.localdomain sudo[119837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfclrufdslamimezxsonvllwwqcuverq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321267.6472452-891-163315777376965/AnsiballZ_getent.py
Nov 28 09:14:27 np0005538513.localdomain sudo[119837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62894 DF PROTO=TCP SPT=49510 DPT=9101 SEQ=1712621461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CBEFA0000000001030307) 
Nov 28 09:14:28 np0005538513.localdomain python3.9[119839]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 28 09:14:28 np0005538513.localdomain sudo[119837]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:28 np0005538513.localdomain sudo[119930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umznabuywmoxozleooxohifwzbsqgcwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321268.4146814-915-1838571579384/AnsiballZ_group.py
Nov 28 09:14:28 np0005538513.localdomain sudo[119930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:29 np0005538513.localdomain python3.9[119932]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:14:29 np0005538513.localdomain groupmod[119933]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Nov 28 09:14:29 np0005538513.localdomain groupmod[119933]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Nov 28 09:14:29 np0005538513.localdomain sudo[119930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:29 np0005538513.localdomain sudo[120028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shtmbznanksyobdggxrxivljggcpentz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321269.4009573-942-126831946671511/AnsiballZ_file.py
Nov 28 09:14:29 np0005538513.localdomain sudo[120028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:29 np0005538513.localdomain python3.9[120030]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 28 09:14:29 np0005538513.localdomain sudo[120028]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:30 np0005538513.localdomain sudo[120120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvenewqctsvzjfmltaghfolbyqxunptl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321270.3690095-975-134617365733572/AnsiballZ_dnf.py
Nov 28 09:14:30 np0005538513.localdomain sudo[120120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:30 np0005538513.localdomain python3.9[120122]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:14:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62896 DF PROTO=TCP SPT=49510 DPT=9101 SEQ=1712621461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CCB020000000001030307) 
Nov 28 09:14:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38690 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CD3830000000001030307) 
Nov 28 09:14:34 np0005538513.localdomain sudo[120120]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:34 np0005538513.localdomain sudo[120214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddfzckhmxcwaxeayzbbmxjkflnudwjdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321274.5493562-999-265953294599879/AnsiballZ_file.py
Nov 28 09:14:34 np0005538513.localdomain sudo[120214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:34 np0005538513.localdomain python3.9[120216]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:34 np0005538513.localdomain sudo[120214]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8860 DF PROTO=TCP SPT=57342 DPT=9100 SEQ=3269640789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CDF820000000001030307) 
Nov 28 09:14:39 np0005538513.localdomain sudo[120307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klbpotnaraiojpwyphsixlxunxnexytf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321278.9657657-1023-43509833346468/AnsiballZ_stat.py
Nov 28 09:14:39 np0005538513.localdomain sudo[120307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:39 np0005538513.localdomain python3.9[120309]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:39 np0005538513.localdomain sudo[120307]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44957 DF PROTO=TCP SPT=35102 DPT=9100 SEQ=3646895248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CEB830000000001030307) 
Nov 28 09:14:39 np0005538513.localdomain sudo[120380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erjfgeabodpwabyqeliudfcgkwegtdfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321278.9657657-1023-43509833346468/AnsiballZ_copy.py
Nov 28 09:14:39 np0005538513.localdomain sudo[120380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:39 np0005538513.localdomain python3.9[120382]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321278.9657657-1023-43509833346468/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:39 np0005538513.localdomain sudo[120380]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:42 np0005538513.localdomain sshd[120397]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:14:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8862 DF PROTO=TCP SPT=57342 DPT=9100 SEQ=3269640789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CF7420000000001030307) 
Nov 28 09:14:42 np0005538513.localdomain sshd[120397]: Invalid user sol from 80.94.92.182 port 41240
Nov 28 09:14:43 np0005538513.localdomain sshd[120397]: Connection closed by invalid user sol 80.94.92.182 port 41240 [preauth]
Nov 28 09:14:44 np0005538513.localdomain sudo[120474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzrgzhuvrfsblefbumglmsyocphjvqlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321283.8414593-1068-169591717919320/AnsiballZ_systemd.py
Nov 28 09:14:44 np0005538513.localdomain sudo[120474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:44 np0005538513.localdomain python3.9[120476]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:14:45 np0005538513.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 09:14:45 np0005538513.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 09:14:45 np0005538513.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 09:14:45 np0005538513.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 09:14:45 np0005538513.localdomain systemd-modules-load[120480]: Module 'msr' is built in
Nov 28 09:14:45 np0005538513.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 28 09:14:45 np0005538513.localdomain sudo[120474]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:46 np0005538513.localdomain sudo[120571]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywgdzllljabcpomzdpuixwexdeuqhhia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321285.9471498-1092-17497424745442/AnsiballZ_stat.py
Nov 28 09:14:46 np0005538513.localdomain sudo[120571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:46 np0005538513.localdomain python3.9[120573]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:46 np0005538513.localdomain sudo[120571]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:46 np0005538513.localdomain sudo[120644]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwkjlnhmpgpipxszzfleepibasvimwkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321285.9471498-1092-17497424745442/AnsiballZ_copy.py
Nov 28 09:14:46 np0005538513.localdomain sudo[120644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:46 np0005538513.localdomain python3.9[120646]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321285.9471498-1092-17497424745442/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:46 np0005538513.localdomain sudo[120644]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59354 DF PROTO=TCP SPT=39978 DPT=9882 SEQ=1600516699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D09C20000000001030307) 
Nov 28 09:14:48 np0005538513.localdomain sudo[120736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbhvihqhlkkkcvaunkbbeonibbfftups ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321287.727828-1146-31566370204187/AnsiballZ_dnf.py
Nov 28 09:14:48 np0005538513.localdomain sudo[120736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:48 np0005538513.localdomain python3.9[120738]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:14:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5412 DF PROTO=TCP SPT=45724 DPT=9105 SEQ=3240212235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D11420000000001030307) 
Nov 28 09:14:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5413 DF PROTO=TCP SPT=45724 DPT=9105 SEQ=3240212235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D19420000000001030307) 
Nov 28 09:14:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5414 DF PROTO=TCP SPT=45724 DPT=9105 SEQ=3240212235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D29020000000001030307) 
Nov 28 09:14:55 np0005538513.localdomain sudo[120736]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:56 np0005538513.localdomain python3.9[120831]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:14:57 np0005538513.localdomain python3.9[120923]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 09:14:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38332 DF PROTO=TCP SPT=41366 DPT=9101 SEQ=1105872768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D34290000000001030307) 
Nov 28 09:14:58 np0005538513.localdomain python3.9[121013]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:14:59 np0005538513.localdomain sudo[121103]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljeapmproxkpoqzsbljodkkoewvlexpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321298.9602258-1269-264814472813291/AnsiballZ_systemd.py
Nov 28 09:14:59 np0005538513.localdomain sudo[121103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:59 np0005538513.localdomain python3.9[121105]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:15:00 np0005538513.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 09:15:00 np0005538513.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 28 09:15:00 np0005538513.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 09:15:00 np0005538513.localdomain systemd[1]: tuned.service: Consumed 1.744s CPU time, no IO.
Nov 28 09:15:00 np0005538513.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 09:15:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38334 DF PROTO=TCP SPT=41366 DPT=9101 SEQ=1105872768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D40420000000001030307) 
Nov 28 09:15:01 np0005538513.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 09:15:01 np0005538513.localdomain sudo[121103]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:02 np0005538513.localdomain python3.9[121207]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 09:15:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41135 DF PROTO=TCP SPT=54278 DPT=9102 SEQ=767752292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D49020000000001030307) 
Nov 28 09:15:04 np0005538513.localdomain sudo[121222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:15:04 np0005538513.localdomain sudo[121222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:15:04 np0005538513.localdomain sudo[121222]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:04 np0005538513.localdomain sudo[121237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:15:04 np0005538513.localdomain sudo[121237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:15:05 np0005538513.localdomain sudo[121237]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:05 np0005538513.localdomain sudo[121316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:15:05 np0005538513.localdomain sudo[121316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:15:05 np0005538513.localdomain sudo[121316]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:06 np0005538513.localdomain sudo[121373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjfvwqhvkusyyosqgolwssmclqimpwkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321305.787108-1440-35531266295144/AnsiballZ_systemd.py
Nov 28 09:15:06 np0005538513.localdomain sudo[121373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:06 np0005538513.localdomain python3.9[121375]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:15:06 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:15:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28466 DF PROTO=TCP SPT=33224 DPT=9100 SEQ=2100537511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D54C20000000001030307) 
Nov 28 09:15:06 np0005538513.localdomain systemd-rc-local-generator[121400]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:15:06 np0005538513.localdomain systemd-sysv-generator[121405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:15:06 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:15:06 np0005538513.localdomain sudo[121373]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:07 np0005538513.localdomain sudo[121503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhjjmicidontzuxtlrrxqtorsokomczn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321306.8447285-1440-98129137848166/AnsiballZ_systemd.py
Nov 28 09:15:07 np0005538513.localdomain sudo[121503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:07 np0005538513.localdomain python3.9[121505]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:15:07 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:15:07 np0005538513.localdomain systemd-rc-local-generator[121531]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:15:07 np0005538513.localdomain systemd-sysv-generator[121537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:15:07 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:15:07 np0005538513.localdomain sudo[121503]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41224 DF PROTO=TCP SPT=50396 DPT=9100 SEQ=701157145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D5F820000000001030307) 
Nov 28 09:15:09 np0005538513.localdomain sudo[121633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vliotpqyslscprrwiyinwggrchrzajou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321309.1733346-1488-246667618603679/AnsiballZ_command.py
Nov 28 09:15:09 np0005538513.localdomain sudo[121633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:09 np0005538513.localdomain python3.9[121635]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:09 np0005538513.localdomain sudo[121633]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:10 np0005538513.localdomain sudo[121726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjighslwaxlssezxrugruzwqugmzruva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321309.918904-1512-156640814746146/AnsiballZ_command.py
Nov 28 09:15:10 np0005538513.localdomain sudo[121726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:10 np0005538513.localdomain python3.9[121728]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:10 np0005538513.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Nov 28 09:15:10 np0005538513.localdomain sudo[121726]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:10 np0005538513.localdomain sudo[121819]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-innhouaovormtpviqovvrqcpfpydrubw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321310.650141-1536-119095446185180/AnsiballZ_command.py
Nov 28 09:15:10 np0005538513.localdomain sudo[121819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:11 np0005538513.localdomain python3.9[121821]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:12 np0005538513.localdomain sudo[121819]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28468 DF PROTO=TCP SPT=33224 DPT=9100 SEQ=2100537511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D6C820000000001030307) 
Nov 28 09:15:12 np0005538513.localdomain sudo[121918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-funhfsblvinbtlblsxheuvgepnnxehrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321312.4053228-1560-212005767488516/AnsiballZ_command.py
Nov 28 09:15:12 np0005538513.localdomain sudo[121918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:12 np0005538513.localdomain python3.9[121920]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:12 np0005538513.localdomain sudo[121918]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:13 np0005538513.localdomain sudo[122011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwnkeqxxerbyuxtbglhbvkgusaqsfvsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321313.137035-1584-263069689156432/AnsiballZ_systemd.py
Nov 28 09:15:13 np0005538513.localdomain sudo[122011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:13 np0005538513.localdomain python3.9[122013]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:15:13 np0005538513.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 09:15:13 np0005538513.localdomain systemd[1]: Stopped Apply Kernel Variables.
Nov 28 09:15:13 np0005538513.localdomain systemd[1]: Stopping Apply Kernel Variables...
Nov 28 09:15:13 np0005538513.localdomain systemd[1]: Starting Apply Kernel Variables...
Nov 28 09:15:13 np0005538513.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 09:15:13 np0005538513.localdomain systemd[1]: Finished Apply Kernel Variables.
Nov 28 09:15:13 np0005538513.localdomain sudo[122011]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:14 np0005538513.localdomain sshd[115732]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:15:14 np0005538513.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Nov 28 09:15:14 np0005538513.localdomain systemd[1]: session-38.scope: Consumed 1min 55.336s CPU time.
Nov 28 09:15:14 np0005538513.localdomain systemd-logind[764]: Session 38 logged out. Waiting for processes to exit.
Nov 28 09:15:14 np0005538513.localdomain systemd-logind[764]: Removed session 38.
Nov 28 09:15:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57701 DF PROTO=TCP SPT=53970 DPT=9882 SEQ=1263280768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D7F020000000001030307) 
Nov 28 09:15:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29488 DF PROTO=TCP SPT=43470 DPT=9105 SEQ=2570715432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D86820000000001030307) 
Nov 28 09:15:19 np0005538513.localdomain sshd[122033]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:15:19 np0005538513.localdomain sshd[122033]: Accepted publickey for zuul from 192.168.122.31 port 55636 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:15:19 np0005538513.localdomain systemd-logind[764]: New session 39 of user zuul.
Nov 28 09:15:19 np0005538513.localdomain systemd[1]: Started Session 39 of User zuul.
Nov 28 09:15:19 np0005538513.localdomain sshd[122033]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:15:20 np0005538513.localdomain python3.9[122126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29489 DF PROTO=TCP SPT=43470 DPT=9105 SEQ=2570715432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D8E820000000001030307) 
Nov 28 09:15:22 np0005538513.localdomain python3.9[122220]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:23 np0005538513.localdomain sudo[122314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvlneuwtkyofsdyfhoocylnnjtobnmcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321322.9716604-110-42355803632006/AnsiballZ_command.py
Nov 28 09:15:23 np0005538513.localdomain sudo[122314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:23 np0005538513.localdomain python3.9[122316]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:23 np0005538513.localdomain sudo[122314]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:24 np0005538513.localdomain python3.9[122407]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29490 DF PROTO=TCP SPT=43470 DPT=9105 SEQ=2570715432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D9E420000000001030307) 
Nov 28 09:15:25 np0005538513.localdomain sudo[122501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-libwwfmacuxiyutoihpwaidpklicwbvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321325.1411345-170-169521990389159/AnsiballZ_setup.py
Nov 28 09:15:25 np0005538513.localdomain sudo[122501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:25 np0005538513.localdomain python3.9[122503]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:15:25 np0005538513.localdomain sudo[122501]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:26 np0005538513.localdomain sudo[122555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euhcqhhokbziumgabjszbyhlgegdlrij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321325.1411345-170-169521990389159/AnsiballZ_dnf.py
Nov 28 09:15:26 np0005538513.localdomain sudo[122555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:26 np0005538513.localdomain python3.9[122557]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:15:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24265 DF PROTO=TCP SPT=36684 DPT=9101 SEQ=1335908385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DA9590000000001030307) 
Nov 28 09:15:29 np0005538513.localdomain sudo[122555]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:30 np0005538513.localdomain sudo[122649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pozqeubqeecqyojhoerrvdlaphhmmubj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321330.1019335-206-126567560311884/AnsiballZ_setup.py
Nov 28 09:15:30 np0005538513.localdomain sudo[122649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:30 np0005538513.localdomain python3.9[122651]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:15:31 np0005538513.localdomain sudo[122649]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24267 DF PROTO=TCP SPT=36684 DPT=9101 SEQ=1335908385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DB5420000000001030307) 
Nov 28 09:15:31 np0005538513.localdomain sudo[122804]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhlzbacmxipfzxcfizwdlamdxirsvetz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321331.4269338-239-237395872460818/AnsiballZ_file.py
Nov 28 09:15:31 np0005538513.localdomain sudo[122804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:32 np0005538513.localdomain python3.9[122806]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:15:32 np0005538513.localdomain sudo[122804]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:32 np0005538513.localdomain sudo[122896]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooibiwpfjezjhoaslmhuuoeieokdvczz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321332.2118583-263-218444800197601/AnsiballZ_command.py
Nov 28 09:15:32 np0005538513.localdomain sudo[122896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:32 np0005538513.localdomain python3.9[122898]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:32 np0005538513.localdomain sudo[122896]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:33 np0005538513.localdomain sudo[123000]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhuhlszcchcsjnjusonbtjdfujkdybht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321332.9691794-287-173809684785038/AnsiballZ_stat.py
Nov 28 09:15:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21035 DF PROTO=TCP SPT=34850 DPT=9102 SEQ=2583488510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DBE430000000001030307) 
Nov 28 09:15:33 np0005538513.localdomain sudo[123000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:33 np0005538513.localdomain python3.9[123002]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:15:33 np0005538513.localdomain sudo[123000]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:33 np0005538513.localdomain sudo[123048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qliotglrhcmevoiljjzyofqbpttgysjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321332.9691794-287-173809684785038/AnsiballZ_file.py
Nov 28 09:15:33 np0005538513.localdomain sudo[123048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:34 np0005538513.localdomain python3.9[123050]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:15:34 np0005538513.localdomain sudo[123048]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:34 np0005538513.localdomain sudo[123140]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjwmgqviolxqvsdyihihdwxdvvscdnmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321334.234006-323-74672324063493/AnsiballZ_stat.py
Nov 28 09:15:34 np0005538513.localdomain sudo[123140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:34 np0005538513.localdomain python3.9[123142]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:15:34 np0005538513.localdomain sudo[123140]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:35 np0005538513.localdomain sudo[123213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjjzwcmnbnbzotrjwpiafbprzpuunlyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321334.234006-323-74672324063493/AnsiballZ_copy.py
Nov 28 09:15:35 np0005538513.localdomain sudo[123213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:35 np0005538513.localdomain python3.9[123215]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321334.234006-323-74672324063493/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:35 np0005538513.localdomain sudo[123213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:36 np0005538513.localdomain sudo[123305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhgxuckjoavwmecjgrvipgihklikfioc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321335.6854346-371-228649412931576/AnsiballZ_ini_file.py
Nov 28 09:15:36 np0005538513.localdomain sudo[123305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8351 DF PROTO=TCP SPT=45700 DPT=9102 SEQ=3988058356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DC9820000000001030307) 
Nov 28 09:15:36 np0005538513.localdomain python3.9[123307]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:36 np0005538513.localdomain sudo[123305]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:36 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 28 09:15:36 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:15:36 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:15:36 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:15:36 np0005538513.localdomain sudo[123398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jroxrddnirwqdejrnsztfnyffuvpknna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321336.4111388-371-246906630236193/AnsiballZ_ini_file.py
Nov 28 09:15:36 np0005538513.localdomain sudo[123398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:36 np0005538513.localdomain python3.9[123400]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:36 np0005538513.localdomain sudo[123398]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:37 np0005538513.localdomain sudo[123490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymleiqrqvvkwzsdrunrdthsefjptgyfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321336.975832-371-46439663847925/AnsiballZ_ini_file.py
Nov 28 09:15:37 np0005538513.localdomain sudo[123490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:37 np0005538513.localdomain python3.9[123492]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:37 np0005538513.localdomain sudo[123490]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:37 np0005538513.localdomain sudo[123582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbspnmcfgwwqjsegkvuszgqtksinmclq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321337.5151641-371-59967352643266/AnsiballZ_ini_file.py
Nov 28 09:15:37 np0005538513.localdomain sudo[123582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:37 np0005538513.localdomain python3.9[123584]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:38 np0005538513.localdomain sudo[123582]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:39 np0005538513.localdomain python3.9[123674]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8865 DF PROTO=TCP SPT=57342 DPT=9100 SEQ=3269640789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DD5820000000001030307) 
Nov 28 09:15:39 np0005538513.localdomain sudo[123766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yghozuewiskjsudwsojcsazrbugatxcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321339.3753347-491-202552343643938/AnsiballZ_dnf.py
Nov 28 09:15:39 np0005538513.localdomain sudo[123766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:39 np0005538513.localdomain python3.9[123768]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18742 DF PROTO=TCP SPT=51008 DPT=9100 SEQ=3993436640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DE1830000000001030307) 
Nov 28 09:15:43 np0005538513.localdomain sudo[123766]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:43 np0005538513.localdomain sudo[123860]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbznvmvmsqdcmvpifctppvafucioshdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321343.4213512-515-15668871505193/AnsiballZ_dnf.py
Nov 28 09:15:43 np0005538513.localdomain sudo[123860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:43 np0005538513.localdomain python3.9[123862]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:47 np0005538513.localdomain sudo[123860]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50121 DF PROTO=TCP SPT=44726 DPT=9882 SEQ=2210243804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DF4020000000001030307) 
Nov 28 09:15:47 np0005538513.localdomain sudo[123954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouzegsjhaezumopjojfroxvtunqzdwyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321347.5928364-545-206202464223591/AnsiballZ_dnf.py
Nov 28 09:15:47 np0005538513.localdomain sudo[123954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:48 np0005538513.localdomain python3.9[123956]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49702 DF PROTO=TCP SPT=39658 DPT=9105 SEQ=877194211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DFBC20000000001030307) 
Nov 28 09:15:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49703 DF PROTO=TCP SPT=39658 DPT=9105 SEQ=877194211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E03C20000000001030307) 
Nov 28 09:15:51 np0005538513.localdomain sudo[123954]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:52 np0005538513.localdomain sudo[124054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvzqjlrgijqvfpjgtjqdujymgptffaev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321351.808201-572-150325392665602/AnsiballZ_dnf.py
Nov 28 09:15:52 np0005538513.localdomain sudo[124054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:52 np0005538513.localdomain python3.9[124056]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50122 DF PROTO=TCP SPT=44726 DPT=9882 SEQ=2210243804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E13820000000001030307) 
Nov 28 09:15:55 np0005538513.localdomain sudo[124054]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:56 np0005538513.localdomain sudo[124148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ainpxmczgpfzsdddyomssagsqfrgkjis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321356.1677396-608-182423753778431/AnsiballZ_dnf.py
Nov 28 09:15:56 np0005538513.localdomain sudo[124148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:56 np0005538513.localdomain python3.9[124150]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49204 DF PROTO=TCP SPT=40540 DPT=9101 SEQ=4276460536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E1E890000000001030307) 
Nov 28 09:15:59 np0005538513.localdomain sudo[124148]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:00 np0005538513.localdomain sudo[124242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uurqmqqnurdtsdgkqwzpybzvcjjesnkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321360.3151696-635-71726425431544/AnsiballZ_dnf.py
Nov 28 09:16:00 np0005538513.localdomain sudo[124242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:00 np0005538513.localdomain python3.9[124244]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:16:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49206 DF PROTO=TCP SPT=40540 DPT=9101 SEQ=4276460536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E2A830000000001030307) 
Nov 28 09:16:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23294 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3662492745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E33820000000001030307) 
Nov 28 09:16:04 np0005538513.localdomain sudo[124242]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:05 np0005538513.localdomain sudo[124336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sldclbizlwbpjdmtxfsgzunqlzdbgpbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321364.7533884-662-135300081999068/AnsiballZ_dnf.py
Nov 28 09:16:05 np0005538513.localdomain sudo[124336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:05 np0005538513.localdomain python3.9[124338]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:16:06 np0005538513.localdomain sudo[124341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:16:06 np0005538513.localdomain sudo[124341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:16:06 np0005538513.localdomain sudo[124341]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:06 np0005538513.localdomain sudo[124356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:16:06 np0005538513.localdomain sudo[124356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:16:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46330 DF PROTO=TCP SPT=47586 DPT=9100 SEQ=124756526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E3F020000000001030307) 
Nov 28 09:16:06 np0005538513.localdomain sudo[124356]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:07 np0005538513.localdomain sudo[124404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:16:07 np0005538513.localdomain sudo[124404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:16:07 np0005538513.localdomain sudo[124404]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23296 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3662492745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E4B420000000001030307) 
Nov 28 09:16:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46332 DF PROTO=TCP SPT=47586 DPT=9100 SEQ=124756526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E56C20000000001030307) 
Nov 28 09:16:17 np0005538513.localdomain sudo[124336]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16903 DF PROTO=TCP SPT=35070 DPT=9882 SEQ=1773086771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E69420000000001030307) 
Nov 28 09:16:17 np0005538513.localdomain sudo[124585]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoiyrgzrjezokowqqoszywndtprlhfpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321377.6549783-698-140993541932558/AnsiballZ_file.py
Nov 28 09:16:17 np0005538513.localdomain sudo[124585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:18 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17457 DF PROTO=TCP SPT=60926 DPT=9105 SEQ=1125394785 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E6CCB0000000001030307) 
Nov 28 09:16:18 np0005538513.localdomain python3.9[124587]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:16:18 np0005538513.localdomain sudo[124585]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:18 np0005538513.localdomain sudo[124690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efnjnldqscuoytpjcwhvmylkeuhmcwhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321378.3839915-722-222428517102657/AnsiballZ_stat.py
Nov 28 09:16:18 np0005538513.localdomain sudo[124690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:18 np0005538513.localdomain python3.9[124692]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:16:18 np0005538513.localdomain sudo[124690]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:19 np0005538513.localdomain sudo[124763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juhepuxwwuktgwldeoyezmllkeekutnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321378.3839915-722-222428517102657/AnsiballZ_copy.py
Nov 28 09:16:19 np0005538513.localdomain sudo[124763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:19 np0005538513.localdomain python3.9[124765]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764321378.3839915-722-222428517102657/.source.json _original_basename=.fwcp6m_k follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:16:19 np0005538513.localdomain sudo[124763]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:20 np0005538513.localdomain sudo[124855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpclbdwjbedeurzjggmujaynjldhmmly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321379.9712658-776-83119554592928/AnsiballZ_podman_image.py
Nov 28 09:16:20 np0005538513.localdomain sudo[124855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:20 np0005538513.localdomain python3.9[124857]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17459 DF PROTO=TCP SPT=60926 DPT=9105 SEQ=1125394785 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E78C20000000001030307) 
Nov 28 09:16:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17460 DF PROTO=TCP SPT=60926 DPT=9105 SEQ=1125394785 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E88820000000001030307) 
Nov 28 09:16:26 np0005538513.localdomain podman[124870]: 2025-11-28 09:16:20.837235796 +0000 UTC m=+0.042189929 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:16:27 np0005538513.localdomain sudo[124855]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57647 DF PROTO=TCP SPT=56874 DPT=9101 SEQ=1905996746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E93BB0000000001030307) 
Nov 28 09:16:28 np0005538513.localdomain sudo[125067]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-honyucsnjvxmosmhqfzpccygoeeslppl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321387.9428015-809-59021909713739/AnsiballZ_podman_image.py
Nov 28 09:16:28 np0005538513.localdomain sudo[125067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:28 np0005538513.localdomain python3.9[125069]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57649 DF PROTO=TCP SPT=56874 DPT=9101 SEQ=1905996746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E9FC20000000001030307) 
Nov 28 09:16:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3368 DF PROTO=TCP SPT=33524 DPT=9102 SEQ=3116283377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EA8C20000000001030307) 
Nov 28 09:16:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21040 DF PROTO=TCP SPT=34850 DPT=9102 SEQ=2583488510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EB3820000000001030307) 
Nov 28 09:16:36 np0005538513.localdomain podman[125082]: 2025-11-28 09:16:28.743117886 +0000 UTC m=+0.045231304 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:16:36 np0005538513.localdomain sudo[125067]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:37 np0005538513.localdomain sudo[125282]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oihwalqxlfkurwqbkbyfpqqwmflaosxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321397.617117-845-279373307884672/AnsiballZ_podman_image.py
Nov 28 09:16:37 np0005538513.localdomain sudo[125282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:38 np0005538513.localdomain python3.9[125284]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18745 DF PROTO=TCP SPT=51008 DPT=9100 SEQ=3993436640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EBF830000000001030307) 
Nov 28 09:16:40 np0005538513.localdomain podman[125296]: 2025-11-28 09:16:38.188204953 +0000 UTC m=+0.046738470 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 28 09:16:40 np0005538513.localdomain sudo[125282]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:40 np0005538513.localdomain sudo[125457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clzgatvelozamzikqwfnhvkqkzxwdmrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321400.6516218-872-39457151815478/AnsiballZ_podman_image.py
Nov 28 09:16:40 np0005538513.localdomain sudo[125457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:41 np0005538513.localdomain python3.9[125459]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:42 np0005538513.localdomain podman[125472]: 2025-11-28 09:16:41.244963625 +0000 UTC m=+0.028103392 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:16:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33296 DF PROTO=TCP SPT=38448 DPT=9100 SEQ=3436919252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ECC020000000001030307) 
Nov 28 09:16:42 np0005538513.localdomain sudo[125457]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:43 np0005538513.localdomain sudo[125631]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvhjxvvsrnpgyssfggjvosrqbfcpnayz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321402.9241505-899-135306816416759/AnsiballZ_podman_image.py
Nov 28 09:16:43 np0005538513.localdomain sudo[125631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:43 np0005538513.localdomain python3.9[125633]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:46 np0005538513.localdomain podman[125646]: 2025-11-28 09:16:43.547577273 +0000 UTC m=+0.043823469 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 09:16:47 np0005538513.localdomain sudo[125631]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36958 DF PROTO=TCP SPT=50924 DPT=9882 SEQ=611293252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EDE820000000001030307) 
Nov 28 09:16:47 np0005538513.localdomain sudo[125822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blepxlczdybzpublwluqrsoceftuxvee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321407.3293114-899-153920465662478/AnsiballZ_podman_image.py
Nov 28 09:16:47 np0005538513.localdomain sudo[125822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:47 np0005538513.localdomain python3.9[125824]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24092 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EE6020000000001030307) 
Nov 28 09:16:50 np0005538513.localdomain podman[125838]: 2025-11-28 09:16:47.905410938 +0000 UTC m=+0.048655370 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 28 09:16:50 np0005538513.localdomain sudo[125822]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24093 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EEE020000000001030307) 
Nov 28 09:16:51 np0005538513.localdomain sshd[122033]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:16:51 np0005538513.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Nov 28 09:16:51 np0005538513.localdomain systemd[1]: session-39.scope: Consumed 1min 31.367s CPU time.
Nov 28 09:16:51 np0005538513.localdomain systemd-logind[764]: Session 39 logged out. Waiting for processes to exit.
Nov 28 09:16:51 np0005538513.localdomain systemd-logind[764]: Removed session 39.
Nov 28 09:16:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24094 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EFDC20000000001030307) 
Nov 28 09:16:57 np0005538513.localdomain sshd[126187]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:16:57 np0005538513.localdomain sshd[126187]: Accepted publickey for zuul from 192.168.122.31 port 39392 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:16:57 np0005538513.localdomain systemd-logind[764]: New session 40 of user zuul.
Nov 28 09:16:57 np0005538513.localdomain systemd[1]: Started Session 40 of User zuul.
Nov 28 09:16:57 np0005538513.localdomain sshd[126187]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:16:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31337 DF PROTO=TCP SPT=38766 DPT=9101 SEQ=361867278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F08E90000000001030307) 
Nov 28 09:16:58 np0005538513.localdomain python3.9[126291]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:16:59 np0005538513.localdomain sudo[126385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hidzmirditewgwdgrluobnhwwjmykgoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321419.4638052-68-196719840924061/AnsiballZ_getent.py
Nov 28 09:16:59 np0005538513.localdomain sudo[126385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:00 np0005538513.localdomain python3.9[126387]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 28 09:17:00 np0005538513.localdomain sudo[126385]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:01 np0005538513.localdomain sudo[126478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrnbnvxkkrfycogdnimyugcwhssnwiog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321420.7996528-104-270550813327522/AnsiballZ_setup.py
Nov 28 09:17:01 np0005538513.localdomain sudo[126478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31339 DF PROTO=TCP SPT=38766 DPT=9101 SEQ=361867278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F15020000000001030307) 
Nov 28 09:17:01 np0005538513.localdomain python3.9[126480]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:17:01 np0005538513.localdomain sudo[126478]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:02 np0005538513.localdomain sudo[126532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-noymlpumxqpvfqtlwhwrimvwsqobvqfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321420.7996528-104-270550813327522/AnsiballZ_dnf.py
Nov 28 09:17:02 np0005538513.localdomain sudo[126532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:02 np0005538513.localdomain python3.9[126534]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:17:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24095 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F1D820000000001030307) 
Nov 28 09:17:05 np0005538513.localdomain sudo[126532]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23299 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3662492745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F29820000000001030307) 
Nov 28 09:17:07 np0005538513.localdomain sudo[126652]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mctwugmwzqvgwatlhbyxnguhwhilfnbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321427.0686285-146-71496184311877/AnsiballZ_dnf.py
Nov 28 09:17:07 np0005538513.localdomain sudo[126652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:07 np0005538513.localdomain python3.9[126654]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:07 np0005538513.localdomain sudo[126655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:17:07 np0005538513.localdomain sudo[126655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:17:07 np0005538513.localdomain sudo[126655]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:07 np0005538513.localdomain sudo[126671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:17:07 np0005538513.localdomain sudo[126671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:17:08 np0005538513.localdomain sudo[126671]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1098 DF PROTO=TCP SPT=60774 DPT=9102 SEQ=635022861 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F35820000000001030307) 
Nov 28 09:17:11 np0005538513.localdomain sudo[126652]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:12 np0005538513.localdomain sudo[126857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:17:12 np0005538513.localdomain sudo[126857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:17:12 np0005538513.localdomain sudo[126857]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55541 DF PROTO=TCP SPT=42570 DPT=9100 SEQ=476428184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F41420000000001030307) 
Nov 28 09:17:13 np0005538513.localdomain sudo[126947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzbjhuvsktvpwwruipyhfmlpmzmqlooa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321432.650717-170-235933523815450/AnsiballZ_systemd.py
Nov 28 09:17:13 np0005538513.localdomain sudo[126947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:13 np0005538513.localdomain python3.9[126949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:17:13 np0005538513.localdomain sudo[126947]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:15 np0005538513.localdomain python3.9[127042]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:17:16 np0005538513.localdomain sudo[127132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbrjayuniacdjzscehdfrkvicbbqjxpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321435.870494-224-230978649636335/AnsiballZ_sefcontext.py
Nov 28 09:17:16 np0005538513.localdomain sudo[127132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:16 np0005538513.localdomain python3.9[127134]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 28 09:17:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37031 DF PROTO=TCP SPT=55112 DPT=9882 SEQ=2605183829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F53C20000000001030307) 
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  Converting 2756 SID table entries...
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:17:18 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:17:18 np0005538513.localdomain sudo[127132]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54123 DF PROTO=TCP SPT=51150 DPT=9105 SEQ=2386692259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F5B430000000001030307) 
Nov 28 09:17:19 np0005538513.localdomain python3.9[127452]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:17:20 np0005538513.localdomain sudo[127548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oibapecpqtsxyfitsuxeuwqmrnimizzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321440.145152-278-119594109069657/AnsiballZ_dnf.py
Nov 28 09:17:20 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Nov 28 09:17:20 np0005538513.localdomain sudo[127548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:20 np0005538513.localdomain python3.9[127550]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54124 DF PROTO=TCP SPT=51150 DPT=9105 SEQ=2386692259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F63420000000001030307) 
Nov 28 09:17:23 np0005538513.localdomain sudo[127548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:24 np0005538513.localdomain sudo[127642]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgrrtzkbmoacmarjluquqyojtaqranmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321444.0151694-302-94019252135781/AnsiballZ_command.py
Nov 28 09:17:24 np0005538513.localdomain sudo[127642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:24 np0005538513.localdomain python3.9[127644]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:17:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54125 DF PROTO=TCP SPT=51150 DPT=9105 SEQ=2386692259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F73030000000001030307) 
Nov 28 09:17:25 np0005538513.localdomain sudo[127642]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:26 np0005538513.localdomain sudo[127887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vriwkfuzfoeqtmbkjldqsidsiledthsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321445.6120543-326-246816468850905/AnsiballZ_file.py
Nov 28 09:17:26 np0005538513.localdomain sudo[127887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:26 np0005538513.localdomain python3.9[127889]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:17:26 np0005538513.localdomain sudo[127887]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:26 np0005538513.localdomain python3.9[127979]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:17:27 np0005538513.localdomain sudo[128071]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqrbptvuyrzgwacjoyqmgwnwrqarqlzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321447.4003108-380-82743766203384/AnsiballZ_dnf.py
Nov 28 09:17:27 np0005538513.localdomain sudo[128071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:27 np0005538513.localdomain python3.9[128073]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41582 DF PROTO=TCP SPT=57622 DPT=9101 SEQ=1880801936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F7E1A0000000001030307) 
Nov 28 09:17:30 np0005538513.localdomain sudo[128071]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41584 DF PROTO=TCP SPT=57622 DPT=9101 SEQ=1880801936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F8A430000000001030307) 
Nov 28 09:17:31 np0005538513.localdomain sudo[128165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilkexsftzhlzdmveahxaddopqpfaxckw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321451.2818627-404-248303326509972/AnsiballZ_dnf.py
Nov 28 09:17:31 np0005538513.localdomain sudo[128165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:31 np0005538513.localdomain python3.9[128167]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64797 DF PROTO=TCP SPT=55252 DPT=9102 SEQ=2816659087 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F93020000000001030307) 
Nov 28 09:17:34 np0005538513.localdomain sudo[128165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:35 np0005538513.localdomain sudo[128259]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzdbmvahrvcviaysxerqdsejkcxwpvvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321455.1594114-428-115671421609662/AnsiballZ_systemd.py
Nov 28 09:17:35 np0005538513.localdomain sudo[128259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:35 np0005538513.localdomain python3.9[128261]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 09:17:35 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:17:35 np0005538513.localdomain systemd-sysv-generator[128292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:17:35 np0005538513.localdomain systemd-rc-local-generator[128288]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:17:35 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:17:36 np0005538513.localdomain sudo[128259]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29988 DF PROTO=TCP SPT=38016 DPT=9100 SEQ=1368347695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F9E830000000001030307) 
Nov 28 09:17:37 np0005538513.localdomain sudo[128391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkqwaxwxxslokneeiktnhnlumeeoqtxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321457.5290809-458-269446687753278/AnsiballZ_stat.py
Nov 28 09:17:37 np0005538513.localdomain sudo[128391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:38 np0005538513.localdomain python3.9[128393]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:17:38 np0005538513.localdomain sudo[128391]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:38 np0005538513.localdomain sudo[128483]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzdvlhwasopltiuuvsrghiraioftuaiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321458.2748005-485-49208800160029/AnsiballZ_ini_file.py
Nov 28 09:17:38 np0005538513.localdomain sudo[128483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:39 np0005538513.localdomain python3.9[128485]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:39 np0005538513.localdomain sudo[128483]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33299 DF PROTO=TCP SPT=38448 DPT=9100 SEQ=3436919252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FA9820000000001030307) 
Nov 28 09:17:39 np0005538513.localdomain sudo[128577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkomdiuhvctmkcdhsayewqucehwqqsns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321459.2053952-509-61420613927582/AnsiballZ_ini_file.py
Nov 28 09:17:39 np0005538513.localdomain sudo[128577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:39 np0005538513.localdomain python3.9[128579]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:39 np0005538513.localdomain sudo[128577]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:40 np0005538513.localdomain sudo[128669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mknzxtafvxosbzckkcagxjbptxyrjfbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321459.9010916-533-230582339384781/AnsiballZ_ini_file.py
Nov 28 09:17:40 np0005538513.localdomain sudo[128669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:40 np0005538513.localdomain python3.9[128671]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:40 np0005538513.localdomain sudo[128669]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:41 np0005538513.localdomain sudo[128761]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmgbfeifnicuyysqnlbpiwesxbsbjdyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321460.8359594-563-258500151359875/AnsiballZ_stat.py
Nov 28 09:17:41 np0005538513.localdomain sudo[128761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:41 np0005538513.localdomain python3.9[128763]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:41 np0005538513.localdomain sudo[128761]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:41 np0005538513.localdomain sudo[128834]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwidlstjvnqhzhvfbrqhpycukdrebgas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321460.8359594-563-258500151359875/AnsiballZ_copy.py
Nov 28 09:17:41 np0005538513.localdomain sudo[128834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:41 np0005538513.localdomain python3.9[128836]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321460.8359594-563-258500151359875/.source _original_basename=.69igfolw follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:41 np0005538513.localdomain sudo[128834]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29990 DF PROTO=TCP SPT=38016 DPT=9100 SEQ=1368347695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FB6420000000001030307) 
Nov 28 09:17:42 np0005538513.localdomain sudo[128926]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txfsdhkgwvnwrzuphsbmshqcizndzfdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321462.1553705-608-90414573505519/AnsiballZ_file.py
Nov 28 09:17:42 np0005538513.localdomain sudo[128926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:42 np0005538513.localdomain python3.9[128928]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:42 np0005538513.localdomain sudo[128926]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:43 np0005538513.localdomain sudo[129018]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svyhafjgvnmspuwtnfgabtwfpkpjgvho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321462.8700142-632-104795901709981/AnsiballZ_edpm_os_net_config_mappings.py
Nov 28 09:17:43 np0005538513.localdomain sudo[129018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:43 np0005538513.localdomain python3.9[129020]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 28 09:17:43 np0005538513.localdomain sudo[129018]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:44 np0005538513.localdomain sudo[129110]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzlghktxgxzvqpgvgnxijoymecxzrlqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321463.908719-659-108087040513869/AnsiballZ_file.py
Nov 28 09:17:44 np0005538513.localdomain sudo[129110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:44 np0005538513.localdomain python3.9[129112]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:44 np0005538513.localdomain sudo[129110]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:44 np0005538513.localdomain sudo[129202]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uiqpqbiepwnnfpzzwtnrousaxzblcazq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321464.721-689-136042312256638/AnsiballZ_stat.py
Nov 28 09:17:44 np0005538513.localdomain sudo[129202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:45 np0005538513.localdomain python3.9[129204]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:45 np0005538513.localdomain sudo[129202]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:45 np0005538513.localdomain sudo[129275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fheulmfaolzemizghzpvzsxexupaihfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321464.721-689-136042312256638/AnsiballZ_copy.py
Nov 28 09:17:45 np0005538513.localdomain sudo[129275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:45 np0005538513.localdomain python3.9[129277]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321464.721-689-136042312256638/.source.yaml _original_basename=.w35bxh1u follow=False checksum=0cadac3cfc033a4e07cfac59b43f6459e787700a force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:45 np0005538513.localdomain sudo[129275]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:46 np0005538513.localdomain sudo[129367]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwwxyyxrywandoxheftfycqwyvxywcxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321466.0295477-734-23944972146421/AnsiballZ_slurp.py
Nov 28 09:17:46 np0005538513.localdomain sudo[129367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:46 np0005538513.localdomain python3.9[129369]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 28 09:17:46 np0005538513.localdomain sudo[129367]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22453 DF PROTO=TCP SPT=44742 DPT=9882 SEQ=4064839962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FC8C20000000001030307) 
Nov 28 09:17:47 np0005538513.localdomain sudo[129472]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puenroltqkleenbkivixinpbxdgmwvys ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321467.0727656-761-150858680704529/async_wrapper.py j621656087058 300 /home/zuul/.ansible/tmp/ansible-tmp-1764321467.0727656-761-150858680704529/AnsiballZ_edpm_os_net_config.py _
Nov 28 09:17:47 np0005538513.localdomain sudo[129472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:47 np0005538513.localdomain ansible-async_wrapper.py[129474]: Invoked with j621656087058 300 /home/zuul/.ansible/tmp/ansible-tmp-1764321467.0727656-761-150858680704529/AnsiballZ_edpm_os_net_config.py _
Nov 28 09:17:47 np0005538513.localdomain ansible-async_wrapper.py[129477]: Starting module and watcher
Nov 28 09:17:47 np0005538513.localdomain ansible-async_wrapper.py[129477]: Start watching 129478 (300)
Nov 28 09:17:47 np0005538513.localdomain ansible-async_wrapper.py[129478]: Start module (129478)
Nov 28 09:17:47 np0005538513.localdomain ansible-async_wrapper.py[129474]: Return async_wrapper task started.
Nov 28 09:17:47 np0005538513.localdomain sudo[129472]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:48 np0005538513.localdomain python3.9[129479]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Nov 28 09:17:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49278 DF PROTO=TCP SPT=56494 DPT=9105 SEQ=4039797680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FCC5B0000000001030307) 
Nov 28 09:17:48 np0005538513.localdomain ansible-async_wrapper.py[129478]: Module complete (129478)
Nov 28 09:17:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49280 DF PROTO=TCP SPT=56494 DPT=9105 SEQ=4039797680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FD8820000000001030307) 
Nov 28 09:17:51 np0005538513.localdomain sudo[129569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtchfelqodnowxmmjytiajwwkjritbcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321471.0030575-761-27149893867116/AnsiballZ_async_status.py
Nov 28 09:17:51 np0005538513.localdomain sudo[129569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:51 np0005538513.localdomain python3.9[129571]: ansible-ansible.legacy.async_status Invoked with jid=j621656087058.129474 mode=status _async_dir=/root/.ansible_async
Nov 28 09:17:51 np0005538513.localdomain sudo[129569]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:51 np0005538513.localdomain sudo[129628]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usaknysmdamhnytdyrtpqwhfamlnndra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321471.0030575-761-27149893867116/AnsiballZ_async_status.py
Nov 28 09:17:51 np0005538513.localdomain sudo[129628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:52 np0005538513.localdomain python3.9[129630]: ansible-ansible.legacy.async_status Invoked with jid=j621656087058.129474 mode=cleanup _async_dir=/root/.ansible_async
Nov 28 09:17:52 np0005538513.localdomain sudo[129628]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:52 np0005538513.localdomain sudo[129720]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuusrfldcicxboocjykpfsexwgoavxoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321472.3655045-827-274069601578939/AnsiballZ_stat.py
Nov 28 09:17:52 np0005538513.localdomain sudo[129720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:52 np0005538513.localdomain python3.9[129722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:52 np0005538513.localdomain sudo[129720]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:52 np0005538513.localdomain ansible-async_wrapper.py[129477]: Done in kid B.
Nov 28 09:17:53 np0005538513.localdomain sudo[129793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyputgyajgvwpaqxcmgomsfgteuhkmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321472.3655045-827-274069601578939/AnsiballZ_copy.py
Nov 28 09:17:53 np0005538513.localdomain sudo[129793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:53 np0005538513.localdomain python3.9[129795]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321472.3655045-827-274069601578939/.source.returncode _original_basename=.eqmqbu20 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:53 np0005538513.localdomain sudo[129793]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:53 np0005538513.localdomain sudo[129885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqbgvnprjfsddkjalkpspduzdrvturzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321473.5790632-875-17894140427829/AnsiballZ_stat.py
Nov 28 09:17:53 np0005538513.localdomain sudo[129885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:54 np0005538513.localdomain python3.9[129887]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:54 np0005538513.localdomain sudo[129885]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:54 np0005538513.localdomain sudo[129958]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfdexmvzcytdwtiwaxwnririlhjlupzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321473.5790632-875-17894140427829/AnsiballZ_copy.py
Nov 28 09:17:54 np0005538513.localdomain sudo[129958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:54 np0005538513.localdomain python3.9[129960]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321473.5790632-875-17894140427829/.source.cfg _original_basename=.phx59nse follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:54 np0005538513.localdomain sudo[129958]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:55 np0005538513.localdomain sudo[130050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovjjepqbiwaewdsirsipljydiqcncwyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321474.790834-920-261131409310751/AnsiballZ_systemd.py
Nov 28 09:17:55 np0005538513.localdomain sudo[130050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49281 DF PROTO=TCP SPT=56494 DPT=9105 SEQ=4039797680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FE8420000000001030307) 
Nov 28 09:17:55 np0005538513.localdomain python3.9[130052]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:17:56 np0005538513.localdomain systemd[1]: Reloading Network Manager...
Nov 28 09:17:56 np0005538513.localdomain NetworkManager[5967]: <info>  [1764321476.4809] audit: op="reload" arg="0" pid=130056 uid=0 result="success"
Nov 28 09:17:56 np0005538513.localdomain NetworkManager[5967]: <info>  [1764321476.4824] config: signal: SIGHUP (no changes from disk)
Nov 28 09:17:56 np0005538513.localdomain systemd[1]: Reloaded Network Manager.
Nov 28 09:17:56 np0005538513.localdomain sudo[130050]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:56 np0005538513.localdomain sshd[126187]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:17:56 np0005538513.localdomain systemd-logind[764]: Session 40 logged out. Waiting for processes to exit.
Nov 28 09:17:56 np0005538513.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Nov 28 09:17:56 np0005538513.localdomain systemd[1]: session-40.scope: Consumed 34.968s CPU time.
Nov 28 09:17:56 np0005538513.localdomain systemd-logind[764]: Removed session 40.
Nov 28 09:17:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22179 DF PROTO=TCP SPT=51420 DPT=9101 SEQ=1333309435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FF3490000000001030307) 
Nov 28 09:18:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22181 DF PROTO=TCP SPT=51420 DPT=9101 SEQ=1333309435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FFF420000000001030307) 
Nov 28 09:18:02 np0005538513.localdomain sshd[130071]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:18:02 np0005538513.localdomain sshd[130071]: Accepted publickey for zuul from 192.168.122.31 port 50114 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:18:02 np0005538513.localdomain systemd-logind[764]: New session 41 of user zuul.
Nov 28 09:18:02 np0005538513.localdomain systemd[1]: Started Session 41 of User zuul.
Nov 28 09:18:02 np0005538513.localdomain sshd[130071]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:18:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38310 DF PROTO=TCP SPT=54804 DPT=9102 SEQ=1841594550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1008420000000001030307) 
Nov 28 09:18:03 np0005538513.localdomain python3.9[130164]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:04 np0005538513.localdomain python3.9[130258]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:06 np0005538513.localdomain python3.9[130411]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:18:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1101 DF PROTO=TCP SPT=60774 DPT=9102 SEQ=635022861 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1013830000000001030307) 
Nov 28 09:18:06 np0005538513.localdomain sshd[130071]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:18:06 np0005538513.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Nov 28 09:18:06 np0005538513.localdomain systemd[1]: session-41.scope: Consumed 2.074s CPU time.
Nov 28 09:18:06 np0005538513.localdomain systemd-logind[764]: Session 41 logged out. Waiting for processes to exit.
Nov 28 09:18:06 np0005538513.localdomain systemd-logind[764]: Removed session 41.
Nov 28 09:18:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55544 DF PROTO=TCP SPT=42570 DPT=9100 SEQ=476428184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB101F820000000001030307) 
Nov 28 09:18:11 np0005538513.localdomain sshd[130427]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:18:12 np0005538513.localdomain sshd[130427]: Accepted publickey for zuul from 192.168.122.31 port 49968 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:18:12 np0005538513.localdomain systemd-logind[764]: New session 42 of user zuul.
Nov 28 09:18:12 np0005538513.localdomain systemd[1]: Started Session 42 of User zuul.
Nov 28 09:18:12 np0005538513.localdomain sshd[130427]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:18:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36527 DF PROTO=TCP SPT=37356 DPT=9100 SEQ=3976809844 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB102B820000000001030307) 
Nov 28 09:18:12 np0005538513.localdomain sudo[130476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:18:12 np0005538513.localdomain sudo[130476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:12 np0005538513.localdomain sudo[130476]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:12 np0005538513.localdomain sudo[130505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:18:12 np0005538513.localdomain sudo[130505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:13 np0005538513.localdomain python3.9[130550]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:13 np0005538513.localdomain sudo[130505]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:13 np0005538513.localdomain sudo[130660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:18:13 np0005538513.localdomain sudo[130660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:13 np0005538513.localdomain sudo[130660]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:13 np0005538513.localdomain sudo[130691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:18:13 np0005538513.localdomain sudo[130691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:13 np0005538513.localdomain python3.9[130688]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:14 np0005538513.localdomain sudo[130691]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:14 np0005538513.localdomain sudo[130818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjvvrdzxgmbqltgiiipookvexfbfrgdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321494.4252257-80-236570815933488/AnsiballZ_setup.py
Nov 28 09:18:14 np0005538513.localdomain sudo[130818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:15 np0005538513.localdomain python3.9[130820]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:15 np0005538513.localdomain sudo[130818]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:15 np0005538513.localdomain sudo[130872]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkwygfqcgsryblmfnemolqbjezcwflvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321494.4252257-80-236570815933488/AnsiballZ_dnf.py
Nov 28 09:18:15 np0005538513.localdomain sudo[130872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:15 np0005538513.localdomain python3.9[130874]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:18:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57216 DF PROTO=TCP SPT=41028 DPT=9882 SEQ=160539914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB103E020000000001030307) 
Nov 28 09:18:17 np0005538513.localdomain sudo[130877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:18:17 np0005538513.localdomain sudo[130877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:17 np0005538513.localdomain sudo[130877]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:18 np0005538513.localdomain sudo[130872]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11109 DF PROTO=TCP SPT=43810 DPT=9105 SEQ=2250170345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1045830000000001030307) 
Nov 28 09:18:19 np0005538513.localdomain sudo[130981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqipxtlpgrwtbacibifqvjldwuvnevok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321499.307913-116-270675841837769/AnsiballZ_setup.py
Nov 28 09:18:19 np0005538513.localdomain sudo[130981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:19 np0005538513.localdomain python3.9[130983]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:20 np0005538513.localdomain sudo[130981]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:20 np0005538513.localdomain sudo[131136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dacnlskxkzjdwtcdkvgyqwhbflghwygv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321500.572104-149-245837162225896/AnsiballZ_file.py
Nov 28 09:18:20 np0005538513.localdomain sudo[131136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11110 DF PROTO=TCP SPT=43810 DPT=9105 SEQ=2250170345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB104D820000000001030307) 
Nov 28 09:18:21 np0005538513.localdomain python3.9[131138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:21 np0005538513.localdomain sudo[131136]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:21 np0005538513.localdomain sudo[131228]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llkefhezhypvchhpqjqpfcrunyaucdnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321501.3916287-173-119862418578109/AnsiballZ_command.py
Nov 28 09:18:21 np0005538513.localdomain sudo[131228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:22 np0005538513.localdomain python3.9[131230]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:18:22 np0005538513.localdomain sudo[131228]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:22 np0005538513.localdomain sudo[131333]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywpjkvierulticmcqwidrudqpywlwdgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321502.2713795-197-247100811596024/AnsiballZ_stat.py
Nov 28 09:18:22 np0005538513.localdomain sudo[131333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:22 np0005538513.localdomain python3.9[131335]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:22 np0005538513.localdomain sudo[131333]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:23 np0005538513.localdomain sudo[131381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugnboujvhtdlgjjtpgzmihyabintxgal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321502.2713795-197-247100811596024/AnsiballZ_file.py
Nov 28 09:18:23 np0005538513.localdomain sudo[131381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:23 np0005538513.localdomain python3.9[131383]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:23 np0005538513.localdomain sudo[131381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:23 np0005538513.localdomain sudo[131473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrinkatkvrtjjjhktdesgxxwczsipsir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321503.5280254-233-246298421959352/AnsiballZ_stat.py
Nov 28 09:18:23 np0005538513.localdomain sudo[131473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:24 np0005538513.localdomain python3.9[131475]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:24 np0005538513.localdomain sudo[131473]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:24 np0005538513.localdomain sudo[131521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hihgoihcwaorjbnzscqozuwvonqkyqpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321503.5280254-233-246298421959352/AnsiballZ_file.py
Nov 28 09:18:24 np0005538513.localdomain sudo[131521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:24 np0005538513.localdomain python3.9[131523]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:24 np0005538513.localdomain sudo[131521]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11111 DF PROTO=TCP SPT=43810 DPT=9105 SEQ=2250170345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB105D430000000001030307) 
Nov 28 09:18:25 np0005538513.localdomain sudo[131613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvbrqampqixxqyaeotqmvcohhbwnuxcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321504.856383-272-65574180233441/AnsiballZ_ini_file.py
Nov 28 09:18:25 np0005538513.localdomain sudo[131613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:25 np0005538513.localdomain python3.9[131615]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:25 np0005538513.localdomain sudo[131613]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:25 np0005538513.localdomain sudo[131705]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgeuzdzqganjtcemrapmwfdrmxqlywlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321505.665644-272-150684182864787/AnsiballZ_ini_file.py
Nov 28 09:18:25 np0005538513.localdomain sudo[131705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:26 np0005538513.localdomain python3.9[131707]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:26 np0005538513.localdomain sudo[131705]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:26 np0005538513.localdomain sudo[131797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnagiwguhgjlvydnkxkxybounyibthhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321506.265817-272-178786110601254/AnsiballZ_ini_file.py
Nov 28 09:18:26 np0005538513.localdomain sudo[131797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:26 np0005538513.localdomain python3.9[131799]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:26 np0005538513.localdomain sudo[131797]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:27 np0005538513.localdomain sudo[131889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gugeqhhcizsbruhdzdbiehagxjieuhbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321506.8900356-272-32086338681458/AnsiballZ_ini_file.py
Nov 28 09:18:27 np0005538513.localdomain sudo[131889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:27 np0005538513.localdomain python3.9[131891]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:27 np0005538513.localdomain sudo[131889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:27 np0005538513.localdomain sudo[131981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjpgrotnpwxkluflwivwrjhykcvsryjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321507.6698463-365-117498056098764/AnsiballZ_dnf.py
Nov 28 09:18:27 np0005538513.localdomain sudo[131981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55113 DF PROTO=TCP SPT=56848 DPT=9101 SEQ=2674059226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1068790000000001030307) 
Nov 28 09:18:28 np0005538513.localdomain python3.9[131983]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:18:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55115 DF PROTO=TCP SPT=56848 DPT=9101 SEQ=2674059226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1074820000000001030307) 
Nov 28 09:18:31 np0005538513.localdomain sudo[131981]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:31 np0005538513.localdomain sshd[132000]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:18:31 np0005538513.localdomain sshd[132008]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:18:32 np0005538513.localdomain sshd[132008]: error: kex_exchange_identification: read: Connection reset by peer
Nov 28 09:18:32 np0005538513.localdomain sshd[132008]: Connection reset by 45.140.17.97 port 31006
Nov 28 09:18:32 np0005538513.localdomain sudo[132077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iniucktsushnvepfkedptsloyurhioyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321511.9336526-398-153006595507972/AnsiballZ_setup.py
Nov 28 09:18:32 np0005538513.localdomain sudo[132077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:32 np0005538513.localdomain python3.9[132079]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:32 np0005538513.localdomain sudo[132077]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59513 DF PROTO=TCP SPT=39556 DPT=9102 SEQ=1558159202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB107D830000000001030307) 
Nov 28 09:18:33 np0005538513.localdomain sudo[132171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-topnpbbrtntyrlfliwnorqnlkirwiufg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321513.1738617-422-87991495855255/AnsiballZ_stat.py
Nov 28 09:18:33 np0005538513.localdomain sudo[132171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:33 np0005538513.localdomain python3.9[132173]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:18:33 np0005538513.localdomain sudo[132171]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:34 np0005538513.localdomain sudo[132263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjotwkzmtslqdbfgampbuxeihcsbnxnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321513.9428809-449-209081055090976/AnsiballZ_stat.py
Nov 28 09:18:34 np0005538513.localdomain sudo[132263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:34 np0005538513.localdomain python3.9[132265]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:18:34 np0005538513.localdomain sudo[132263]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:18:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:18:35 np0005538513.localdomain sudo[132355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdhejdqsxrrjpqeacsuwjwtcxenhdxmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321515.4789898-479-109809858136067/AnsiballZ_command.py
Nov 28 09:18:35 np0005538513.localdomain sudo[132355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:35 np0005538513.localdomain python3.9[132357]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:18:35 np0005538513.localdomain sudo[132355]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41586 DF PROTO=TCP SPT=39660 DPT=9100 SEQ=739763758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1089020000000001030307) 
Nov 28 09:18:36 np0005538513.localdomain sudo[132448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncvpvehrqwzlgavrepdpdvtbdosqkfjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321516.261907-509-233775773767964/AnsiballZ_service_facts.py
Nov 28 09:18:36 np0005538513.localdomain sudo[132448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:36 np0005538513.localdomain python3.9[132450]: ansible-service_facts Invoked
Nov 28 09:18:36 np0005538513.localdomain network[132467]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:18:36 np0005538513.localdomain network[132468]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:18:36 np0005538513.localdomain network[132469]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:18:37 np0005538513.localdomain sshd[132486]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:18:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:18:38 np0005538513.localdomain sshd[132486]: Invalid user solana from 80.94.92.182 port 44544
Nov 28 09:18:39 np0005538513.localdomain sshd[132486]: Connection closed by invalid user solana 80.94.92.182 port 44544 [preauth]
Nov 28 09:18:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59515 DF PROTO=TCP SPT=39556 DPT=9102 SEQ=1558159202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1095420000000001030307) 
Nov 28 09:18:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:18:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.3 total, 600.0 interval
                                                          Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:18:39 np0005538513.localdomain sudo[132448]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41588 DF PROTO=TCP SPT=39660 DPT=9100 SEQ=739763758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10A0C30000000001030307) 
Nov 28 09:18:44 np0005538513.localdomain sudo[132683]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xflbwjpofwzhzacqeerpsmjlahilljxp ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764321524.3909152-554-41146985495550/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764321524.3909152-554-41146985495550/args
Nov 28 09:18:44 np0005538513.localdomain sudo[132683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:44 np0005538513.localdomain sudo[132683]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:45 np0005538513.localdomain sudo[132790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whgslcdmvdryefjrzzvmupzzeebsxonl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321525.0793743-587-51120481953644/AnsiballZ_dnf.py
Nov 28 09:18:45 np0005538513.localdomain sudo[132790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:45 np0005538513.localdomain python3.9[132792]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:18:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14866 DF PROTO=TCP SPT=33380 DPT=9882 SEQ=1888493366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10B3420000000001030307) 
Nov 28 09:18:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34761 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10B6BB0000000001030307) 
Nov 28 09:18:48 np0005538513.localdomain sudo[132790]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:49 np0005538513.localdomain sudo[132884]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wynpuhlawtzffwawtmbdmvgcnfadlelz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321529.2075417-626-102333261291482/AnsiballZ_package_facts.py
Nov 28 09:18:49 np0005538513.localdomain sudo[132884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:50 np0005538513.localdomain python3.9[132886]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 28 09:18:50 np0005538513.localdomain sudo[132884]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34763 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10C2C30000000001030307) 
Nov 28 09:18:51 np0005538513.localdomain sudo[132976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkleqytcribyoyuryhvatggofjizspck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321531.035931-656-190740213154570/AnsiballZ_stat.py
Nov 28 09:18:51 np0005538513.localdomain sudo[132976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:51 np0005538513.localdomain python3.9[132978]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:51 np0005538513.localdomain sudo[132976]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:52 np0005538513.localdomain sudo[133051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brgirxglmgsptbrjrmgeitmcwycnipuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321531.035931-656-190740213154570/AnsiballZ_copy.py
Nov 28 09:18:52 np0005538513.localdomain sudo[133051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:52 np0005538513.localdomain python3.9[133053]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321531.035931-656-190740213154570/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:52 np0005538513.localdomain sudo[133051]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:52 np0005538513.localdomain sudo[133145]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojtupprifmbcgnzekmfrvyaybskjryxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321532.5170507-701-187470642978333/AnsiballZ_stat.py
Nov 28 09:18:52 np0005538513.localdomain sudo[133145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:52 np0005538513.localdomain python3.9[133147]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:53 np0005538513.localdomain sudo[133145]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:53 np0005538513.localdomain sudo[133220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avfhvxkdopvoneowqdmoadqegwxtexif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321532.5170507-701-187470642978333/AnsiballZ_copy.py
Nov 28 09:18:53 np0005538513.localdomain sudo[133220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:53 np0005538513.localdomain python3.9[133222]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321532.5170507-701-187470642978333/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:53 np0005538513.localdomain sudo[133220]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:55 np0005538513.localdomain sudo[133314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofylzihjkbhspsegmgnlsrviukqmabfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321534.5790005-764-134778841789411/AnsiballZ_lineinfile.py
Nov 28 09:18:55 np0005538513.localdomain sudo[133314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34764 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10D2820000000001030307) 
Nov 28 09:18:55 np0005538513.localdomain python3.9[133316]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:55 np0005538513.localdomain sudo[133314]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:56 np0005538513.localdomain sudo[133408]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqxumcwuptufjdgejiitobiwxrxbywxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321536.2720404-809-31772295433592/AnsiballZ_setup.py
Nov 28 09:18:56 np0005538513.localdomain sudo[133408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:56 np0005538513.localdomain python3.9[133410]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:57 np0005538513.localdomain sudo[133408]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:57 np0005538513.localdomain sudo[133462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xquafoiscmotkakczcvrbffxqldmwtoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321536.2720404-809-31772295433592/AnsiballZ_systemd.py
Nov 28 09:18:57 np0005538513.localdomain sudo[133462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1090 DF PROTO=TCP SPT=40788 DPT=9101 SEQ=3176365671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10DDA90000000001030307) 
Nov 28 09:18:58 np0005538513.localdomain python3.9[133464]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:18:58 np0005538513.localdomain sudo[133462]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:00 np0005538513.localdomain sudo[133556]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwhiioojxtjnukvxoahlefcwiaqeenpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321539.930079-857-209430485598052/AnsiballZ_setup.py
Nov 28 09:19:00 np0005538513.localdomain sudo[133556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:00 np0005538513.localdomain python3.9[133558]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:19:00 np0005538513.localdomain sudo[133556]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:01 np0005538513.localdomain sudo[133610]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysxdlmxecsilviuefqyytvvskpmfrouz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321539.930079-857-209430485598052/AnsiballZ_systemd.py
Nov 28 09:19:01 np0005538513.localdomain sudo[133610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:01 np0005538513.localdomain auditd[725]: Audit daemon rotating log files
Nov 28 09:19:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1092 DF PROTO=TCP SPT=40788 DPT=9101 SEQ=3176365671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10E9C30000000001030307) 
Nov 28 09:19:01 np0005538513.localdomain python3.9[133612]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:19:01 np0005538513.localdomain chronyd[26085]: chronyd exiting
Nov 28 09:19:01 np0005538513.localdomain systemd[1]: Stopping NTP client/server...
Nov 28 09:19:01 np0005538513.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 09:19:01 np0005538513.localdomain systemd[1]: Stopped NTP client/server.
Nov 28 09:19:01 np0005538513.localdomain systemd[1]: Starting NTP client/server...
Nov 28 09:19:01 np0005538513.localdomain chronyd[133620]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 09:19:01 np0005538513.localdomain chronyd[133620]: Frequency -30.844 +/- 0.497 ppm read from /var/lib/chrony/drift
Nov 28 09:19:01 np0005538513.localdomain chronyd[133620]: Loaded seccomp filter (level 2)
Nov 28 09:19:01 np0005538513.localdomain systemd[1]: Started NTP client/server.
Nov 28 09:19:01 np0005538513.localdomain sudo[133610]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:01 np0005538513.localdomain sshd[130427]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:19:01 np0005538513.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Nov 28 09:19:01 np0005538513.localdomain systemd[1]: session-42.scope: Consumed 28.330s CPU time.
Nov 28 09:19:01 np0005538513.localdomain systemd-logind[764]: Session 42 logged out. Waiting for processes to exit.
Nov 28 09:19:01 np0005538513.localdomain systemd-logind[764]: Removed session 42.
Nov 28 09:19:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3513 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2610679703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10F2820000000001030307) 
Nov 28 09:19:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38315 DF PROTO=TCP SPT=54804 DPT=9102 SEQ=1841594550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10FD830000000001030307) 
Nov 28 09:19:07 np0005538513.localdomain sshd[133636]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:19:07 np0005538513.localdomain sshd[133636]: Accepted publickey for zuul from 192.168.122.31 port 34412 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:19:07 np0005538513.localdomain systemd-logind[764]: New session 43 of user zuul.
Nov 28 09:19:07 np0005538513.localdomain systemd[1]: Started Session 43 of User zuul.
Nov 28 09:19:07 np0005538513.localdomain sshd[133636]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:19:08 np0005538513.localdomain python3.9[133729]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:19:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36530 DF PROTO=TCP SPT=37356 DPT=9100 SEQ=3976809844 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1109820000000001030307) 
Nov 28 09:19:09 np0005538513.localdomain sudo[133823]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbtgkztbjidokciwfarsaiscfmjsnioi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321549.4193535-59-18636669091631/AnsiballZ_file.py
Nov 28 09:19:09 np0005538513.localdomain sudo[133823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:10 np0005538513.localdomain python3.9[133825]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:10 np0005538513.localdomain sudo[133823]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:10 np0005538513.localdomain sudo[133928]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzljihjskdtpxnpdkujxkvnaqdqtnplm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321550.2437453-83-109241453236902/AnsiballZ_stat.py
Nov 28 09:19:10 np0005538513.localdomain sudo[133928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:10 np0005538513.localdomain python3.9[133930]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:10 np0005538513.localdomain sudo[133928]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:11 np0005538513.localdomain sudo[133976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phmbuanwfzhofezszfvtzkcouiufbhep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321550.2437453-83-109241453236902/AnsiballZ_file.py
Nov 28 09:19:11 np0005538513.localdomain sudo[133976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:11 np0005538513.localdomain python3.9[133978]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.2j_vxb6s recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:11 np0005538513.localdomain sudo[133976]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:12 np0005538513.localdomain sudo[134068]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfimrpnhjiwpqhblokcplcrynbfbtsxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321552.1577518-143-233066384468499/AnsiballZ_stat.py
Nov 28 09:19:12 np0005538513.localdomain sudo[134068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11548 DF PROTO=TCP SPT=60450 DPT=9100 SEQ=4174456282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1116020000000001030307) 
Nov 28 09:19:12 np0005538513.localdomain python3.9[134070]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:12 np0005538513.localdomain sudo[134068]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:13 np0005538513.localdomain sudo[134143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlfiytynetkssopxjbrzhshayfcwgmvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321552.1577518-143-233066384468499/AnsiballZ_copy.py
Nov 28 09:19:13 np0005538513.localdomain sudo[134143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:13 np0005538513.localdomain python3.9[134145]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321552.1577518-143-233066384468499/.source _original_basename=.86_hfc7z follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:13 np0005538513.localdomain sudo[134143]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:13 np0005538513.localdomain sudo[134235]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-munjomeqfxeofprvihzgkfatsiarkcke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321553.6342032-191-236433043843322/AnsiballZ_file.py
Nov 28 09:19:13 np0005538513.localdomain sudo[134235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:14 np0005538513.localdomain python3.9[134237]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:19:14 np0005538513.localdomain sudo[134235]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:14 np0005538513.localdomain sudo[134327]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sknbmtoayacntdphwbjevumdqkfjpmhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321554.3577225-215-101002390405577/AnsiballZ_stat.py
Nov 28 09:19:14 np0005538513.localdomain sudo[134327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:14 np0005538513.localdomain python3.9[134329]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:14 np0005538513.localdomain sudo[134327]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:15 np0005538513.localdomain sudo[134400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ieipbjfxjvedlovqulchuekcfevqbkgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321554.3577225-215-101002390405577/AnsiballZ_copy.py
Nov 28 09:19:15 np0005538513.localdomain sudo[134400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:15 np0005538513.localdomain python3.9[134402]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321554.3577225-215-101002390405577/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:19:15 np0005538513.localdomain sudo[134400]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:15 np0005538513.localdomain sudo[134492]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knkbytqvcwadytbnxxgkkmqzoonexxjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321555.5790176-215-65413931635646/AnsiballZ_stat.py
Nov 28 09:19:15 np0005538513.localdomain sudo[134492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:16 np0005538513.localdomain python3.9[134494]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:16 np0005538513.localdomain sudo[134492]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:16 np0005538513.localdomain sudo[134565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbqumompfgrehaxijbswxhmfvaasrmpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321555.5790176-215-65413931635646/AnsiballZ_copy.py
Nov 28 09:19:16 np0005538513.localdomain sudo[134565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:16 np0005538513.localdomain python3.9[134567]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321555.5790176-215-65413931635646/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:19:16 np0005538513.localdomain sudo[134565]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:17 np0005538513.localdomain sudo[134657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yitwjfcbxmsapcvviglfzhmuodfhsqep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321556.7500916-302-275740966703067/AnsiballZ_file.py
Nov 28 09:19:17 np0005538513.localdomain sudo[134657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50708 DF PROTO=TCP SPT=53558 DPT=9882 SEQ=4101534268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1128820000000001030307) 
Nov 28 09:19:17 np0005538513.localdomain python3.9[134659]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:17 np0005538513.localdomain sudo[134657]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:17 np0005538513.localdomain sudo[134674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:19:17 np0005538513.localdomain sudo[134674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:17 np0005538513.localdomain sudo[134674]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:17 np0005538513.localdomain sudo[134689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:19:17 np0005538513.localdomain sudo[134689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:18 np0005538513.localdomain sudo[134800]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brrfxlcisybpxaixqqoycgbvmenaxmnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321557.631082-326-17041010042634/AnsiballZ_stat.py
Nov 28 09:19:18 np0005538513.localdomain sudo[134800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:18 np0005538513.localdomain sudo[134689]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:18 np0005538513.localdomain python3.9[134807]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:18 np0005538513.localdomain sudo[134800]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:18 np0005538513.localdomain sudo[134846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:19:18 np0005538513.localdomain sudo[134846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:18 np0005538513.localdomain sudo[134846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:18 np0005538513.localdomain sudo[134881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 09:19:18 np0005538513.localdomain sudo[134881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:18 np0005538513.localdomain sudo[134914]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfzuxqprxbqvjbstmghzsctazpngpojl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321557.631082-326-17041010042634/AnsiballZ_copy.py
Nov 28 09:19:18 np0005538513.localdomain sudo[134914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:18 np0005538513.localdomain python3.9[134917]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321557.631082-326-17041010042634/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:18 np0005538513.localdomain sudo[134914]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29552 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1130030000000001030307) 
Nov 28 09:19:19 np0005538513.localdomain podman[135017]: 
Nov 28 09:19:19 np0005538513.localdomain podman[135017]: 2025-11-28 09:19:19.298149209 +0000 UTC m=+0.083396103 container create 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:19:19 np0005538513.localdomain systemd[1]: Started libpod-conmon-2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67.scope.
Nov 28 09:19:19 np0005538513.localdomain podman[135017]: 2025-11-28 09:19:19.262269379 +0000 UTC m=+0.047516303 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:19:19 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:19:19 np0005538513.localdomain podman[135017]: 2025-11-28 09:19:19.380178608 +0000 UTC m=+0.165425492 container init 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, ceph=True, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=)
Nov 28 09:19:19 np0005538513.localdomain sudo[135062]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrhydcnbbstyybdkarksaepmjtpiumkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321559.0707934-371-1603773660681/AnsiballZ_stat.py
Nov 28 09:19:19 np0005538513.localdomain sudo[135062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:19 np0005538513.localdomain podman[135017]: 2025-11-28 09:19:19.39210484 +0000 UTC m=+0.177351724 container start 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, distribution-scope=public, GIT_BRANCH=main, name=rhceph, version=7, architecture=x86_64, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, release=553, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Nov 28 09:19:19 np0005538513.localdomain podman[135017]: 2025-11-28 09:19:19.392537233 +0000 UTC m=+0.177784107 container attach 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=)
Nov 28 09:19:19 np0005538513.localdomain optimistic_snyder[135063]: 167 167
Nov 28 09:19:19 np0005538513.localdomain systemd[1]: libpod-2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67.scope: Deactivated successfully.
Nov 28 09:19:19 np0005538513.localdomain podman[135017]: 2025-11-28 09:19:19.395338271 +0000 UTC m=+0.180585175 container died 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, vcs-type=git)
Nov 28 09:19:19 np0005538513.localdomain podman[135070]: 2025-11-28 09:19:19.506002194 +0000 UTC m=+0.093442286 container remove 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:19:19 np0005538513.localdomain systemd[1]: libpod-conmon-2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67.scope: Deactivated successfully.
Nov 28 09:19:19 np0005538513.localdomain python3.9[135068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:19 np0005538513.localdomain sudo[135062]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:19 np0005538513.localdomain podman[135102]: 
Nov 28 09:19:19 np0005538513.localdomain podman[135102]: 2025-11-28 09:19:19.731520339 +0000 UTC m=+0.073019039 container create c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=)
Nov 28 09:19:19 np0005538513.localdomain systemd[1]: Started libpod-conmon-c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79.scope.
Nov 28 09:19:19 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:19:19 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 09:19:19 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:19:19 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:19:19 np0005538513.localdomain podman[135102]: 2025-11-28 09:19:19.799594564 +0000 UTC m=+0.141093314 container init c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.expose-services=)
Nov 28 09:19:19 np0005538513.localdomain podman[135102]: 2025-11-28 09:19:19.704958771 +0000 UTC m=+0.046457511 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:19:19 np0005538513.localdomain podman[135102]: 2025-11-28 09:19:19.810888086 +0000 UTC m=+0.152386786 container start c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, release=553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Nov 28 09:19:19 np0005538513.localdomain podman[135102]: 2025-11-28 09:19:19.811178525 +0000 UTC m=+0.152677225 container attach c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, version=7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, io.buildah.version=1.33.12, release=553, io.openshift.expose-services=)
Nov 28 09:19:19 np0005538513.localdomain sudo[135183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igxewswwupxsqoqcjykzafadegtsasca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321559.0707934-371-1603773660681/AnsiballZ_copy.py
Nov 28 09:19:19 np0005538513.localdomain sudo[135183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:20 np0005538513.localdomain python3.9[135185]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321559.0707934-371-1603773660681/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:20 np0005538513.localdomain sudo[135183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d9969c749ffddba729ff682fdacf5b8212a9e2f3075e8d8892007d3e32ddb2df-merged.mount: Deactivated successfully.
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]: [
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:     {
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "available": false,
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "ceph_device": false,
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "lsm_data": {},
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "lvs": [],
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "path": "/dev/sr0",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "rejected_reasons": [
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "Has a FileSystem",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "Insufficient space (<5GB)"
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         ],
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         "sys_api": {
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "actuators": null,
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "device_nodes": "sr0",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "human_readable_size": "482.00 KB",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "id_bus": "ata",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "model": "QEMU DVD-ROM",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "nr_requests": "2",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "partitions": {},
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "path": "/dev/sr0",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "removable": "1",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "rev": "2.5+",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "ro": "0",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "rotational": "1",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "sas_address": "",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "sas_device_handle": "",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "scheduler_mode": "mq-deadline",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "sectors": 0,
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "sectorsize": "2048",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "size": 493568.0,
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "support_discard": "0",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "type": "disk",
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:             "vendor": "QEMU"
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:         }
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]:     }
Nov 28 09:19:20 np0005538513.localdomain friendly_hamilton[135148]: ]
Nov 28 09:19:20 np0005538513.localdomain systemd[1]: libpod-c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79.scope: Deactivated successfully.
Nov 28 09:19:20 np0005538513.localdomain podman[135102]: 2025-11-28 09:19:20.726164992 +0000 UTC m=+1.067663742 container died c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:19:20 np0005538513.localdomain systemd[1]: tmp-crun.7vOIXL.mount: Deactivated successfully.
Nov 28 09:19:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2-merged.mount: Deactivated successfully.
Nov 28 09:19:20 np0005538513.localdomain podman[136796]: 2025-11-28 09:19:20.831407266 +0000 UTC m=+0.091675912 container remove c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 28 09:19:20 np0005538513.localdomain systemd[1]: libpod-conmon-c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79.scope: Deactivated successfully.
Nov 28 09:19:20 np0005538513.localdomain sudo[134881]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:20 np0005538513.localdomain sudo[136853]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkjuyfjcimknfymysugdbfvpdzuaxdkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321560.338831-416-136191891640558/AnsiballZ_systemd.py
Nov 28 09:19:20 np0005538513.localdomain sudo[136853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29553 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1138020000000001030307) 
Nov 28 09:19:21 np0005538513.localdomain sudo[136856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:19:21 np0005538513.localdomain sudo[136856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:21 np0005538513.localdomain sudo[136856]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:21 np0005538513.localdomain python3.9[136855]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:19:21 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:19:21 np0005538513.localdomain systemd-rc-local-generator[136895]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:19:21 np0005538513.localdomain systemd-sysv-generator[136899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:19:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:21 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:19:21 np0005538513.localdomain systemd-sysv-generator[136938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:19:21 np0005538513.localdomain systemd-rc-local-generator[136935]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:19:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:21 np0005538513.localdomain systemd[1]: Starting EDPM Container Shutdown...
Nov 28 09:19:21 np0005538513.localdomain systemd[1]: Finished EDPM Container Shutdown.
Nov 28 09:19:21 np0005538513.localdomain sudo[136853]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:22 np0005538513.localdomain sudo[137036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpdiwjwrfpaipduwkyytyomxoybfrete ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321562.01919-440-46142071843002/AnsiballZ_stat.py
Nov 28 09:19:22 np0005538513.localdomain sudo[137036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:22 np0005538513.localdomain python3.9[137038]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:22 np0005538513.localdomain sudo[137036]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:22 np0005538513.localdomain sudo[137109]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqudjdznfhmfuhorctkuvhmkqfymimlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321562.01919-440-46142071843002/AnsiballZ_copy.py
Nov 28 09:19:22 np0005538513.localdomain sudo[137109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:23 np0005538513.localdomain python3.9[137111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321562.01919-440-46142071843002/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:23 np0005538513.localdomain sudo[137109]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:23 np0005538513.localdomain sudo[137201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhgueinjgybboijezasgcjgsiwhydfah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321563.3719065-485-137940842376703/AnsiballZ_stat.py
Nov 28 09:19:23 np0005538513.localdomain sudo[137201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:23 np0005538513.localdomain python3.9[137203]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:23 np0005538513.localdomain sudo[137201]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:24 np0005538513.localdomain sudo[137274]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnlytppblhdmntxspjlbremdbuehgobe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321563.3719065-485-137940842376703/AnsiballZ_copy.py
Nov 28 09:19:24 np0005538513.localdomain sudo[137274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:24 np0005538513.localdomain python3.9[137276]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321563.3719065-485-137940842376703/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:24 np0005538513.localdomain sudo[137274]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:24 np0005538513.localdomain sudo[137366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgkhedhgeaegejijkldebgwhqeeeajtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321564.6038299-530-177746198070304/AnsiballZ_systemd.py
Nov 28 09:19:24 np0005538513.localdomain sudo[137366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:25 np0005538513.localdomain python3.9[137368]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:19:25 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:19:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29554 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1147C30000000001030307) 
Nov 28 09:19:25 np0005538513.localdomain systemd-sysv-generator[137399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:19:25 np0005538513.localdomain systemd-rc-local-generator[137394]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:19:25 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:25 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:19:25 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:19:25 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:19:25 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:19:25 np0005538513.localdomain sudo[137366]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:26 np0005538513.localdomain python3.9[137500]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:19:26 np0005538513.localdomain network[137517]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:19:26 np0005538513.localdomain network[137518]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:19:26 np0005538513.localdomain network[137519]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:19:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13627 DF PROTO=TCP SPT=36168 DPT=9101 SEQ=1442700734 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1152D90000000001030307) 
Nov 28 09:19:30 np0005538513.localdomain sudo[137718]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyakurkinssagpvyyrglejgqpulqogye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321570.0929236-608-184580113764586/AnsiballZ_stat.py
Nov 28 09:19:30 np0005538513.localdomain sudo[137718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:30 np0005538513.localdomain python3.9[137720]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:30 np0005538513.localdomain sudo[137718]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:30 np0005538513.localdomain sudo[137793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uobiwimolflqyfjzksqnhbfkyjghurdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321570.0929236-608-184580113764586/AnsiballZ_copy.py
Nov 28 09:19:30 np0005538513.localdomain sudo[137793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13629 DF PROTO=TCP SPT=36168 DPT=9101 SEQ=1442700734 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB115EC20000000001030307) 
Nov 28 09:19:31 np0005538513.localdomain python3.9[137795]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321570.0929236-608-184580113764586/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:31 np0005538513.localdomain sudo[137793]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:31 np0005538513.localdomain sudo[137886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elfmofaragizfmqnhgxdvrqdpqbqbibh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321571.4455428-653-90403267946352/AnsiballZ_systemd.py
Nov 28 09:19:31 np0005538513.localdomain sudo[137886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:32 np0005538513.localdomain python3.9[137888]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:19:32 np0005538513.localdomain systemd[1]: Reloading OpenSSH server daemon...
Nov 28 09:19:32 np0005538513.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Nov 28 09:19:32 np0005538513.localdomain sshd[117359]: Received SIGHUP; restarting.
Nov 28 09:19:32 np0005538513.localdomain sshd[117359]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:19:32 np0005538513.localdomain sshd[117359]: Server listening on 0.0.0.0 port 22.
Nov 28 09:19:32 np0005538513.localdomain sshd[117359]: Server listening on :: port 22.
Nov 28 09:19:32 np0005538513.localdomain sudo[137886]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:32 np0005538513.localdomain sudo[137982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlqunyjumuqnxeaayydfgcwivdflelme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321572.3425293-677-69340104856617/AnsiballZ_file.py
Nov 28 09:19:32 np0005538513.localdomain sudo[137982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:32 np0005538513.localdomain python3.9[137984]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:32 np0005538513.localdomain sudo[137982]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29555 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1167820000000001030307) 
Nov 28 09:19:33 np0005538513.localdomain sudo[138074]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjgmjaaemccxunjqitcuptjfkomhplbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321573.0202067-701-120072531886176/AnsiballZ_stat.py
Nov 28 09:19:33 np0005538513.localdomain sudo[138074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:33 np0005538513.localdomain python3.9[138076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:33 np0005538513.localdomain sudo[138074]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:34 np0005538513.localdomain sudo[138147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wspznqnanieqeyadsxeqvhyssisknjcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321573.0202067-701-120072531886176/AnsiballZ_copy.py
Nov 28 09:19:34 np0005538513.localdomain sudo[138147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:34 np0005538513.localdomain python3.9[138149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321573.0202067-701-120072531886176/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:34 np0005538513.localdomain sudo[138147]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:35 np0005538513.localdomain sudo[138239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emcjniviflgupskxaomwmlrbecrprkmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321574.9749548-755-263598480834345/AnsiballZ_timezone.py
Nov 28 09:19:35 np0005538513.localdomain sudo[138239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:35 np0005538513.localdomain python3.9[138241]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 09:19:35 np0005538513.localdomain systemd[1]: Starting Time & Date Service...
Nov 28 09:19:35 np0005538513.localdomain systemd[1]: Started Time & Date Service.
Nov 28 09:19:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5521 DF PROTO=TCP SPT=34726 DPT=9100 SEQ=4115780546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1173430000000001030307) 
Nov 28 09:19:36 np0005538513.localdomain sudo[138239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:37 np0005538513.localdomain sudo[138335]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuilxjlwhwepuxvyvbcxmnrtnjvsqbtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321577.0150936-782-107035606692989/AnsiballZ_file.py
Nov 28 09:19:37 np0005538513.localdomain sudo[138335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:37 np0005538513.localdomain python3.9[138337]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:37 np0005538513.localdomain sudo[138335]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:37 np0005538513.localdomain sudo[138427]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyaxghlnolbcqgnopepqyhxkglxktaag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321577.7063465-806-100933902431769/AnsiballZ_stat.py
Nov 28 09:19:37 np0005538513.localdomain sudo[138427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:38 np0005538513.localdomain python3.9[138429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:38 np0005538513.localdomain sudo[138427]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:38 np0005538513.localdomain sudo[138500]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxvamydalbglcrjcnzvvrwgreqgwsgyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321577.7063465-806-100933902431769/AnsiballZ_copy.py
Nov 28 09:19:38 np0005538513.localdomain sudo[138500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:38 np0005538513.localdomain python3.9[138502]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321577.7063465-806-100933902431769/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:38 np0005538513.localdomain sudo[138500]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:39 np0005538513.localdomain sudo[138592]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohyqfdylyenbfvyaaeprlegsdqqzlgqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321578.9847374-851-225080342505182/AnsiballZ_stat.py
Nov 28 09:19:39 np0005538513.localdomain sudo[138592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:39 np0005538513.localdomain python3.9[138594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:39 np0005538513.localdomain sudo[138592]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41591 DF PROTO=TCP SPT=39660 DPT=9100 SEQ=739763758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB117F820000000001030307) 
Nov 28 09:19:39 np0005538513.localdomain sudo[138665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcsdvlqndkehimfeqnxjxxpxqhcwxlub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321578.9847374-851-225080342505182/AnsiballZ_copy.py
Nov 28 09:19:39 np0005538513.localdomain sudo[138665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:39 np0005538513.localdomain python3.9[138667]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321578.9847374-851-225080342505182/.source.yaml _original_basename=._9jqw2ll follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:40 np0005538513.localdomain sudo[138665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:40 np0005538513.localdomain sudo[138757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtkvcridwtotnrtdonxppwychobranjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321580.2831042-896-191108226772632/AnsiballZ_stat.py
Nov 28 09:19:40 np0005538513.localdomain sudo[138757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:40 np0005538513.localdomain python3.9[138759]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:40 np0005538513.localdomain sudo[138757]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:41 np0005538513.localdomain sudo[138832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unupbaujyyzdsqjchukslkxehmnigtet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321580.2831042-896-191108226772632/AnsiballZ_copy.py
Nov 28 09:19:41 np0005538513.localdomain sudo[138832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:41 np0005538513.localdomain python3.9[138834]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321580.2831042-896-191108226772632/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:41 np0005538513.localdomain sudo[138832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:42 np0005538513.localdomain sudo[138924]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihgecclwgqmdmbkkonjlnwpuuvrmfuif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321581.5894725-941-278731911314348/AnsiballZ_command.py
Nov 28 09:19:42 np0005538513.localdomain sudo[138924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:42 np0005538513.localdomain python3.9[138926]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:19:42 np0005538513.localdomain sudo[138924]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5523 DF PROTO=TCP SPT=34726 DPT=9100 SEQ=4115780546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB118B030000000001030307) 
Nov 28 09:19:42 np0005538513.localdomain sudo[139017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcgwmjyvmimeqgnrhkvpywlamwpaaffk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321582.4547222-965-76252540404655/AnsiballZ_command.py
Nov 28 09:19:42 np0005538513.localdomain sudo[139017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:42 np0005538513.localdomain python3.9[139019]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:19:42 np0005538513.localdomain sudo[139017]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:43 np0005538513.localdomain sudo[139110]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybdelyfhbnwoemjfxbzewtjdwfxydsxs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321583.1989393-989-132370474081662/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:19:43 np0005538513.localdomain sudo[139110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:43 np0005538513.localdomain python3[139112]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:19:43 np0005538513.localdomain sudo[139110]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:44 np0005538513.localdomain sudo[139202]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkbxaljijzbldlpjdiabklpbehkytduh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321584.1865373-1013-11462905983491/AnsiballZ_stat.py
Nov 28 09:19:44 np0005538513.localdomain sudo[139202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:44 np0005538513.localdomain python3.9[139204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:44 np0005538513.localdomain sudo[139202]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:44 np0005538513.localdomain sudo[139275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgwuojzpbsxwwwuuqbvxtrwhjskbzjrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321584.1865373-1013-11462905983491/AnsiballZ_copy.py
Nov 28 09:19:44 np0005538513.localdomain sudo[139275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:45 np0005538513.localdomain python3.9[139277]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321584.1865373-1013-11462905983491/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:45 np0005538513.localdomain sudo[139275]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:45 np0005538513.localdomain sudo[139367]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxahtayyhqdlboysjgjospzisxuqxskl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321585.5141056-1058-249363852328996/AnsiballZ_stat.py
Nov 28 09:19:45 np0005538513.localdomain sudo[139367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:46 np0005538513.localdomain python3.9[139369]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:46 np0005538513.localdomain sudo[139367]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:46 np0005538513.localdomain sudo[139440]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdzspuxemqvydaoezignajumptdqilru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321585.5141056-1058-249363852328996/AnsiballZ_copy.py
Nov 28 09:19:46 np0005538513.localdomain sudo[139440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:46 np0005538513.localdomain python3.9[139442]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321585.5141056-1058-249363852328996/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:46 np0005538513.localdomain sudo[139440]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:47 np0005538513.localdomain sudo[139532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hggvjbctnsreagpwywqwkpucvuonnkye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321586.9067307-1103-128874515078201/AnsiballZ_stat.py
Nov 28 09:19:47 np0005538513.localdomain sudo[139532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:47 np0005538513.localdomain python3.9[139534]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:47 np0005538513.localdomain sudo[139532]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56612 DF PROTO=TCP SPT=36132 DPT=9102 SEQ=3743218911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB119F820000000001030307) 
Nov 28 09:19:47 np0005538513.localdomain sudo[139605]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwjrzadyxrvfxvihhvducwzopvlampog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321586.9067307-1103-128874515078201/AnsiballZ_copy.py
Nov 28 09:19:47 np0005538513.localdomain sudo[139605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:47 np0005538513.localdomain python3.9[139607]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321586.9067307-1103-128874515078201/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:48 np0005538513.localdomain sudo[139605]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:48 np0005538513.localdomain sudo[139697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqsompjzbeisyosogugititqegecxybv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321588.1701167-1148-136979814319947/AnsiballZ_stat.py
Nov 28 09:19:48 np0005538513.localdomain sudo[139697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:48 np0005538513.localdomain python3.9[139699]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:48 np0005538513.localdomain sudo[139697]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:48 np0005538513.localdomain sudo[139770]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpcatqotugxhedxmgtxqixoqbhpgwbos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321588.1701167-1148-136979814319947/AnsiballZ_copy.py
Nov 28 09:19:48 np0005538513.localdomain sudo[139770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:49 np0005538513.localdomain python3.9[139772]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321588.1701167-1148-136979814319947/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:49 np0005538513.localdomain sudo[139770]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29556 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11A7820000000001030307) 
Nov 28 09:19:49 np0005538513.localdomain sudo[139862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kineundfwoivvjguyvzhjzuhnveqptvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321589.483295-1193-185498845146240/AnsiballZ_stat.py
Nov 28 09:19:49 np0005538513.localdomain sudo[139862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:50 np0005538513.localdomain python3.9[139864]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:50 np0005538513.localdomain sudo[139862]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:50 np0005538513.localdomain sudo[139935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzcuefmfiwrbqiqiyivddzwngynigagc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321589.483295-1193-185498845146240/AnsiballZ_copy.py
Nov 28 09:19:50 np0005538513.localdomain sudo[139935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:50 np0005538513.localdomain python3.9[139937]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321589.483295-1193-185498845146240/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:50 np0005538513.localdomain sudo[139935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:51 np0005538513.localdomain sudo[140027]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osgnirqdjuzgcgzrnwveypmcpskxhbzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321591.0050218-1238-187681454674652/AnsiballZ_file.py
Nov 28 09:19:51 np0005538513.localdomain sudo[140027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:51 np0005538513.localdomain python3.9[140029]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:51 np0005538513.localdomain sudo[140027]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:51 np0005538513.localdomain sudo[140119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlnujbutqpaoztherechzsrvsxzdtixx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321591.6526165-1262-143276226710855/AnsiballZ_command.py
Nov 28 09:19:51 np0005538513.localdomain sudo[140119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:52 np0005538513.localdomain python3.9[140121]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:19:52 np0005538513.localdomain sudo[140119]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:52 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34767 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11B1820000000001030307) 
Nov 28 09:19:52 np0005538513.localdomain sudo[140214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esulmghuidzkywglfomiemyaxkcjyoxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321592.3597906-1286-69811487504162/AnsiballZ_blockinfile.py
Nov 28 09:19:52 np0005538513.localdomain sudo[140214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:52 np0005538513.localdomain python3.9[140216]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:52 np0005538513.localdomain sudo[140214]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:53 np0005538513.localdomain sudo[140307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqgltyzbhpkjaaogaasoemvyyqcdbskn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321593.2702036-1313-101407717061061/AnsiballZ_file.py
Nov 28 09:19:53 np0005538513.localdomain sudo[140307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:53 np0005538513.localdomain python3.9[140309]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:53 np0005538513.localdomain sudo[140307]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:54 np0005538513.localdomain sudo[140399]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hblseurntkpkmpoiabvlxihajlfsjqye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321593.8259938-1313-130028199799305/AnsiballZ_file.py
Nov 28 09:19:54 np0005538513.localdomain sudo[140399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:54 np0005538513.localdomain python3.9[140401]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:54 np0005538513.localdomain sudo[140399]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:54 np0005538513.localdomain sudo[140491]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gogcpmmvdhvvebuekzufxhnsxhajeckb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321594.5384977-1358-41558662283085/AnsiballZ_mount.py
Nov 28 09:19:54 np0005538513.localdomain sudo[140491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:55 np0005538513.localdomain python3.9[140493]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 09:19:55 np0005538513.localdomain sudo[140491]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:55 np0005538513.localdomain sudo[140584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdmllngqkphlosextecyxzokplugdhtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321595.413134-1358-156508401742307/AnsiballZ_mount.py
Nov 28 09:19:55 np0005538513.localdomain sudo[140584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:55 np0005538513.localdomain python3.9[140586]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 09:19:55 np0005538513.localdomain sudo[140584]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:56 np0005538513.localdomain sshd[133636]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:19:56 np0005538513.localdomain systemd-logind[764]: Session 43 logged out. Waiting for processes to exit.
Nov 28 09:19:56 np0005538513.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Nov 28 09:19:56 np0005538513.localdomain systemd[1]: session-43.scope: Consumed 27.744s CPU time.
Nov 28 09:19:56 np0005538513.localdomain systemd-logind[764]: Removed session 43.
Nov 28 09:19:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43308 DF PROTO=TCP SPT=41988 DPT=9101 SEQ=2314623050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11C8090000000001030307) 
Nov 28 09:20:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1096 DF PROTO=TCP SPT=40788 DPT=9101 SEQ=3176365671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11D7820000000001030307) 
Nov 28 09:20:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16425 DF PROTO=TCP SPT=60498 DPT=9102 SEQ=1550634953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11D8F40000000001030307) 
Nov 28 09:20:02 np0005538513.localdomain sshd[140603]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:02 np0005538513.localdomain sshd[140603]: Accepted publickey for zuul from 192.168.122.30 port 59176 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:03 np0005538513.localdomain systemd-logind[764]: New session 44 of user zuul.
Nov 28 09:20:03 np0005538513.localdomain systemd[1]: Started Session 44 of User zuul.
Nov 28 09:20:03 np0005538513.localdomain sshd[140603]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:03 np0005538513.localdomain sudo[140696]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdaslizguivuncizhnxniosuwihvopgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321603.1337836-21-10809655125670/AnsiballZ_tempfile.py
Nov 28 09:20:03 np0005538513.localdomain sudo[140696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:03 np0005538513.localdomain python3.9[140698]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 28 09:20:03 np0005538513.localdomain sudo[140696]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:05 np0005538513.localdomain sudo[140788]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkllxjdxxdukzoaaoawjfsithnstnvte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321604.8235729-93-178516229429798/AnsiballZ_stat.py
Nov 28 09:20:05 np0005538513.localdomain sudo[140788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=932 DF PROTO=TCP SPT=59560 DPT=9100 SEQ=409018828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11E4810000000001030307) 
Nov 28 09:20:05 np0005538513.localdomain python3.9[140790]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:05 np0005538513.localdomain sudo[140788]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3518 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2610679703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11E9820000000001030307) 
Nov 28 09:20:06 np0005538513.localdomain sudo[140882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnyhqpxjlkvenrsooupidygfdlavpljq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321606.2357693-141-176954816964210/AnsiballZ_slurp.py
Nov 28 09:20:06 np0005538513.localdomain sudo[140882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:06 np0005538513.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 09:20:06 np0005538513.localdomain python3.9[140884]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 28 09:20:06 np0005538513.localdomain sudo[140882]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:08 np0005538513.localdomain sudo[140977]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkffqrdqjnjkaozdchsrltijyydlbjhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321607.5966923-189-262347571873459/AnsiballZ_stat.py
Nov 28 09:20:08 np0005538513.localdomain sudo[140977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:08 np0005538513.localdomain python3.9[140979]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.x7_26xdp follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:20:08 np0005538513.localdomain sudo[140977]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:08 np0005538513.localdomain sudo[141052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agvmjitkhbxpktogcbexkvmwkuyzefiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321607.5966923-189-262347571873459/AnsiballZ_copy.py
Nov 28 09:20:08 np0005538513.localdomain sudo[141052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:08 np0005538513.localdomain python3.9[141054]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.x7_26xdp mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321607.5966923-189-262347571873459/.source.x7_26xdp _original_basename=.ye43wm4w follow=False checksum=37b6ce2b006ecd64876d6796769d1ed663c9f074 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:08 np0005538513.localdomain sudo[141052]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11551 DF PROTO=TCP SPT=60450 DPT=9100 SEQ=4174456282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11F3820000000001030307) 
Nov 28 09:20:10 np0005538513.localdomain sudo[141144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bajzjqtrhhlesigrquclgntgqjayukrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321610.2630424-279-245046631188549/AnsiballZ_setup.py
Nov 28 09:20:10 np0005538513.localdomain sudo[141144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:11 np0005538513.localdomain python3.9[141146]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:11 np0005538513.localdomain sudo[141144]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:12 np0005538513.localdomain sudo[141236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxpibuymdybdxleskkjqtlsbwhrfeely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321612.0673285-328-89565962165696/AnsiballZ_blockinfile.py
Nov 28 09:20:12 np0005538513.localdomain sudo[141236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:12 np0005538513.localdomain python3.9[141238]: ansible-ansible.builtin.blockinfile Invoked with block=np0005538513.localdomain,192.168.122.106,np0005538513* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCToHi/c1OL/UxMWy2v/t0tcvSlMeoKa6EPBYbcu51p2Gn2UxEPgCRLM9+84Smh2pxAR4Y/5LVm2lbZ9Gf4okHGg5GLIyqzxxqbQHyR+YRljujVEOvksUPuKCptzx9fQj2Ij2t9GPGHc5klgGPIKjx0pza8T37vdz+G9y7zuK5wWI66AeN8y/6dD2hvi1Lp94VRSvTTEo+nUOFSIgsOwqQO+ZSwTgjG1pmtESBe8nkhW0I0BQPX46v9f1PN1LXDg8cN2FSVjQ91RI0uCvTaBYJ3soFBFspgiJ113zapbQCaNwg7lK7ofS0QT5WONP3QIsDAq1gSpWuOdS2DRY4NU3WMd4m5tLbj+ubiWr39rNU/zQiEl8r38aiM0OwOfuQ9S8wxO7phpVCQrbOkYCLLijdy/xTODvP+jYohTMWX8Gh6IVeVtm6SB2Tw3lDBCjpqlclCSs905Xe+mTJ6WYTaz+Q1xgflKEeemzJ0+rt+QZbrmL7u5MUdf/l/yOLAgACNsws=
                                                            np0005538513.localdomain,192.168.122.106,np0005538513* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAcFP+DjLmcEEAm8Lwvxl6FPIO6oOWnH/RhIcXcMqT1F
                                                            np0005538513.localdomain,192.168.122.106,np0005538513* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBCKBYRInRUdTiZ6KYKN+DMW+w3dTbv2b2ZRO5doLdo2BjNWxCzSevWq4Ptdwg4i7AwfVsH37MVU5ijvc8yJB7o=
                                                            np0005538512.localdomain,192.168.122.105,np0005538512* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCy9/gxqH+eMqafXwUuPf+1Clpw4qsugdFefisnCDhJ5U7Pc+eWMUQVMS0ErxabBJhneDOyPwXwIbv72cEAtmgfvHDlSuS3mt8LRzKqsv1dXTy4Zqb3JGVzrvxo0iczGRsn2MIDJUv/Zjq9YqVeCnDj2HOwV+qx+EFecEFXS797FxsnMmTw0A5z8yUtBuJEGAKQX96LpZc4k5ltq+Uy0rK85Kk7cGR4A+wrIChLC8wggxvA99NdPEBtne6Chb+3PcbYUcTGhGtV6FGzpgbWmuWT/gcANb+fJE5/4n87loLmBMsmvGhvQuN9kuJ20g6nwPJbPTpIbV6XALx4tbma68bL3RL+lcGlh3jf0pEXPfolrB/MRmJn5ggMLjRv50FrowQalnCEgWE0gtd9IGjmqFz3jP008bGotn9rcacbjC2AvE+5NEjp7TzXGnFcD6jW8+9AWiusCww4ULs/oWbi0GLkmhwU5EifitDYF2+r1CigAdlEjb6sa0wAQSmclWk6guM=
                                                            np0005538512.localdomain,192.168.122.105,np0005538512* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIK28mLCPVbXy0OXsvv/yFemdmkq0TouDg2F8iIBtrFNP
                                                            np0005538512.localdomain,192.168.122.105,np0005538512* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOAaBZ7v0nx9ZqEqgPbFZS0ak6RTWK6bkXL/jWgEJnhpVMoiRYOxmcwlW3qCW0ftaWYgMItu1j7anWibS+umVXI=
                                                            np0005538511.localdomain,192.168.122.104,np0005538511* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDqtmgm0KAOOIJ7a8whlZPfasnwJpfcm6zVmQjiKHZZrcojE/a6oALfufKXbfWWiLjJ2VzyK9v7QPNXhIWxgAKT9J40A1lSpSAmmxMaWvy+hzzvePs0Z4Fc4bFX7V4zBGI+dAJ+eAu73z6OKNuMhxBrL46ejpRFbqjwBP3veWRiLOMbyPn+Wc+amop0p1eEzV2QHMAIC5Dwm6/tYNLixNSa/Ea0ciaY3jWii+IGhYy+wqQP+9qkoVf9bZ4Ewa+7UfXI/q4zvvic/Znb8ZpCpezLnH4ilBORLyV9r/wkkkVGY7UVgUdSoLVjzTGQAtHl2ZgA3zJ2F2ES9QcBEvrHygT4vGgtEaxQn8XFhBwhzCpPaLyXti/6d+8M36cJx+7gv1eEfDgLz3MNR+tcnFSew9N6dIN4afV0DvA/9FsWk8PTqddN4iHcZzRo0GiDJWNtB+gYVZOytTYMZm2Cyv59IthEzxaB+wTZoSdCeuEeTM0ohYspOKirIPqMPuCbGbtrJFE=
                                                            np0005538511.localdomain,192.168.122.104,np0005538511* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOw3UAOk5rmRZZUABN/csr2bxG0kPuwFOfnLWM0dbphK
                                                            np0005538511.localdomain,192.168.122.104,np0005538511* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOporAXIBWakUq++II8S8bptvpP8um9hXQ1t0EGSEC6CKLIa5aENxiSz3hPWhpfOMIda2pAiC8tHJ/ctg1cA7bI=
                                                            np0005538515.localdomain,192.168.122.108,np0005538515* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbwc/gBZF5hmsFU6BSK1/DT5hduj2+3ukzoCGLU6mgpBv7BiInN7vVOqXilL+QUAWOvfKTekaQe1Vv/2jpygQnlu6MEMopmac/36IfVjgt39zxCULfSWv3Gp8tLP0ATF2LfhHBWFrGX7G3Bg3AiNfIUnQIQadBaKIByl+FfA7nJ7phwBAwJaQxvByGDeMwC2CWIUPgVqKclcw1WmldPnNmwquLlCbAeMV2hHlBfnVk8BI6fsOUcBB6a05zRpJpbrl584F+qkiQX0RpZYJQdZCoLiJStJv39lYhgiAWChUOVJsCbeNQnC9/Xgs5JhmRESgXh7Tm+8UNW9DxSHN7BS5qKYPUULdjobSp2v9pFOx30MLMsNd5r3JE07pgm5PpjuviSGEvJ8DIAPTF3kUXM43wax1q9rGV4ZfoJiLAwS9CmWWDWZDg17cnC5z+3qi+K8HUKz8LxQCHI+yEtTFzUEYyXTQfQbNvkauEHI/PwFA1iC+4/2g/0UhtjkM+FO2Czwk=
                                                            np0005538515.localdomain,192.168.122.108,np0005538515* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPVAkJQTOfLnB4ufl+yfJWTOwj/+yeZMYj9KPcqQhG41
                                                            np0005538515.localdomain,192.168.122.108,np0005538515* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJxSQcYu8iH02KDWynHrNs+wu90XfG3ktCJ/ydvMFl7Khrh5CImI23f+XeJr4A7okpxJw7hhtVd+bcWjM/VGibU=
                                                            np0005538510.localdomain,192.168.122.103,np0005538510* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAxqgPHnyGChl6yd1/HRo8ox+w8llSVhIj8iYUdDG7IquyLr4/CZguZzRkngbXi/Dq544iKS4kFL/zPKi+yuxeFs4b6fgo4vGoV8wwKNSJXx0d0hOQa9651VqB6k/trENRTgLa2fHkXgF+/g0f7HvloQfhr7qjhTBRV4l4UfJiOEpMvMxN6map/D0JuHlAZGZ5mGUoBTEMuPGEPvMWqe0kc/I8WIgsMsvijOGM2xDxsOqAYlV9a8faoyMdacWUNkeQTfPF6h+z8xdvP8qWPtrPKWHMpcGicTI6pFZ2JxOjWnMaBXs2j/CN7HFLbyOCwuAvAu9efAbxJvgtZlO++6kSlq7SHMzwv7PLP69GaQJHR+jANJ/O2BchbxL09mIkpFSzLSS0k7xXJlwqnAMciIlTaud2n5Hqnnb06WgtvD6O0nnuCLH5am7F1YDGJJgUmNbbgF1PuwzOZqQy+tA2igji/n2z87KkGZdIbrHdPU1PPIlzVGPO6aO02RhvtD+/iQM=
                                                            np0005538510.localdomain,192.168.122.103,np0005538510* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO316T0CvGWuEUZtluJgtZ9ZZEUIgwqLNzmYcEgwx90d
                                                            np0005538510.localdomain,192.168.122.103,np0005538510* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFs0shW57fSaFIES4CjKi1hUQjnXLq99+vhyRfpt8xn5+tcCwnrhlVxDAoMMHaxjmVGblslVcZ1lb3oEH51GZuE=
                                                            np0005538514.localdomain,192.168.122.107,np0005538514* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLIqwhlOevSQuHXF0nrkLOzRoSQqnWWb/cXzK4um93clqGujVOE9PUyL6ONBo/qlr4Pp+QMzSsIFwjW1T/6G+Ce2CS/TGphIUxvvB9NhBt+OJl/zDUEmjAU6bwVIx6ApqtimsXWWIap9GEtVWA5P9pcqPMyGzq1mCzwCS252Ylioij0zZxfMrxTt3RSsWrDED61vRes0ZKd8HERTLN+Lzis5t8f74zfwTesOea6CRkIHth4cUP7ua3q+KhhbhIPj+fXWN5w+qVbcTMJSYyUPsZ2ymPhR4x4db1oPk1Jg14dw1BnmAZZl3v8o4l7bUQ2Fj/PE1JbSiApxbK+V0KdZGMrG4iVbnMmzwBXPXHa6lNQGneflVd3MNEepnTnXx4hAVpaJHc8EtIREq8aPe07DW0wL9clpTKaSGU2Ma+BLXmSDPkuPh6JWLxn3iM1yybL574NnGt2MgBj6z2tiSb4NkNmaBkoG8PMHw8YUSabKBBZNiMEO2GKBpHZldSrYvOZHU=
                                                            np0005538514.localdomain,192.168.122.107,np0005538514* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINhivqz2RYo1kKlRUCCEwVKn/fRbUXKh+9HKcoRBbRik
                                                            np0005538514.localdomain,192.168.122.107,np0005538514* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEj7Mfl3DOkiBgUjao8Ey8r/pUITSMDHIaEViUpgeShgnNz3/omNuAseQqHK6/tA9gN/Uo8Pq1wRSxeBtUVD++U=
                                                             create=True mode=0644 path=/tmp/ansible.x7_26xdp state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:12 np0005538513.localdomain sudo[141236]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:13 np0005538513.localdomain sudo[141328]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecelotdnrnuhpkehkxqnqlxkupfbnxan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321613.463223-376-150299501310319/AnsiballZ_command.py
Nov 28 09:20:13 np0005538513.localdomain sudo[141328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:14 np0005538513.localdomain python3.9[141330]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.x7_26xdp' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:14 np0005538513.localdomain sudo[141328]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:14 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50711 DF PROTO=TCP SPT=53558 DPT=9882 SEQ=4101534268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1207830000000001030307) 
Nov 28 09:20:16 np0005538513.localdomain sudo[141422]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojiwpdbzspmuirqdpupvqpkeshueobrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321615.5677655-424-236871403104910/AnsiballZ_file.py
Nov 28 09:20:16 np0005538513.localdomain sudo[141422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:16 np0005538513.localdomain python3.9[141424]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.x7_26xdp state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:16 np0005538513.localdomain sudo[141422]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:17 np0005538513.localdomain sshd[140603]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:20:17 np0005538513.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Nov 28 09:20:17 np0005538513.localdomain systemd[1]: session-44.scope: Consumed 4.266s CPU time.
Nov 28 09:20:17 np0005538513.localdomain systemd-logind[764]: Session 44 logged out. Waiting for processes to exit.
Nov 28 09:20:17 np0005538513.localdomain systemd-logind[764]: Removed session 44.
Nov 28 09:20:18 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17686 DF PROTO=TCP SPT=38134 DPT=9105 SEQ=77650652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12164B0000000001030307) 
Nov 28 09:20:21 np0005538513.localdomain sudo[141439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:20:21 np0005538513.localdomain sudo[141439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:20:21 np0005538513.localdomain sudo[141439]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:21 np0005538513.localdomain sudo[141454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:20:22 np0005538513.localdomain sudo[141454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:20:22 np0005538513.localdomain sudo[141454]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:24 np0005538513.localdomain sshd[141501]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:24 np0005538513.localdomain sshd[141501]: Accepted publickey for zuul from 192.168.122.30 port 42842 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:24 np0005538513.localdomain systemd-logind[764]: New session 45 of user zuul.
Nov 28 09:20:24 np0005538513.localdomain systemd[1]: Started Session 45 of User zuul.
Nov 28 09:20:24 np0005538513.localdomain sshd[141501]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:25 np0005538513.localdomain sudo[141537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:20:25 np0005538513.localdomain sudo[141537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:20:25 np0005538513.localdomain sudo[141537]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:25 np0005538513.localdomain python3.9[141609]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:27 np0005538513.localdomain sudo[141703]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyqhsfevxmddqbsxhyowsppasrzsesbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321626.3216078-56-193134621652851/AnsiballZ_systemd.py
Nov 28 09:20:27 np0005538513.localdomain sudo[141703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:27 np0005538513.localdomain python3.9[141705]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 09:20:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23918 DF PROTO=TCP SPT=56768 DPT=9101 SEQ=1563538910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB123D3A0000000001030307) 
Nov 28 09:20:28 np0005538513.localdomain sudo[141703]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:28 np0005538513.localdomain sudo[141797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scwtgymjthlgknefysrwufihylmrxqhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321628.5192194-80-253082674764166/AnsiballZ_systemd.py
Nov 28 09:20:28 np0005538513.localdomain sudo[141797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:29 np0005538513.localdomain python3.9[141799]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:20:29 np0005538513.localdomain sudo[141797]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:29 np0005538513.localdomain sudo[141890]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmwinuhgtgdaqpypcxxqqsfzoekqgfwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321629.4397707-107-182662761082178/AnsiballZ_command.py
Nov 28 09:20:29 np0005538513.localdomain sudo[141890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:30 np0005538513.localdomain python3.9[141892]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:30 np0005538513.localdomain sudo[141890]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:30 np0005538513.localdomain sudo[141983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xggfavgqucwtwgvixmehavqvrysibmcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321630.2668147-131-22097055576586/AnsiballZ_stat.py
Nov 28 09:20:30 np0005538513.localdomain sudo[141983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:30 np0005538513.localdomain python3.9[141985]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:30 np0005538513.localdomain sudo[141983]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:31 np0005538513.localdomain sudo[142077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axqqvtlslkkayzcpxpmenkqwrxboyums ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321631.1030858-155-198266236033161/AnsiballZ_command.py
Nov 28 09:20:31 np0005538513.localdomain sudo[142077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:31 np0005538513.localdomain python3.9[142079]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:31 np0005538513.localdomain sudo[142077]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:32 np0005538513.localdomain sudo[142172]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wemblbctmgramkwcchwbgxhlvdzdyvla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321631.7611563-179-270339517692235/AnsiballZ_file.py
Nov 28 09:20:32 np0005538513.localdomain sudo[142172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36977 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB124E240000000001030307) 
Nov 28 09:20:32 np0005538513.localdomain python3.9[142174]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:32 np0005538513.localdomain sudo[142172]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:32 np0005538513.localdomain sshd[141501]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:20:32 np0005538513.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Nov 28 09:20:32 np0005538513.localdomain systemd[1]: session-45.scope: Consumed 3.892s CPU time.
Nov 28 09:20:32 np0005538513.localdomain systemd-logind[764]: Session 45 logged out. Waiting for processes to exit.
Nov 28 09:20:32 np0005538513.localdomain systemd-logind[764]: Removed session 45.
Nov 28 09:20:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36978 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1252430000000001030307) 
Nov 28 09:20:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47629 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1259B10000000001030307) 
Nov 28 09:20:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36979 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB125A420000000001030307) 
Nov 28 09:20:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47630 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB125DC20000000001030307) 
Nov 28 09:20:38 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47631 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1265C20000000001030307) 
Nov 28 09:20:38 np0005538513.localdomain sshd[142189]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:38 np0005538513.localdomain sshd[142189]: Accepted publickey for zuul from 192.168.122.30 port 46230 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:38 np0005538513.localdomain systemd-logind[764]: New session 46 of user zuul.
Nov 28 09:20:38 np0005538513.localdomain systemd[1]: Started Session 46 of User zuul.
Nov 28 09:20:38 np0005538513.localdomain sshd[142189]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36980 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB126A020000000001030307) 
Nov 28 09:20:39 np0005538513.localdomain python3.9[142282]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:40 np0005538513.localdomain sudo[142376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moxuihhvjejmcnwkiinycghdnegtvmvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321640.298063-62-213010674768128/AnsiballZ_setup.py
Nov 28 09:20:40 np0005538513.localdomain sudo[142376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:40 np0005538513.localdomain python3.9[142378]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:20:41 np0005538513.localdomain sudo[142376]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:41 np0005538513.localdomain sudo[142430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifhgumxdtetjigqfypfezbkiqwkfboid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321640.298063-62-213010674768128/AnsiballZ_dnf.py
Nov 28 09:20:41 np0005538513.localdomain sudo[142430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:41 np0005538513.localdomain python3.9[142432]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:20:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47632 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1275820000000001030307) 
Nov 28 09:20:44 np0005538513.localdomain sudo[142430]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:45 np0005538513.localdomain python3.9[142524]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:46 np0005538513.localdomain sudo[142615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nawoiyoyhkakcrpmlsdzxlhoixrmtgjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321646.5628612-125-98295638879919/AnsiballZ_file.py
Nov 28 09:20:46 np0005538513.localdomain sudo[142615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:47 np0005538513.localdomain python3.9[142617]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:47 np0005538513.localdomain sudo[142615]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57206 DF PROTO=TCP SPT=35184 DPT=9882 SEQ=3273484680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1288020000000001030307) 
Nov 28 09:20:47 np0005538513.localdomain sudo[142707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izzipjvqyonegrjtclazpgpzfuberpzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321647.3975554-149-12216883494778/AnsiballZ_file.py
Nov 28 09:20:47 np0005538513.localdomain sudo[142707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:47 np0005538513.localdomain python3.9[142709]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:47 np0005538513.localdomain sudo[142707]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:48 np0005538513.localdomain sudo[142799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfnmzzldgizuxllfxcsweantktvggypg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321648.0453622-173-139927042058809/AnsiballZ_lineinfile.py
Nov 28 09:20:48 np0005538513.localdomain sudo[142799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:48 np0005538513.localdomain python3.9[142801]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:48 np0005538513.localdomain sudo[142799]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7445 DF PROTO=TCP SPT=55160 DPT=9105 SEQ=4117917505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB128F820000000001030307) 
Nov 28 09:20:49 np0005538513.localdomain python3.9[142891]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:20:50 np0005538513.localdomain python3.9[142981]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:50 np0005538513.localdomain python3.9[143073]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7446 DF PROTO=TCP SPT=55160 DPT=9105 SEQ=4117917505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1297820000000001030307) 
Nov 28 09:20:51 np0005538513.localdomain sshd[142189]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:20:51 np0005538513.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Nov 28 09:20:51 np0005538513.localdomain systemd[1]: session-46.scope: Consumed 8.706s CPU time.
Nov 28 09:20:51 np0005538513.localdomain systemd-logind[764]: Session 46 logged out. Waiting for processes to exit.
Nov 28 09:20:51 np0005538513.localdomain systemd-logind[764]: Removed session 46.
Nov 28 09:20:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7447 DF PROTO=TCP SPT=55160 DPT=9105 SEQ=4117917505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12A7430000000001030307) 
Nov 28 09:20:56 np0005538513.localdomain sshd[143088]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:57 np0005538513.localdomain sshd[143088]: Accepted publickey for zuul from 192.168.122.30 port 36868 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:57 np0005538513.localdomain systemd-logind[764]: New session 47 of user zuul.
Nov 28 09:20:57 np0005538513.localdomain systemd[1]: Started Session 47 of User zuul.
Nov 28 09:20:57 np0005538513.localdomain sshd[143088]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:58 np0005538513.localdomain python3.9[143181]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7015 DF PROTO=TCP SPT=54504 DPT=9101 SEQ=3538890158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12B26A0000000001030307) 
Nov 28 09:21:00 np0005538513.localdomain sudo[143275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxxqovprzjytrqerfktkadtodwmzsufg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321659.639074-157-216695191508598/AnsiballZ_file.py
Nov 28 09:21:00 np0005538513.localdomain sudo[143275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:00 np0005538513.localdomain python3.9[143277]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:01 np0005538513.localdomain sudo[143275]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7017 DF PROTO=TCP SPT=54504 DPT=9101 SEQ=3538890158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12BE820000000001030307) 
Nov 28 09:21:01 np0005538513.localdomain sudo[143367]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccoxcddompkifqeyydpswhllqxofqrmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321661.1702838-183-142382663671636/AnsiballZ_stat.py
Nov 28 09:21:01 np0005538513.localdomain sudo[143367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:01 np0005538513.localdomain python3.9[143369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:01 np0005538513.localdomain sudo[143367]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:02 np0005538513.localdomain sudo[143440]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aofoxgnyhnyjaaiviybqxamgzvyzfues ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321661.1702838-183-142382663671636/AnsiballZ_copy.py
Nov 28 09:21:02 np0005538513.localdomain sudo[143440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:02 np0005538513.localdomain python3.9[143442]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321661.1702838-183-142382663671636/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:03 np0005538513.localdomain sudo[143440]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38609 DF PROTO=TCP SPT=57388 DPT=9102 SEQ=2518440198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12C7420000000001030307) 
Nov 28 09:21:03 np0005538513.localdomain sudo[143532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnvhdrpmwvyrqjczuebvepbpwoqoozbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321663.264411-231-115394055243404/AnsiballZ_file.py
Nov 28 09:21:03 np0005538513.localdomain sudo[143532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:03 np0005538513.localdomain python3.9[143534]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:03 np0005538513.localdomain sudo[143532]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:04 np0005538513.localdomain sudo[143624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaztcvpzjdfialxhmsdqufsxyovqdpoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321663.9007866-258-277568398861764/AnsiballZ_stat.py
Nov 28 09:21:04 np0005538513.localdomain sudo[143624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:04 np0005538513.localdomain python3.9[143626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:04 np0005538513.localdomain sudo[143624]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:04 np0005538513.localdomain sudo[143697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fstholryqncyyhtwkzhpfoomooajbpzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321663.9007866-258-277568398861764/AnsiballZ_copy.py
Nov 28 09:21:04 np0005538513.localdomain sudo[143697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:04 np0005538513.localdomain python3.9[143699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321663.9007866-258-277568398861764/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:04 np0005538513.localdomain sudo[143697]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:05 np0005538513.localdomain sudo[143789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttysogiwcglgcahhlmvvhvmyabohabjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321665.1405544-306-58224783759736/AnsiballZ_file.py
Nov 28 09:21:05 np0005538513.localdomain sudo[143789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:05 np0005538513.localdomain python3.9[143791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:05 np0005538513.localdomain sudo[143789]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:06 np0005538513.localdomain sudo[143881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjllogqulvrvdfbhfzwjhdbltrgwkaju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321665.795746-332-1941741771696/AnsiballZ_stat.py
Nov 28 09:21:06 np0005538513.localdomain sudo[143881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:06 np0005538513.localdomain python3.9[143883]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:06 np0005538513.localdomain sudo[143881]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2907 DF PROTO=TCP SPT=51904 DPT=9100 SEQ=1746863639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12D3030000000001030307) 
Nov 28 09:21:06 np0005538513.localdomain sudo[143954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asegyphfyfcqzyhplfbbgxfawtaagqox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321665.795746-332-1941741771696/AnsiballZ_copy.py
Nov 28 09:21:06 np0005538513.localdomain sudo[143954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:06 np0005538513.localdomain python3.9[143956]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321665.795746-332-1941741771696/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:06 np0005538513.localdomain sudo[143954]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:07 np0005538513.localdomain sudo[144046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khyexrqljsxhxqlzpveflndnlyskemiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321667.0417805-381-228894371197424/AnsiballZ_file.py
Nov 28 09:21:07 np0005538513.localdomain sudo[144046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:07 np0005538513.localdomain python3.9[144048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:07 np0005538513.localdomain sudo[144046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:08 np0005538513.localdomain sudo[144138]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltzinhxzijmngiplygujhzxfldhytqdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321667.7813623-401-260500616199848/AnsiballZ_stat.py
Nov 28 09:21:08 np0005538513.localdomain sudo[144138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:08 np0005538513.localdomain python3.9[144140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:08 np0005538513.localdomain sudo[144138]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:08 np0005538513.localdomain sudo[144211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yeuawlvgqiwyyfugtvbjoabfowveqbyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321667.7813623-401-260500616199848/AnsiballZ_copy.py
Nov 28 09:21:08 np0005538513.localdomain sudo[144211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:08 np0005538513.localdomain python3.9[144213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321667.7813623-401-260500616199848/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:08 np0005538513.localdomain sudo[144211]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:09 np0005538513.localdomain sudo[144303]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssklgiftevvakgbowbqogtssofzboeze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321668.9662642-447-116810836504204/AnsiballZ_file.py
Nov 28 09:21:09 np0005538513.localdomain sudo[144303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38611 DF PROTO=TCP SPT=57388 DPT=9102 SEQ=2518440198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12DF020000000001030307) 
Nov 28 09:21:09 np0005538513.localdomain python3.9[144305]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:09 np0005538513.localdomain sudo[144303]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:10 np0005538513.localdomain sudo[144395]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqkhqpuvuigqiwfuscdwuzhkkushdaly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321669.7539186-473-223586156076465/AnsiballZ_stat.py
Nov 28 09:21:10 np0005538513.localdomain sudo[144395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:10 np0005538513.localdomain python3.9[144397]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:10 np0005538513.localdomain sudo[144395]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:10 np0005538513.localdomain sudo[144468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lujsyffrtztlwpzzgfalkafzkllxkidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321669.7539186-473-223586156076465/AnsiballZ_copy.py
Nov 28 09:21:10 np0005538513.localdomain sudo[144468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:10 np0005538513.localdomain chronyd[133620]: Selected source 23.133.168.247 (pool.ntp.org)
Nov 28 09:21:10 np0005538513.localdomain python3.9[144470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321669.7539186-473-223586156076465/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:10 np0005538513.localdomain sudo[144468]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:11 np0005538513.localdomain sudo[144560]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhpilppbzolobsuguszphvibbxxfhjeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321671.690037-521-9194951078531/AnsiballZ_file.py
Nov 28 09:21:11 np0005538513.localdomain sudo[144560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:12 np0005538513.localdomain python3.9[144562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:12 np0005538513.localdomain sudo[144560]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2909 DF PROTO=TCP SPT=51904 DPT=9100 SEQ=1746863639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12EAC20000000001030307) 
Nov 28 09:21:12 np0005538513.localdomain sudo[144652]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnjwvnscgjggxvksuqwrlowmuxzhcqmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321672.3269958-547-108216557261270/AnsiballZ_stat.py
Nov 28 09:21:12 np0005538513.localdomain sudo[144652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:12 np0005538513.localdomain python3.9[144654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:12 np0005538513.localdomain sudo[144652]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:13 np0005538513.localdomain sudo[144725]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icgilszchnybvfebkugowcqeuscqjkac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321672.3269958-547-108216557261270/AnsiballZ_copy.py
Nov 28 09:21:13 np0005538513.localdomain sudo[144725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:13 np0005538513.localdomain python3.9[144727]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321672.3269958-547-108216557261270/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:13 np0005538513.localdomain sudo[144725]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:14 np0005538513.localdomain sudo[144817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtuyshydteqaoubjrahqvdkpgufvlckw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321673.5830784-598-129515263759965/AnsiballZ_file.py
Nov 28 09:21:14 np0005538513.localdomain sudo[144817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:14 np0005538513.localdomain python3.9[144819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:14 np0005538513.localdomain sudo[144817]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:15 np0005538513.localdomain sudo[144909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edlpkgplrqljwdwykvpbwekltjzcyhuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321674.9151146-625-241708710462876/AnsiballZ_stat.py
Nov 28 09:21:15 np0005538513.localdomain sudo[144909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:15 np0005538513.localdomain python3.9[144911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:15 np0005538513.localdomain sudo[144909]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:15 np0005538513.localdomain sudo[144982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhgvmtxotjtewpnsbkdcsqadfuerhjpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321674.9151146-625-241708710462876/AnsiballZ_copy.py
Nov 28 09:21:15 np0005538513.localdomain sudo[144982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:15 np0005538513.localdomain python3.9[144984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321674.9151146-625-241708710462876/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:15 np0005538513.localdomain sudo[144982]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:16 np0005538513.localdomain sudo[145074]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qioirvpkmhbpeapmefzzqbtbxwuwhjtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321676.044107-671-112396490206318/AnsiballZ_file.py
Nov 28 09:21:16 np0005538513.localdomain sudo[145074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:16 np0005538513.localdomain python3.9[145076]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:16 np0005538513.localdomain sudo[145074]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:16 np0005538513.localdomain sudo[145166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nivclwwsqhbvyxmlosxqnspqdmpvzitk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321676.6958027-693-251209932070235/AnsiballZ_stat.py
Nov 28 09:21:16 np0005538513.localdomain sudo[145166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:17 np0005538513.localdomain python3.9[145168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:17 np0005538513.localdomain sudo[145166]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11366 DF PROTO=TCP SPT=50366 DPT=9882 SEQ=1649817095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12FD420000000001030307) 
Nov 28 09:21:17 np0005538513.localdomain sudo[145239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuzkpcwjpuxbngakeyuiicdqadauijmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321676.6958027-693-251209932070235/AnsiballZ_copy.py
Nov 28 09:21:17 np0005538513.localdomain sudo[145239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:17 np0005538513.localdomain python3.9[145241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321676.6958027-693-251209932070235/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:17 np0005538513.localdomain sudo[145239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:18 np0005538513.localdomain sshd[143088]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:21:18 np0005538513.localdomain systemd-logind[764]: Session 47 logged out. Waiting for processes to exit.
Nov 28 09:21:18 np0005538513.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Nov 28 09:21:18 np0005538513.localdomain systemd[1]: session-47.scope: Consumed 11.541s CPU time.
Nov 28 09:21:18 np0005538513.localdomain systemd-logind[764]: Removed session 47.
Nov 28 09:21:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9355 DF PROTO=TCP SPT=36044 DPT=9105 SEQ=3290678572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1304C20000000001030307) 
Nov 28 09:21:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9356 DF PROTO=TCP SPT=36044 DPT=9105 SEQ=3290678572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB130CC20000000001030307) 
Nov 28 09:21:24 np0005538513.localdomain sshd[145256]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:21:24 np0005538513.localdomain sshd[145256]: Accepted publickey for zuul from 192.168.122.30 port 41956 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:21:24 np0005538513.localdomain systemd-logind[764]: New session 48 of user zuul.
Nov 28 09:21:24 np0005538513.localdomain systemd[1]: Started Session 48 of User zuul.
Nov 28 09:21:24 np0005538513.localdomain sshd[145256]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:21:24 np0005538513.localdomain sudo[145349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wahcozxmaecokeskqsdyoqsjscurmkka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321684.391017-26-226503555066403/AnsiballZ_file.py
Nov 28 09:21:24 np0005538513.localdomain sudo[145349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:25 np0005538513.localdomain python3.9[145351]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:25 np0005538513.localdomain sudo[145349]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9357 DF PROTO=TCP SPT=36044 DPT=9105 SEQ=3290678572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB131C820000000001030307) 
Nov 28 09:21:25 np0005538513.localdomain sudo[145366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:21:25 np0005538513.localdomain sudo[145366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:25 np0005538513.localdomain sudo[145366]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:25 np0005538513.localdomain sudo[145381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:21:25 np0005538513.localdomain sudo[145381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:26 np0005538513.localdomain sudo[145381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:26 np0005538513.localdomain sudo[145449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:21:26 np0005538513.localdomain sudo[145449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:26 np0005538513.localdomain sudo[145449]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:26 np0005538513.localdomain sudo[145464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:21:26 np0005538513.localdomain sudo[145464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:26 np0005538513.localdomain sudo[145522]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezvhsxsqtrybyzcqshtqnmjqrpbzclbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321686.172728-62-152621756306967/AnsiballZ_stat.py
Nov 28 09:21:26 np0005538513.localdomain sudo[145522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:26 np0005538513.localdomain python3.9[145524]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:26 np0005538513.localdomain sudo[145522]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:27 np0005538513.localdomain sudo[145464]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:27 np0005538513.localdomain sudo[145628]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnwachzjbuqzdbrgiygajbrhfovfeqnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321686.172728-62-152621756306967/AnsiballZ_copy.py
Nov 28 09:21:27 np0005538513.localdomain sudo[145628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:27 np0005538513.localdomain sudo[145631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:21:27 np0005538513.localdomain sudo[145631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:27 np0005538513.localdomain sudo[145631]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:27 np0005538513.localdomain python3.9[145630]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321686.172728-62-152621756306967/.source.conf _original_basename=ceph.conf follow=False checksum=e86499341cc75988f759ac10cb7bf332387204b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:27 np0005538513.localdomain sudo[145628]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:28 np0005538513.localdomain sudo[145735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-solxevvilkvgqcmxtmzwjgzebarsgwyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321687.7213748-62-256550331317520/AnsiballZ_stat.py
Nov 28 09:21:28 np0005538513.localdomain sudo[145735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26177 DF PROTO=TCP SPT=43728 DPT=9101 SEQ=465637821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13279A0000000001030307) 
Nov 28 09:21:28 np0005538513.localdomain python3.9[145737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:28 np0005538513.localdomain sudo[145735]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:28 np0005538513.localdomain sudo[145808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvllxqzvuzfvvfpmwapqktuyzkazyhhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321687.7213748-62-256550331317520/AnsiballZ_copy.py
Nov 28 09:21:28 np0005538513.localdomain sudo[145808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:28 np0005538513.localdomain python3.9[145810]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321687.7213748-62-256550331317520/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=98ffd20e3b9db1cae39a950d9da1f69e92796658 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:28 np0005538513.localdomain sudo[145808]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:29 np0005538513.localdomain sshd[145256]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:21:29 np0005538513.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Nov 28 09:21:29 np0005538513.localdomain systemd[1]: session-48.scope: Consumed 2.408s CPU time.
Nov 28 09:21:29 np0005538513.localdomain systemd-logind[764]: Session 48 logged out. Waiting for processes to exit.
Nov 28 09:21:29 np0005538513.localdomain systemd-logind[764]: Removed session 48.
Nov 28 09:21:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26179 DF PROTO=TCP SPT=43728 DPT=9101 SEQ=465637821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1333C20000000001030307) 
Nov 28 09:21:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55117 DF PROTO=TCP SPT=57938 DPT=9102 SEQ=386827035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB133C820000000001030307) 
Nov 28 09:21:34 np0005538513.localdomain sshd[145825]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:21:34 np0005538513.localdomain sshd[145825]: Accepted publickey for zuul from 192.168.122.30 port 48038 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:21:34 np0005538513.localdomain systemd-logind[764]: New session 49 of user zuul.
Nov 28 09:21:34 np0005538513.localdomain systemd[1]: Started Session 49 of User zuul.
Nov 28 09:21:34 np0005538513.localdomain sshd[145825]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:21:35 np0005538513.localdomain python3.9[145918]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:21:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36983 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1347820000000001030307) 
Nov 28 09:21:37 np0005538513.localdomain sudo[146012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwondeqwhaxczzomwzbatsmqkhzgmiri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321696.5695064-62-231737615680286/AnsiballZ_file.py
Nov 28 09:21:37 np0005538513.localdomain sudo[146012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:37 np0005538513.localdomain python3.9[146014]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:37 np0005538513.localdomain sudo[146012]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:38 np0005538513.localdomain sudo[146105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzyxrixozrzzacizuilkgxlpccqocube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321697.8596811-62-80138937761902/AnsiballZ_file.py
Nov 28 09:21:38 np0005538513.localdomain sudo[146105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:38 np0005538513.localdomain python3.9[146107]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:38 np0005538513.localdomain sudo[146105]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:39 np0005538513.localdomain python3.9[146197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:21:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47635 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1353820000000001030307) 
Nov 28 09:21:39 np0005538513.localdomain sudo[146287]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-symsltywpiggbaabutfhgyywrmznesgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321699.429937-131-122882253827606/AnsiballZ_seboolean.py
Nov 28 09:21:39 np0005538513.localdomain sudo[146287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:40 np0005538513.localdomain python3.9[146289]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 09:21:40 np0005538513.localdomain sudo[146287]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:40 np0005538513.localdomain sudo[146379]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icsglkiiorollgfhtftsdxnqsyabiwmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321700.6744177-161-84819052130689/AnsiballZ_setup.py
Nov 28 09:21:40 np0005538513.localdomain sudo[146379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:41 np0005538513.localdomain python3.9[146381]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:21:41 np0005538513.localdomain sudo[146379]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:41 np0005538513.localdomain sudo[146433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kheysmekwxuvpmbvbwsfcahzrrzpvdvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321700.6744177-161-84819052130689/AnsiballZ_dnf.py
Nov 28 09:21:41 np0005538513.localdomain sudo[146433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:42 np0005538513.localdomain python3.9[146435]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:21:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50292 DF PROTO=TCP SPT=53066 DPT=9100 SEQ=1538634092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB135FC30000000001030307) 
Nov 28 09:21:45 np0005538513.localdomain sudo[146433]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:46 np0005538513.localdomain sudo[146527]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyufteergtckxngslcqcbxrsfyhfdndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321705.4881158-197-280715827966723/AnsiballZ_systemd.py
Nov 28 09:21:46 np0005538513.localdomain sudo[146527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:46 np0005538513.localdomain python3.9[146529]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:21:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28567 DF PROTO=TCP SPT=60740 DPT=9882 SEQ=1022673085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1372420000000001030307) 
Nov 28 09:21:47 np0005538513.localdomain sudo[146527]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:48 np0005538513.localdomain sudo[146622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfzddnbtgczrwbxgszvenxctqkswtgbl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321707.6577759-221-229496056813430/AnsiballZ_edpm_nftables_snippet.py
Nov 28 09:21:48 np0005538513.localdomain sudo[146622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:48 np0005538513.localdomain python3[146624]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 28 09:21:48 np0005538513.localdomain sudo[146622]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41712 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB137A030000000001030307) 
Nov 28 09:21:49 np0005538513.localdomain sudo[146714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txayaodzczfufigtuxdstkopsbvtbthq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321709.5010664-248-78815661172727/AnsiballZ_file.py
Nov 28 09:21:49 np0005538513.localdomain sudo[146714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:49 np0005538513.localdomain python3.9[146716]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:49 np0005538513.localdomain sudo[146714]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:50 np0005538513.localdomain sudo[146806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dijrietsndowdeczsbhvbpszbyxkskfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321710.2061727-272-156971786272027/AnsiballZ_stat.py
Nov 28 09:21:50 np0005538513.localdomain sudo[146806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:50 np0005538513.localdomain python3.9[146808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:50 np0005538513.localdomain sudo[146806]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:51 np0005538513.localdomain sudo[146854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyjkimqabhhtaovdvklncacxwonvuhpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321710.2061727-272-156971786272027/AnsiballZ_file.py
Nov 28 09:21:51 np0005538513.localdomain sudo[146854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41713 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1382030000000001030307) 
Nov 28 09:21:51 np0005538513.localdomain python3.9[146856]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:51 np0005538513.localdomain sudo[146854]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:51 np0005538513.localdomain sudo[146946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiazzqanhkljfqqudvxcqbbejwvuhgjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321711.4792767-308-202884410970214/AnsiballZ_stat.py
Nov 28 09:21:51 np0005538513.localdomain sudo[146946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:51 np0005538513.localdomain python3.9[146948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:51 np0005538513.localdomain sudo[146946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:52 np0005538513.localdomain sudo[146994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwwopkdjlzbxzjbouuajhnaaphqlvhqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321711.4792767-308-202884410970214/AnsiballZ_file.py
Nov 28 09:21:52 np0005538513.localdomain sudo[146994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:52 np0005538513.localdomain python3.9[146996]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.glr7xwbq recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:52 np0005538513.localdomain sudo[146994]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:52 np0005538513.localdomain sudo[147086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqluwffzyoqxdirebxabpmyqkpqevsyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321712.5527048-344-68605163215209/AnsiballZ_stat.py
Nov 28 09:21:52 np0005538513.localdomain sudo[147086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:52 np0005538513.localdomain python3.9[147088]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:52 np0005538513.localdomain sudo[147086]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:53 np0005538513.localdomain sudo[147134]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwzmkpumcdzjgeduthlmbhluyiyyhcih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321712.5527048-344-68605163215209/AnsiballZ_file.py
Nov 28 09:21:53 np0005538513.localdomain sudo[147134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:53 np0005538513.localdomain python3.9[147136]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:53 np0005538513.localdomain sudo[147134]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:54 np0005538513.localdomain sudo[147226]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aayesawpayqkyizewengeajhbokvohrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321713.7694075-383-35496679795774/AnsiballZ_command.py
Nov 28 09:21:54 np0005538513.localdomain sudo[147226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:54 np0005538513.localdomain python3.9[147228]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:21:54 np0005538513.localdomain sudo[147226]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:55 np0005538513.localdomain sudo[147319]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umcyaoshjubvdtaqzmhoxiydqyhcoucw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321714.6030757-407-79989874616682/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:21:55 np0005538513.localdomain sudo[147319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:55 np0005538513.localdomain python3[147321]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:21:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41714 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1391C20000000001030307) 
Nov 28 09:21:55 np0005538513.localdomain sudo[147319]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:55 np0005538513.localdomain sudo[147411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haocqfwnxstmclvlatquohnjcakprwgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321715.4084322-431-101665468607006/AnsiballZ_stat.py
Nov 28 09:21:55 np0005538513.localdomain sudo[147411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:55 np0005538513.localdomain python3.9[147413]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:55 np0005538513.localdomain sudo[147411]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:56 np0005538513.localdomain sudo[147486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbhmjsbqhdrjztjiowefsavtmadblevf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321715.4084322-431-101665468607006/AnsiballZ_copy.py
Nov 28 09:21:56 np0005538513.localdomain sudo[147486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:56 np0005538513.localdomain python3.9[147488]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321715.4084322-431-101665468607006/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:56 np0005538513.localdomain sudo[147486]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:57 np0005538513.localdomain sudo[147578]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oosupgcatnvbmnuihyxvpljkgllprfry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321716.769259-476-122576591274181/AnsiballZ_stat.py
Nov 28 09:21:57 np0005538513.localdomain sudo[147578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:57 np0005538513.localdomain python3.9[147580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:57 np0005538513.localdomain sudo[147578]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:57 np0005538513.localdomain sudo[147653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajotxnydowbrjxplorvxzgzxkwvwjozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321716.769259-476-122576591274181/AnsiballZ_copy.py
Nov 28 09:21:57 np0005538513.localdomain sudo[147653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:57 np0005538513.localdomain python3.9[147655]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321716.769259-476-122576591274181/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:57 np0005538513.localdomain sudo[147653]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57372 DF PROTO=TCP SPT=39120 DPT=9101 SEQ=487601255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB139CCA0000000001030307) 
Nov 28 09:21:58 np0005538513.localdomain sudo[147745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gecqxowiveenkvjpjuolnyagqvekvzxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321718.0005498-521-259838118134816/AnsiballZ_stat.py
Nov 28 09:21:58 np0005538513.localdomain sudo[147745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:58 np0005538513.localdomain python3.9[147747]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:58 np0005538513.localdomain sudo[147745]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:58 np0005538513.localdomain sudo[147820]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhhlfqwrgtnxhhvymrauldrduxzdjgue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321718.0005498-521-259838118134816/AnsiballZ_copy.py
Nov 28 09:21:58 np0005538513.localdomain sudo[147820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:59 np0005538513.localdomain python3.9[147822]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321718.0005498-521-259838118134816/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:59 np0005538513.localdomain sudo[147820]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:00 np0005538513.localdomain sudo[147912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qknprzjsigglivzcnkoyhzacilzfxgrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321720.0569258-566-248504590464415/AnsiballZ_stat.py
Nov 28 09:22:00 np0005538513.localdomain sudo[147912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:00 np0005538513.localdomain python3.9[147914]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:00 np0005538513.localdomain sudo[147912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:00 np0005538513.localdomain sudo[147987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqxrffevmsdvckmpnqkxmmcziccmsmvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321720.0569258-566-248504590464415/AnsiballZ_copy.py
Nov 28 09:22:00 np0005538513.localdomain sudo[147987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:01 np0005538513.localdomain python3.9[147989]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321720.0569258-566-248504590464415/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:01 np0005538513.localdomain sudo[147987]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57374 DF PROTO=TCP SPT=39120 DPT=9101 SEQ=487601255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13A8C20000000001030307) 
Nov 28 09:22:02 np0005538513.localdomain sudo[148079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-difjnmgjaxtbgcebncqxhpytdwplnmih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321722.4914188-611-70940371795726/AnsiballZ_stat.py
Nov 28 09:22:02 np0005538513.localdomain sudo[148079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:03 np0005538513.localdomain python3.9[148081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:03 np0005538513.localdomain sudo[148079]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:03 np0005538513.localdomain sudo[148154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klsdggmhicpfoeslqpnuhesudlespocz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321722.4914188-611-70940371795726/AnsiballZ_copy.py
Nov 28 09:22:03 np0005538513.localdomain sudo[148154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41715 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13B1820000000001030307) 
Nov 28 09:22:03 np0005538513.localdomain python3.9[148156]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321722.4914188-611-70940371795726/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:03 np0005538513.localdomain sudo[148154]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:04 np0005538513.localdomain sudo[148246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckbutikfneogiktvrvigsxgpfgwraqet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321723.7652202-656-246250612084459/AnsiballZ_file.py
Nov 28 09:22:04 np0005538513.localdomain sudo[148246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:04 np0005538513.localdomain python3.9[148248]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:04 np0005538513.localdomain sudo[148246]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:04 np0005538513.localdomain sudo[148338]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egklzhwgnpjivhglebcfkndjpaydjwyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321724.452788-680-101022673296829/AnsiballZ_command.py
Nov 28 09:22:04 np0005538513.localdomain sudo[148338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:04 np0005538513.localdomain python3.9[148340]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:04 np0005538513.localdomain sudo[148338]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:05 np0005538513.localdomain sudo[148433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iappzabyrcpfnzqhmyfybhdegyzxbjdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321725.1248052-704-249931608348611/AnsiballZ_blockinfile.py
Nov 28 09:22:05 np0005538513.localdomain sudo[148433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:05 np0005538513.localdomain python3.9[148435]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:05 np0005538513.localdomain sudo[148433]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:06 np0005538513.localdomain sudo[148525]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbrzosbrqqmziipqvmcikuspwokeeczd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321726.0068212-731-66104851240392/AnsiballZ_command.py
Nov 28 09:22:06 np0005538513.localdomain sudo[148525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25988 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=1338736920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13BD420000000001030307) 
Nov 28 09:22:06 np0005538513.localdomain python3.9[148527]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:06 np0005538513.localdomain sudo[148525]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:06 np0005538513.localdomain sudo[148618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beasitbvalzmnyliixzqmjfxzviluwbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321726.6497648-755-58412073132360/AnsiballZ_stat.py
Nov 28 09:22:06 np0005538513.localdomain sudo[148618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:07 np0005538513.localdomain python3.9[148620]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:07 np0005538513.localdomain sudo[148618]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:07 np0005538513.localdomain sudo[148712]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llnpsyqfignjixiyczvczurojhsrkkoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321727.411491-779-89845395277179/AnsiballZ_command.py
Nov 28 09:22:07 np0005538513.localdomain sudo[148712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:07 np0005538513.localdomain python3.9[148714]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:07 np0005538513.localdomain sudo[148712]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:08 np0005538513.localdomain sudo[148807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgnsobknbfnvlwitphacxmgpszyeathd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321728.0293617-803-29517187231800/AnsiballZ_file.py
Nov 28 09:22:08 np0005538513.localdomain sudo[148807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:08 np0005538513.localdomain python3.9[148809]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:08 np0005538513.localdomain sudo[148807]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2912 DF PROTO=TCP SPT=51904 DPT=9100 SEQ=1746863639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13C9820000000001030307) 
Nov 28 09:22:09 np0005538513.localdomain python3.9[148899]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:22:10 np0005538513.localdomain sudo[148990]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdpsfsgfewyganqbbuojddsuugnbhnau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321730.4641001-923-196835129973059/AnsiballZ_command.py
Nov 28 09:22:10 np0005538513.localdomain sudo[148990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:10 np0005538513.localdomain python3.9[148992]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005538513.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:28:f9:1a:af" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:10 np0005538513.localdomain ovs-vsctl[148993]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005538513.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:28:f9:1a:af external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 28 09:22:10 np0005538513.localdomain sudo[148990]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:11 np0005538513.localdomain sudo[149083]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovdcqflzplktudmswordxihkfdzrbltr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321731.207146-950-239265760313358/AnsiballZ_command.py
Nov 28 09:22:11 np0005538513.localdomain sudo[149083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:11 np0005538513.localdomain python3.9[149085]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:11 np0005538513.localdomain sudo[149083]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:12 np0005538513.localdomain python3.9[149178]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25990 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=1338736920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13D5020000000001030307) 
Nov 28 09:22:12 np0005538513.localdomain sudo[149270]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsokrsjbxbjqbrtzunvclrfkawwoutxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321732.6721559-1004-139217721560008/AnsiballZ_file.py
Nov 28 09:22:12 np0005538513.localdomain sudo[149270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:13 np0005538513.localdomain python3.9[149272]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:13 np0005538513.localdomain sudo[149270]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:13 np0005538513.localdomain sudo[149362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dohwksmsynkrrsxoufneuonzvbtzczby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321733.3385124-1028-65068541237205/AnsiballZ_stat.py
Nov 28 09:22:13 np0005538513.localdomain sudo[149362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:13 np0005538513.localdomain python3.9[149364]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:13 np0005538513.localdomain sudo[149362]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:14 np0005538513.localdomain sudo[149410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mansdoayrenbtrbondoxknjyuhcuidxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321733.3385124-1028-65068541237205/AnsiballZ_file.py
Nov 28 09:22:14 np0005538513.localdomain sudo[149410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:14 np0005538513.localdomain python3.9[149412]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:14 np0005538513.localdomain sudo[149410]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:14 np0005538513.localdomain sudo[149502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-johlgkfbbqmhgasyyhdccsgdffqubela ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321734.3459604-1028-171737382748752/AnsiballZ_stat.py
Nov 28 09:22:14 np0005538513.localdomain sudo[149502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:14 np0005538513.localdomain python3.9[149504]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:14 np0005538513.localdomain sudo[149502]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:14 np0005538513.localdomain sudo[149550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnjpxhnjkxwciaaxcyptprosjbpboygy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321734.3459604-1028-171737382748752/AnsiballZ_file.py
Nov 28 09:22:14 np0005538513.localdomain sudo[149550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:15 np0005538513.localdomain python3.9[149552]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:15 np0005538513.localdomain sudo[149550]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:15 np0005538513.localdomain sudo[149642]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbjlqeofydmlzenozypbttxyfvnkxnmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321735.3576763-1097-187206935493196/AnsiballZ_file.py
Nov 28 09:22:15 np0005538513.localdomain sudo[149642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:15 np0005538513.localdomain python3.9[149644]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:15 np0005538513.localdomain sudo[149642]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:16 np0005538513.localdomain sudo[149734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dovnotlnzcpcaqzrdfzfwhyvhkdqwocx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321735.9860308-1121-248346746701314/AnsiballZ_stat.py
Nov 28 09:22:16 np0005538513.localdomain sudo[149734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:16 np0005538513.localdomain python3.9[149736]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:16 np0005538513.localdomain sudo[149734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:16 np0005538513.localdomain sudo[149782]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buffhrbgwqdsepbhubfqhhreqdtlssaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321735.9860308-1121-248346746701314/AnsiballZ_file.py
Nov 28 09:22:16 np0005538513.localdomain sudo[149782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:16 np0005538513.localdomain python3.9[149784]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:16 np0005538513.localdomain sudo[149782]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23834 DF PROTO=TCP SPT=54550 DPT=9882 SEQ=1439506991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13E7820000000001030307) 
Nov 28 09:22:17 np0005538513.localdomain sudo[149874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugutukyzpnkdzdykhmsppflpckgombjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321737.0681949-1157-197886961122178/AnsiballZ_stat.py
Nov 28 09:22:17 np0005538513.localdomain sudo[149874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:17 np0005538513.localdomain python3.9[149876]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:17 np0005538513.localdomain sudo[149874]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:17 np0005538513.localdomain sudo[149922]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkuvqemmobfdnrucdyqbzedylqajgtgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321737.0681949-1157-197886961122178/AnsiballZ_file.py
Nov 28 09:22:17 np0005538513.localdomain sudo[149922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:17 np0005538513.localdomain python3.9[149924]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:17 np0005538513.localdomain sudo[149922]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:18 np0005538513.localdomain sudo[150014]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uilmdvzuymakozushgxmujevjlxzparo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321738.1889255-1193-239112750176494/AnsiballZ_systemd.py
Nov 28 09:22:18 np0005538513.localdomain sudo[150014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:18 np0005538513.localdomain python3.9[150016]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:22:18 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:22:18 np0005538513.localdomain systemd-rc-local-generator[150038]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:18 np0005538513.localdomain systemd-sysv-generator[150043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:18 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10603 DF PROTO=TCP SPT=38760 DPT=9105 SEQ=4002833696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13EF020000000001030307) 
Nov 28 09:22:20 np0005538513.localdomain sudo[150014]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:20 np0005538513.localdomain sudo[150143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-novkrzocfgwddzhcfmvadjzvpcvqidew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321740.2635393-1217-125681972245705/AnsiballZ_stat.py
Nov 28 09:22:20 np0005538513.localdomain sudo[150143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:20 np0005538513.localdomain python3.9[150145]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:20 np0005538513.localdomain sudo[150143]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:20 np0005538513.localdomain sudo[150191]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzetsmwcgsgpazbyjulyumvpribtvznb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321740.2635393-1217-125681972245705/AnsiballZ_file.py
Nov 28 09:22:20 np0005538513.localdomain sudo[150191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10604 DF PROTO=TCP SPT=38760 DPT=9105 SEQ=4002833696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13F7020000000001030307) 
Nov 28 09:22:21 np0005538513.localdomain python3.9[150193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:21 np0005538513.localdomain sudo[150191]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:22 np0005538513.localdomain sudo[150283]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iygmmotywvllwkkpotllifvsmqmbuwzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321742.4433508-1253-193373884555211/AnsiballZ_stat.py
Nov 28 09:22:22 np0005538513.localdomain sudo[150283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:22 np0005538513.localdomain python3.9[150285]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:22 np0005538513.localdomain sudo[150283]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:23 np0005538513.localdomain sudo[150331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maipqpvofjpwravldkxppjbbrneihhps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321742.4433508-1253-193373884555211/AnsiballZ_file.py
Nov 28 09:22:23 np0005538513.localdomain sudo[150331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:23 np0005538513.localdomain python3.9[150333]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:23 np0005538513.localdomain sudo[150331]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:23 np0005538513.localdomain sudo[150423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olrztyejblgvbfuyiikaeulcxsjtliqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321743.557369-1289-90241429129519/AnsiballZ_systemd.py
Nov 28 09:22:23 np0005538513.localdomain sudo[150423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:24 np0005538513.localdomain python3.9[150425]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:22:24 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:22:24 np0005538513.localdomain systemd-sysv-generator[150453]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:24 np0005538513.localdomain systemd-rc-local-generator[150449]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:24 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:22:24 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:22:24 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:22:24 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:22:24 np0005538513.localdomain sudo[150423]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10605 DF PROTO=TCP SPT=38760 DPT=9105 SEQ=4002833696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1406C20000000001030307) 
Nov 28 09:22:25 np0005538513.localdomain sudo[150557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhwengmblkteqnsdpzqmjohawhomvdcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321744.977894-1319-58316998185814/AnsiballZ_file.py
Nov 28 09:22:25 np0005538513.localdomain sudo[150557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:25 np0005538513.localdomain python3.9[150559]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:25 np0005538513.localdomain sudo[150557]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:25 np0005538513.localdomain sudo[150649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdoqjnmmavfhgqhwyysasbippdwajcor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321745.6111827-1343-50852746497548/AnsiballZ_stat.py
Nov 28 09:22:25 np0005538513.localdomain sudo[150649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:26 np0005538513.localdomain python3.9[150651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:26 np0005538513.localdomain sudo[150649]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:26 np0005538513.localdomain sudo[150722]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgereyhawhmgqxpodbhitnjbmzivssyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321745.6111827-1343-50852746497548/AnsiballZ_copy.py
Nov 28 09:22:26 np0005538513.localdomain sudo[150722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:26 np0005538513.localdomain python3.9[150724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321745.6111827-1343-50852746497548/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:26 np0005538513.localdomain sudo[150722]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:27 np0005538513.localdomain sudo[150814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovfovghdsvbmoetowlzluilmgvibpwwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321747.0541978-1394-8156263630336/AnsiballZ_file.py
Nov 28 09:22:27 np0005538513.localdomain sudo[150814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:27 np0005538513.localdomain python3.9[150816]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:27 np0005538513.localdomain sudo[150814]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:27 np0005538513.localdomain sudo[150831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:22:27 np0005538513.localdomain sudo[150831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:22:27 np0005538513.localdomain sudo[150831]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:27 np0005538513.localdomain sudo[150846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:22:27 np0005538513.localdomain sudo[150846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:22:27 np0005538513.localdomain sudo[150936]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbzkiwhbudyvhoizvjdvtalrposzemof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321747.7073126-1418-738027007607/AnsiballZ_stat.py
Nov 28 09:22:27 np0005538513.localdomain sudo[150936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37497 DF PROTO=TCP SPT=51806 DPT=9101 SEQ=1834210490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1411FA0000000001030307) 
Nov 28 09:22:28 np0005538513.localdomain python3.9[150938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:28 np0005538513.localdomain sudo[150936]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:28 np0005538513.localdomain sudo[150846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:28 np0005538513.localdomain sudo[151042]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqidhhlyautxndfxhixofbxatocnstky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321747.7073126-1418-738027007607/AnsiballZ_copy.py
Nov 28 09:22:28 np0005538513.localdomain sudo[151042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:28 np0005538513.localdomain python3.9[151044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321747.7073126-1418-738027007607/.source.json _original_basename=.edz0brm4 follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:28 np0005538513.localdomain sudo[151042]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:28 np0005538513.localdomain sudo[151059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:22:28 np0005538513.localdomain sudo[151059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:22:28 np0005538513.localdomain sudo[151059]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:29 np0005538513.localdomain sudo[151149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hijbthldlsjxormzjmkqbqgspfhcsqvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321748.952236-1463-121541741578097/AnsiballZ_file.py
Nov 28 09:22:29 np0005538513.localdomain sudo[151149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:29 np0005538513.localdomain python3.9[151151]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:29 np0005538513.localdomain sudo[151149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:29 np0005538513.localdomain sudo[151241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cztpjsodfghlfjeljvqmmqyogpddznlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321749.616662-1487-152500213979187/AnsiballZ_stat.py
Nov 28 09:22:29 np0005538513.localdomain sudo[151241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:30 np0005538513.localdomain sudo[151241]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:30 np0005538513.localdomain sudo[151314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcpoyvnqqdwbtlnlxfjpkgjdjqbgjocc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321749.616662-1487-152500213979187/AnsiballZ_copy.py
Nov 28 09:22:30 np0005538513.localdomain sudo[151314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:30 np0005538513.localdomain sudo[151314]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37499 DF PROTO=TCP SPT=51806 DPT=9101 SEQ=1834210490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB141E020000000001030307) 
Nov 28 09:22:31 np0005538513.localdomain sudo[151406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjxvhrledbvxfveejmhkrhzxofaylewe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321751.1559973-1538-30291263024703/AnsiballZ_container_config_data.py
Nov 28 09:22:31 np0005538513.localdomain sudo[151406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:32 np0005538513.localdomain python3.9[151408]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 28 09:22:32 np0005538513.localdomain sudo[151406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:32 np0005538513.localdomain sudo[151498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofegvlxuabxzpcwpcwkrzrsvlispmshf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321752.3798904-1565-266515521423440/AnsiballZ_container_config_hash.py
Nov 28 09:22:32 np0005538513.localdomain sudo[151498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:32 np0005538513.localdomain python3.9[151500]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:22:32 np0005538513.localdomain sudo[151498]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21783 DF PROTO=TCP SPT=34170 DPT=9102 SEQ=1601837050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1427020000000001030307) 
Nov 28 09:22:34 np0005538513.localdomain sudo[151590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voanbjvzsrxhgsarzvzfsczdfxbzlbbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321754.4761431-1592-219130510017804/AnsiballZ_podman_container_info.py
Nov 28 09:22:34 np0005538513.localdomain sudo[151590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:34 np0005538513.localdomain python3.9[151592]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:22:35 np0005538513.localdomain sudo[151590]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:36 np0005538513.localdomain sshd[151634]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:22:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20579 DF PROTO=TCP SPT=59158 DPT=9100 SEQ=703998991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1432820000000001030307) 
Nov 28 09:22:37 np0005538513.localdomain sshd[151634]: Invalid user solv from 80.94.92.182 port 47822
Nov 28 09:22:37 np0005538513.localdomain sshd[151634]: Connection closed by invalid user solv 80.94.92.182 port 47822 [preauth]
Nov 28 09:22:38 np0005538513.localdomain sudo[151711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unglfgzvcqtuqiemifwklxcmcgbioobx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321758.0839663-1631-10540360913575/AnsiballZ_edpm_container_manage.py
Nov 28 09:22:38 np0005538513.localdomain sudo[151711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:38 np0005538513.localdomain python3[151713]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:22:39 np0005538513.localdomain python3[151713]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69",
                                                                    "Digest": "sha256:7ab0ee81fdc9b162df9b50eb2e264c777d08f90975a442620ec940edabe300b2",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:7ab0ee81fdc9b162df9b50eb2e264c777d08f90975a442620ec940edabe300b2"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:43:38.999472418Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 345745352,
                                                                    "VirtualSize": 345745352,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:41a433848ac42a81e513766649f77cfa09e37aae045bcbbb33be77f7cf86edc4",
                                                                              "sha256:055d9012b48b3c8064accd40b6372c79c29fedd85061a710ada00677f88b1db9"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:26.691247936Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:15:32.288422734Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:15:33.83333928Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:42:58.179075923Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:43:38.997189664Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:43:40.109412373Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:22:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50295 DF PROTO=TCP SPT=53066 DPT=9100 SEQ=1538634092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB143D820000000001030307) 
Nov 28 09:22:39 np0005538513.localdomain podman[151764]: 2025-11-28 09:22:39.267415631 +0000 UTC m=+0.093466262 container remove 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 09:22:39 np0005538513.localdomain python3[151713]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Nov 28 09:22:39 np0005538513.localdomain podman[151777]: 
Nov 28 09:22:39 np0005538513.localdomain podman[151777]: 2025-11-28 09:22:39.373524679 +0000 UTC m=+0.086916839 container create 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 09:22:39 np0005538513.localdomain podman[151777]: 2025-11-28 09:22:39.334291437 +0000 UTC m=+0.047683667 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:22:39 np0005538513.localdomain python3[151713]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:22:39 np0005538513.localdomain sudo[151711]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:40 np0005538513.localdomain sudo[151904]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unomdwtuellcehxvrewbjfwjxfnlmmct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321759.7603366-1655-144220970362294/AnsiballZ_stat.py
Nov 28 09:22:40 np0005538513.localdomain sudo[151904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:40 np0005538513.localdomain python3.9[151906]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:40 np0005538513.localdomain sudo[151904]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:40 np0005538513.localdomain sudo[151998]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbjdabuixjdbtyxzvbmdbapxkdwfeunm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321760.5939267-1682-115880274968409/AnsiballZ_file.py
Nov 28 09:22:40 np0005538513.localdomain sudo[151998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:41 np0005538513.localdomain python3.9[152000]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:41 np0005538513.localdomain sudo[151998]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:41 np0005538513.localdomain sudo[152044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ochgeubpyyncdrwxvmytkkaawkqhgjee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321760.5939267-1682-115880274968409/AnsiballZ_stat.py
Nov 28 09:22:41 np0005538513.localdomain sudo[152044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:41 np0005538513.localdomain python3.9[152046]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:41 np0005538513.localdomain sudo[152044]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:41 np0005538513.localdomain sudo[152135]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxoaerxwbhewvtqetbxkblpjqydsztuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321761.5425732-1682-228749997265211/AnsiballZ_copy.py
Nov 28 09:22:41 np0005538513.localdomain sudo[152135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:42 np0005538513.localdomain python3.9[152137]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764321761.5425732-1682-228749997265211/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:42 np0005538513.localdomain sudo[152135]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:42 np0005538513.localdomain sudo[152181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjnmyysdmbekphvgsxafrsgdbdexgicd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321761.5425732-1682-228749997265211/AnsiballZ_systemd.py
Nov 28 09:22:42 np0005538513.localdomain sudo[152181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20581 DF PROTO=TCP SPT=59158 DPT=9100 SEQ=703998991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB144A420000000001030307) 
Nov 28 09:22:42 np0005538513.localdomain python3.9[152183]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:22:42 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:22:42 np0005538513.localdomain systemd-rc-local-generator[152205]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:42 np0005538513.localdomain systemd-sysv-generator[152209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:43 np0005538513.localdomain sudo[152181]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:43 np0005538513.localdomain sudo[152263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duzdjigrmsluomvfwsjzhcmavyygowma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321761.5425732-1682-228749997265211/AnsiballZ_systemd.py
Nov 28 09:22:43 np0005538513.localdomain sudo[152263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:43 np0005538513.localdomain python3.9[152265]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:22:43 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:22:43 np0005538513.localdomain systemd-sysv-generator[152298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:43 np0005538513.localdomain systemd-rc-local-generator[152294]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:43 np0005538513.localdomain systemd[1]: Starting ovn_controller container...
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:22:44 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e76290f26abea91a0c30e5a77de17af49be781827908e370bab34dfbcbda46f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:22:44 np0005538513.localdomain podman[152307]: 2025-11-28 09:22:44.156243396 +0000 UTC m=+0.139304869 container init 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + sudo -E kolla_set_configs
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:22:44 np0005538513.localdomain podman[152307]: 2025-11-28 09:22:44.185034701 +0000 UTC m=+0.168096124 container start 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:22:44 np0005538513.localdomain edpm-start-podman-container[152307]: ovn_controller
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:22:44 np0005538513.localdomain podman[152329]: 2025-11-28 09:22:44.288340193 +0000 UTC m=+0.098373323 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:22:44 np0005538513.localdomain podman[152329]: 2025-11-28 09:22:44.332249059 +0000 UTC m=+0.142282149 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:22:44 np0005538513.localdomain edpm-start-podman-container[152306]: Creating additional drop-in dependency for "ovn_controller" (9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c)
Nov 28 09:22:44 np0005538513.localdomain podman[152329]: unhealthy
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Failed with result 'exit-code'.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Queued start job for default target Main User Target.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Created slice User Application Slice.
Nov 28 09:22:44 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 28 09:22:44 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 09:22:44 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Reached target Paths.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Reached target Timers.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Starting D-Bus User Message Bus Socket...
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Starting Create User's Volatile Files and Directories...
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Reached target Sockets.
Nov 28 09:22:44 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Finished Create User's Volatile Files and Directories.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Reached target Basic System.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Reached target Main User Target.
Nov 28 09:22:44 np0005538513.localdomain systemd[152352]: Startup finished in 138ms.
Nov 28 09:22:44 np0005538513.localdomain systemd-rc-local-generator[152407]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:44 np0005538513.localdomain systemd-sysv-generator[152414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Started ovn_controller container.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Started Session c12 of User root.
Nov 28 09:22:44 np0005538513.localdomain sudo[152263]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: INFO:__main__:Validating config file
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: INFO:__main__:Writing out command to execute
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: ++ cat /run_command
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + ARGS=
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + sudo kolla_copy_cacerts
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: Started Session c13 of User root.
Nov 28 09:22:44 np0005538513.localdomain systemd[1]: session-c13.scope: Deactivated successfully.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + [[ ! -n '' ]]
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + . kolla_extend_start
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + umask 0022
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00013|main|INFO|OVS feature set changed, force recompute.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00021|main|INFO|OVS feature set changed, force recompute.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-07900d-0
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-c3237d-0
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-11aa47-0
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00026|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00027|binding|INFO|Claiming lport 09612b07-5142-4b0f-9dab-74bf4403f69f for this chassis.
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00028|binding|INFO|09612b07-5142-4b0f-9dab-74bf4403f69f: Claiming fa:16:3e:f4:fc:6c 192.168.0.142
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00029|binding|INFO|Removing lport 09612b07-5142-4b0f-9dab-74bf4403f69f ovn-installed in OVS
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-07900d-0
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-c3237d-0
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-11aa47-0
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00033|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00034|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00035|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00036|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00037|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00038|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:44Z|00039|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:45 np0005538513.localdomain sudo[152521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-leuadcdvwzuudzpbfgofkwgvvmjbuuqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321765.3735628-1766-138088958141096/AnsiballZ_command.py
Nov 28 09:22:45 np0005538513.localdomain sudo[152521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:45Z|00040|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:45 np0005538513.localdomain python3.9[152523]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:45 np0005538513.localdomain ovs-vsctl[152524]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 28 09:22:45 np0005538513.localdomain sudo[152521]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:45Z|00041|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:45Z|00042|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:22:46 np0005538513.localdomain sudo[152614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvnliebhmtiqmqjlyenjngnvuyhtdqsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321766.0462608-1790-175528347028785/AnsiballZ_command.py
Nov 28 09:22:46 np0005538513.localdomain sudo[152614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:46 np0005538513.localdomain python3.9[152616]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:46 np0005538513.localdomain ovs-vsctl[152618]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 28 09:22:46 np0005538513.localdomain sudo[152614]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1580 DF PROTO=TCP SPT=59038 DPT=9882 SEQ=781079388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB145CC30000000001030307) 
Nov 28 09:22:47 np0005538513.localdomain sudo[152709]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjkuwivykxeqjpwytfteuejbjimrcleo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321767.549184-1832-35584107563986/AnsiballZ_command.py
Nov 28 09:22:47 np0005538513.localdomain sudo[152709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:47 np0005538513.localdomain python3.9[152711]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:47 np0005538513.localdomain ovs-vsctl[152712]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 28 09:22:48 np0005538513.localdomain sudo[152709]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:48 np0005538513.localdomain sshd[145825]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:22:48 np0005538513.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Nov 28 09:22:48 np0005538513.localdomain systemd[1]: session-49.scope: Consumed 39.933s CPU time.
Nov 28 09:22:48 np0005538513.localdomain systemd-logind[764]: Session 49 logged out. Waiting for processes to exit.
Nov 28 09:22:48 np0005538513.localdomain systemd-logind[764]: Removed session 49.
Nov 28 09:22:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17756 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1464420000000001030307) 
Nov 28 09:22:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17757 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB146C430000000001030307) 
Nov 28 09:22:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:52Z|00043|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f ovn-installed in OVS
Nov 28 09:22:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:22:52Z|00044|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f up in Southbound
Nov 28 09:22:54 np0005538513.localdomain sshd[152727]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:22:54 np0005538513.localdomain sshd[152727]: Accepted publickey for zuul from 192.168.122.30 port 47864 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:22:54 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Activating special unit Exit the Session...
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Stopped target Main User Target.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Stopped target Basic System.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Stopped target Paths.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Stopped target Sockets.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Stopped target Timers.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Closed D-Bus User Message Bus Socket.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Removed slice User Application Slice.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Reached target Shutdown.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Finished Exit the Session.
Nov 28 09:22:54 np0005538513.localdomain systemd[152352]: Reached target Exit the Session.
Nov 28 09:22:55 np0005538513.localdomain systemd-logind[764]: New session 51 of user zuul.
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: Started Session 51 of User zuul.
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 09:22:55 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 09:22:55 np0005538513.localdomain sshd[152727]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:22:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17758 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB147C030000000001030307) 
Nov 28 09:22:56 np0005538513.localdomain python3.9[152824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:22:57 np0005538513.localdomain sudo[152918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pshffokdrysfoxoukjjhppnyjrxfxxeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321777.151131-62-100189092208800/AnsiballZ_file.py
Nov 28 09:22:57 np0005538513.localdomain sudo[152918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:57 np0005538513.localdomain python3.9[152920]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:57 np0005538513.localdomain sudo[152918]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46699 DF PROTO=TCP SPT=42092 DPT=9101 SEQ=2393569372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14872A0000000001030307) 
Nov 28 09:22:58 np0005538513.localdomain sudo[153010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idecuoinemsvjgjbkskcccgtyorlkhfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321777.9160056-62-87696246035858/AnsiballZ_file.py
Nov 28 09:22:58 np0005538513.localdomain sudo[153010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:58 np0005538513.localdomain python3.9[153012]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:58 np0005538513.localdomain sudo[153010]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:59 np0005538513.localdomain sudo[153102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqvibjzjaqnzvqrvxaqytmisukzzvdtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321778.5128677-62-203015235610581/AnsiballZ_file.py
Nov 28 09:22:59 np0005538513.localdomain sudo[153102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:59 np0005538513.localdomain python3.9[153104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:59 np0005538513.localdomain sudo[153102]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:00 np0005538513.localdomain sudo[153194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lipjbteevzvtgjuanmsudaoblzevgkpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321779.7811632-62-156388457490674/AnsiballZ_file.py
Nov 28 09:23:00 np0005538513.localdomain sudo[153194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:00 np0005538513.localdomain python3.9[153196]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:00 np0005538513.localdomain sudo[153194]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:00 np0005538513.localdomain sudo[153286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wefexypbwmtdthhktjotvbxdgecltuyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321780.539952-62-18151476187770/AnsiballZ_file.py
Nov 28 09:23:00 np0005538513.localdomain sudo[153286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:00 np0005538513.localdomain python3.9[153288]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:00 np0005538513.localdomain sudo[153286]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46701 DF PROTO=TCP SPT=42092 DPT=9101 SEQ=2393569372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1493420000000001030307) 
Nov 28 09:23:01 np0005538513.localdomain python3.9[153378]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:23:02 np0005538513.localdomain sudo[153468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqohyfcgvnbxnkdlxxxgibwwgeldlylq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321781.9570541-194-177473571391323/AnsiballZ_seboolean.py
Nov 28 09:23:02 np0005538513.localdomain sudo[153468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:02 np0005538513.localdomain python3.9[153470]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 09:23:02 np0005538513.localdomain sudo[153468]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17759 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB149B820000000001030307) 
Nov 28 09:23:03 np0005538513.localdomain python3.9[153561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:04 np0005538513.localdomain python3.9[153634]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321782.9446158-218-87392814631384/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:04 np0005538513.localdomain python3.9[153724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:05 np0005538513.localdomain python3.9[153797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321784.3537-263-196086026412499/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:05 np0005538513.localdomain sudo[153887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avdlhazpjpimieoisgouymurvzwihpqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321785.583253-314-227468372245533/AnsiballZ_setup.py
Nov 28 09:23:05 np0005538513.localdomain sudo[153887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:06 np0005538513.localdomain python3.9[153889]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:23:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16218 DF PROTO=TCP SPT=57288 DPT=9102 SEQ=1664019527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14A7830000000001030307) 
Nov 28 09:23:06 np0005538513.localdomain sudo[153887]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:06 np0005538513.localdomain sudo[153941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojabxdvmaezyetzlxfjuenrcrepxtknj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321785.583253-314-227468372245533/AnsiballZ_dnf.py
Nov 28 09:23:06 np0005538513.localdomain sudo[153941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:07 np0005538513.localdomain python3.9[153943]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:23:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25993 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=1338736920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14B3820000000001030307) 
Nov 28 09:23:10 np0005538513.localdomain sudo[153941]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:10 np0005538513.localdomain sudo[154035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iknwrnirxvtttqwrszlwcjxfzfmbikce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321790.2597823-350-238161117789591/AnsiballZ_systemd.py
Nov 28 09:23:10 np0005538513.localdomain sudo[154035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:11 np0005538513.localdomain python3.9[154037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:23:11 np0005538513.localdomain sudo[154035]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:11 np0005538513.localdomain python3.9[154130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:12 np0005538513.localdomain python3.9[154201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321791.4540122-374-146874500130316/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19397 DF PROTO=TCP SPT=57738 DPT=9100 SEQ=970466610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14BF820000000001030307) 
Nov 28 09:23:13 np0005538513.localdomain python3.9[154291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:13 np0005538513.localdomain python3.9[154362]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321792.4888258-374-92163016567378/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:23:14 np0005538513.localdomain python3.9[154452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:14 np0005538513.localdomain podman[154453]: 2025-11-28 09:23:14.876362224 +0000 UTC m=+0.105230194 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 09:23:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:23:14Z|00045|memory|INFO|16988 kB peak resident set size after 30.1 seconds
Nov 28 09:23:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:23:14Z|00046|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:289 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:154 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:67
Nov 28 09:23:14 np0005538513.localdomain podman[154453]: 2025-11-28 09:23:14.919424098 +0000 UTC m=+0.148292058 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:23:14 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:23:15 np0005538513.localdomain python3.9[154548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321794.4513798-506-46654658540954/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:15 np0005538513.localdomain python3.9[154638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:16 np0005538513.localdomain python3.9[154709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321795.5571332-506-117245659445644/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:17 np0005538513.localdomain python3.9[154799]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:23:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56445 DF PROTO=TCP SPT=54420 DPT=9882 SEQ=1838728032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14D2030000000001030307) 
Nov 28 09:23:17 np0005538513.localdomain sudo[154891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccomdwuudqqbkkkhhdvtdwdnmqnxrhbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321797.3999782-620-61575864875249/AnsiballZ_file.py
Nov 28 09:23:17 np0005538513.localdomain sudo[154891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:17 np0005538513.localdomain python3.9[154893]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:17 np0005538513.localdomain sudo[154891]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:18 np0005538513.localdomain sudo[154983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaxiqnhmwpgckncrrdxygzqxbyrpblyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321798.5551994-644-217141071858718/AnsiballZ_stat.py
Nov 28 09:23:18 np0005538513.localdomain sudo[154983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:19 np0005538513.localdomain python3.9[154985]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:19 np0005538513.localdomain sudo[154983]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18393 DF PROTO=TCP SPT=54340 DPT=9105 SEQ=1226643617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14D9820000000001030307) 
Nov 28 09:23:19 np0005538513.localdomain sudo[155031]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mksjfcymzdmuktawohhnzdyaidvstcqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321798.5551994-644-217141071858718/AnsiballZ_file.py
Nov 28 09:23:19 np0005538513.localdomain sudo[155031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:19 np0005538513.localdomain python3.9[155033]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:19 np0005538513.localdomain sudo[155031]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:19 np0005538513.localdomain sudo[155123]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqnjxfqitsiwhdvemxtiognpkptejtmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321799.6805298-644-90358056969684/AnsiballZ_stat.py
Nov 28 09:23:19 np0005538513.localdomain sudo[155123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:20 np0005538513.localdomain python3.9[155125]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:20 np0005538513.localdomain sudo[155123]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:20 np0005538513.localdomain sudo[155171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fetzbmoebwfmrjmfvorxiqwjblbkomfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321799.6805298-644-90358056969684/AnsiballZ_file.py
Nov 28 09:23:20 np0005538513.localdomain sudo[155171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:20 np0005538513.localdomain python3.9[155173]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:20 np0005538513.localdomain sudo[155171]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18394 DF PROTO=TCP SPT=54340 DPT=9105 SEQ=1226643617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14E1820000000001030307) 
Nov 28 09:23:21 np0005538513.localdomain sudo[155263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxabgewoiphihizlcdatpznjdhxpjijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321801.3629265-713-55951344398825/AnsiballZ_file.py
Nov 28 09:23:21 np0005538513.localdomain sudo[155263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:21 np0005538513.localdomain python3.9[155265]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:21 np0005538513.localdomain sudo[155263]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:22 np0005538513.localdomain sudo[155355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddanlabjgvlqodddnpuogqrwqhgijtly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321802.0682886-737-13820257401520/AnsiballZ_stat.py
Nov 28 09:23:22 np0005538513.localdomain sudo[155355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:22 np0005538513.localdomain python3.9[155357]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:22 np0005538513.localdomain sudo[155355]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:22 np0005538513.localdomain sudo[155403]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-butkqlpeanjpjgqimgifszjbfbgvgvvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321802.0682886-737-13820257401520/AnsiballZ_file.py
Nov 28 09:23:22 np0005538513.localdomain sudo[155403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:23:22Z|00047|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 28 09:23:23 np0005538513.localdomain python3.9[155405]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:23 np0005538513.localdomain sudo[155403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:23 np0005538513.localdomain sudo[155495]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwgmrkxzsosnpkxyjtfkokfkjgwnooaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321803.2387078-773-97802465769294/AnsiballZ_stat.py
Nov 28 09:23:23 np0005538513.localdomain sudo[155495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:23 np0005538513.localdomain python3.9[155497]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:23 np0005538513.localdomain sudo[155495]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:23 np0005538513.localdomain sudo[155543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skiiypconpbzdamkwwvjuyoxinirbqxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321803.2387078-773-97802465769294/AnsiballZ_file.py
Nov 28 09:23:23 np0005538513.localdomain sudo[155543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:24 np0005538513.localdomain python3.9[155545]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:24 np0005538513.localdomain sudo[155543]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:24 np0005538513.localdomain sudo[155635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhnjqumrwnlzbosadznfvkhbhdsjtirg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321804.3365278-809-162653780298003/AnsiballZ_systemd.py
Nov 28 09:23:24 np0005538513.localdomain sudo[155635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:24 np0005538513.localdomain python3.9[155637]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:23:24 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:23:25 np0005538513.localdomain systemd-rc-local-generator[155658]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:25 np0005538513.localdomain systemd-sysv-generator[155665]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:25 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18395 DF PROTO=TCP SPT=54340 DPT=9105 SEQ=1226643617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14F1420000000001030307) 
Nov 28 09:23:25 np0005538513.localdomain sudo[155635]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:25 np0005538513.localdomain sudo[155765]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjslxgdvyzqbkgfbrwdidjzuotexuagn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321805.4876175-833-53147251895589/AnsiballZ_stat.py
Nov 28 09:23:25 np0005538513.localdomain sudo[155765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:25 np0005538513.localdomain python3.9[155767]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:26 np0005538513.localdomain sudo[155765]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:26 np0005538513.localdomain sudo[155813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujqhyvfqzmymdlslgtfljelnmmwkkkfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321805.4876175-833-53147251895589/AnsiballZ_file.py
Nov 28 09:23:26 np0005538513.localdomain sudo[155813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:26 np0005538513.localdomain python3.9[155815]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:26 np0005538513.localdomain sudo[155813]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:26 np0005538513.localdomain sudo[155905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pceoxcebifxbcyntuxvhpkiimjxpqiff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321806.630893-869-236147294605261/AnsiballZ_stat.py
Nov 28 09:23:26 np0005538513.localdomain sudo[155905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:27 np0005538513.localdomain python3.9[155907]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:27 np0005538513.localdomain sudo[155905]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:27 np0005538513.localdomain sudo[155953]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqvmaodgxmipclbfoetsbzfkmoylzeju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321806.630893-869-236147294605261/AnsiballZ_file.py
Nov 28 09:23:27 np0005538513.localdomain sudo[155953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:27 np0005538513.localdomain python3.9[155955]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:27 np0005538513.localdomain sudo[155953]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:28 np0005538513.localdomain sudo[156045]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucyfpshxnjsnqzfepetabdxjuwoydafd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321807.737149-905-140755421162339/AnsiballZ_systemd.py
Nov 28 09:23:28 np0005538513.localdomain sudo[156045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37344 DF PROTO=TCP SPT=49172 DPT=9101 SEQ=1056236817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14FC5A0000000001030307) 
Nov 28 09:23:28 np0005538513.localdomain python3.9[156047]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:23:28 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:23:28 np0005538513.localdomain systemd-rc-local-generator[156073]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:28 np0005538513.localdomain systemd-sysv-generator[156076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:28 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:23:28 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:23:28 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:23:28 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:23:28 np0005538513.localdomain sudo[156045]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:29 np0005538513.localdomain sudo[156130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:23:29 np0005538513.localdomain sudo[156130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:29 np0005538513.localdomain sudo[156130]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:29 np0005538513.localdomain sudo[156164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:23:29 np0005538513.localdomain sudo[156164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:29 np0005538513.localdomain sudo[156209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnwcmminogpcjjehlxtdnogkzuuhimqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321809.1100185-935-44567810958386/AnsiballZ_file.py
Nov 28 09:23:29 np0005538513.localdomain sudo[156209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:29 np0005538513.localdomain python3.9[156211]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:29 np0005538513.localdomain sudo[156209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538513.localdomain sudo[156379]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hctnagjulvihfytsjgadvyklldgxmthl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321809.8248074-959-142873168532102/AnsiballZ_stat.py
Nov 28 09:23:30 np0005538513.localdomain sudo[156379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:30 np0005538513.localdomain systemd[1]: tmp-crun.NQYsSb.mount: Deactivated successfully.
Nov 28 09:23:30 np0005538513.localdomain podman[156365]: 2025-11-28 09:23:30.172984062 +0000 UTC m=+0.111030158 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-type=git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, version=7)
Nov 28 09:23:30 np0005538513.localdomain python3.9[156386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:30 np0005538513.localdomain podman[156365]: 2025-11-28 09:23:30.314660051 +0000 UTC m=+0.252706197 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:23:30 np0005538513.localdomain sudo[156379]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538513.localdomain sudo[156164]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538513.localdomain sudo[156495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:23:30 np0005538513.localdomain sudo[156495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:30 np0005538513.localdomain sudo[156495]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538513.localdomain sudo[156525]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpnihlzivhzwhtkellcsgkfznikgqmdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321809.8248074-959-142873168532102/AnsiballZ_copy.py
Nov 28 09:23:30 np0005538513.localdomain sudo[156525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:30 np0005538513.localdomain sudo[156528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:23:30 np0005538513.localdomain sudo[156528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:30 np0005538513.localdomain python3.9[156532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321809.8248074-959-142873168532102/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:30 np0005538513.localdomain sudo[156525]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37346 DF PROTO=TCP SPT=49172 DPT=9101 SEQ=1056236817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1508420000000001030307) 
Nov 28 09:23:31 np0005538513.localdomain sudo[156528]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:31 np0005538513.localdomain sudo[156635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:23:31 np0005538513.localdomain sudo[156635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:31 np0005538513.localdomain sudo[156635]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:32 np0005538513.localdomain sudo[156680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isaieocncqfxbxarxtnvdemhvmjsfetx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321811.3187597-1010-167753456680800/AnsiballZ_file.py
Nov 28 09:23:32 np0005538513.localdomain sudo[156680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:32 np0005538513.localdomain python3.9[156682]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:32 np0005538513.localdomain sudo[156680]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:33 np0005538513.localdomain sudo[156772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjdhbvuvhrpcolutnojfelwnakejghxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321812.986586-1034-109923243887108/AnsiballZ_stat.py
Nov 28 09:23:33 np0005538513.localdomain sudo[156772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19342 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1626653550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1511430000000001030307) 
Nov 28 09:23:33 np0005538513.localdomain python3.9[156774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:33 np0005538513.localdomain sudo[156772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:33 np0005538513.localdomain sudo[156847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myqxlblyhnnrqwmvaimtlxfnxwcjrrbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321812.986586-1034-109923243887108/AnsiballZ_copy.py
Nov 28 09:23:33 np0005538513.localdomain sudo[156847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:34 np0005538513.localdomain python3.9[156849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321812.986586-1034-109923243887108/.source.json _original_basename=.w7ks84j_ follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:34 np0005538513.localdomain sudo[156847]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:34 np0005538513.localdomain sudo[156939]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pskiqdpncnrydttzslylcdyvavxumody ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321814.4785156-1079-89958068441453/AnsiballZ_file.py
Nov 28 09:23:34 np0005538513.localdomain sudo[156939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:34 np0005538513.localdomain python3.9[156941]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:34 np0005538513.localdomain sudo[156939]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:35 np0005538513.localdomain sudo[157031]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrgdwduukhxktustzogebzfpquduwkga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321815.170423-1103-215998493953813/AnsiballZ_stat.py
Nov 28 09:23:35 np0005538513.localdomain sudo[157031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:35 np0005538513.localdomain sudo[157031]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:35 np0005538513.localdomain sudo[157104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdwvsudpudpzxnmwytfigvzmgwautzya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321815.170423-1103-215998493953813/AnsiballZ_copy.py
Nov 28 09:23:35 np0005538513.localdomain sudo[157104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:36 np0005538513.localdomain sudo[157104]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61330 DF PROTO=TCP SPT=47258 DPT=9100 SEQ=1542535815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB151CC20000000001030307) 
Nov 28 09:23:36 np0005538513.localdomain sudo[157196]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkxcawpfdymfktslgvnqevkuvcivahru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321816.4325914-1154-227783456515326/AnsiballZ_container_config_data.py
Nov 28 09:23:36 np0005538513.localdomain sudo[157196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:36 np0005538513.localdomain python3.9[157198]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 28 09:23:36 np0005538513.localdomain sudo[157196]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:37 np0005538513.localdomain sudo[157288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvkvcpmofuggpodgdmkjqhsmlznebfzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321817.2373378-1181-183904690124160/AnsiballZ_container_config_hash.py
Nov 28 09:23:37 np0005538513.localdomain sudo[157288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:37 np0005538513.localdomain python3.9[157290]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:23:37 np0005538513.localdomain sudo[157288]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:38 np0005538513.localdomain sudo[157380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xewrupjhdtihvkvtpdyjqahomaxuaxai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321818.1537101-1208-26848379034654/AnsiballZ_podman_container_info.py
Nov 28 09:23:38 np0005538513.localdomain sudo[157380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:38 np0005538513.localdomain python3.9[157382]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:23:39 np0005538513.localdomain sudo[157380]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19344 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1626653550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1529020000000001030307) 
Nov 28 09:23:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61332 DF PROTO=TCP SPT=47258 DPT=9100 SEQ=1542535815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1534820000000001030307) 
Nov 28 09:23:42 np0005538513.localdomain sudo[157499]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxohruwkwwszzzcgnwzpggjyqipmycnq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321822.0665336-1247-211526077947920/AnsiballZ_edpm_container_manage.py
Nov 28 09:23:42 np0005538513.localdomain sudo[157499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:42 np0005538513.localdomain python3[157501]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:23:43 np0005538513.localdomain python3[157501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071",
                                                                    "Digest": "sha256:2b8255d3a22035616e569dbe22862a2560e15cdaefedae0059a354d558788e1e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2b8255d3a22035616e569dbe22862a2560e15cdaefedae0059a354d558788e1e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:34:14.989876147Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 784145152,
                                                                    "VirtualSize": 784145152,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048/diff:/var/lib/containers/storage/overlay/47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:bc63f71478d9d90db803b468b28e5d9e0268adbace958b608ab10bd0819798bd",
                                                                              "sha256:3277562ff4450bdcd859dd0b0be874b10dd6f3502be711d42aab9ff44a85cf28",
                                                                              "sha256:982219792b3d83fa04ae12d0161dd3b982e7e3ed68293e6c876d50161b73746b"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:20:42.438406248Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:21:24.54454259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:45.350498288Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:45.889263301Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:46.291004499Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:25.184071037Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:32:10.991588202Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:33:28.900936438Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:33:33.145210084Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:33:40.18160951Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:14.986660399Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:14.986745051Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:18.63064752Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:23:43 np0005538513.localdomain podman[157550]: 2025-11-28 09:23:43.197051557 +0000 UTC m=+0.089198084 container remove 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team)
Nov 28 09:23:43 np0005538513.localdomain python3[157501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Nov 28 09:23:43 np0005538513.localdomain podman[157563]: 
Nov 28 09:23:43 np0005538513.localdomain podman[157563]: 2025-11-28 09:23:43.341364658 +0000 UTC m=+0.125884008 container create ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true)
Nov 28 09:23:43 np0005538513.localdomain podman[157563]: 2025-11-28 09:23:43.260918003 +0000 UTC m=+0.045437363 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:23:43 np0005538513.localdomain python3[157501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume 
/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:23:43 np0005538513.localdomain sudo[157499]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:43 np0005538513.localdomain sudo[157685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egeqtnlzibnuvimadknivhqppwouwtrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321823.7071917-1271-42149098704355/AnsiballZ_stat.py
Nov 28 09:23:43 np0005538513.localdomain sudo[157685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:44 np0005538513.localdomain python3.9[157687]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:23:44 np0005538513.localdomain sudo[157685]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:45 np0005538513.localdomain sudo[157779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igiigakgzzpkzoptsayuxckwxllkslbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321824.860242-1298-238987456607967/AnsiballZ_file.py
Nov 28 09:23:45 np0005538513.localdomain sudo[157779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:23:45 np0005538513.localdomain podman[157781]: 2025-11-28 09:23:45.237521862 +0000 UTC m=+0.089090440 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:23:45 np0005538513.localdomain podman[157781]: 2025-11-28 09:23:45.3154679 +0000 UTC m=+0.167036498 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 09:23:45 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:23:45 np0005538513.localdomain python3.9[157782]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:45 np0005538513.localdomain sudo[157779]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:45 np0005538513.localdomain sudo[157850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsleniwctyrofmeuonwqytozxfxvitue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321824.860242-1298-238987456607967/AnsiballZ_stat.py
Nov 28 09:23:45 np0005538513.localdomain sudo[157850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:45 np0005538513.localdomain python3.9[157852]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:23:45 np0005538513.localdomain sudo[157850]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:46 np0005538513.localdomain sudo[157941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybutbboiihyjhzhmpimmxcyfrsiehane ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321825.8203-1298-116854091113190/AnsiballZ_copy.py
Nov 28 09:23:46 np0005538513.localdomain sudo[157941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:47 np0005538513.localdomain python3.9[157943]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764321825.8203-1298-116854091113190/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:47 np0005538513.localdomain sudo[157941]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26702 DF PROTO=TCP SPT=51334 DPT=9882 SEQ=1327201159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1547030000000001030307) 
Nov 28 09:23:47 np0005538513.localdomain sudo[157987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyywntyfdazztoxzwhzqmdeyaynrbdld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321825.8203-1298-116854091113190/AnsiballZ_systemd.py
Nov 28 09:23:47 np0005538513.localdomain sudo[157987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:47 np0005538513.localdomain python3.9[157989]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:23:47 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:23:47 np0005538513.localdomain systemd-rc-local-generator[158012]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:47 np0005538513.localdomain systemd-sysv-generator[158019]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:47 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:47 np0005538513.localdomain sudo[157987]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:48 np0005538513.localdomain sudo[158069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydwsiowfrmyihtalxnihbcrayjqjkhjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321825.8203-1298-116854091113190/AnsiballZ_systemd.py
Nov 28 09:23:48 np0005538513.localdomain sudo[158069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:48 np0005538513.localdomain python3.9[158071]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:23:48 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:23:48 np0005538513.localdomain systemd-rc-local-generator[158098]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:48 np0005538513.localdomain systemd-sysv-generator[158104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:48 np0005538513.localdomain systemd[1]: Starting ovn_metadata_agent container...
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: tmp-crun.XnaYak.mount: Deactivated successfully.
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:23:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51878c7750469bb637905b80c196d06233ea74ae919e9279342c4fa33ab172a0/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:23:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51878c7750469bb637905b80c196d06233ea74ae919e9279342c4fa33ab172a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:23:49 np0005538513.localdomain podman[158113]: 2025-11-28 09:23:49.068002365 +0000 UTC m=+0.157847267 container init ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + sudo -E kolla_set_configs
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:23:49 np0005538513.localdomain podman[158113]: 2025-11-28 09:23:49.114144649 +0000 UTC m=+0.203989511 container start ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:23:49 np0005538513.localdomain edpm-start-podman-container[158113]: ovn_metadata_agent
Nov 28 09:23:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41542 DF PROTO=TCP SPT=47128 DPT=9105 SEQ=1499306541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB154EC20000000001030307) 
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Validating config file
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Copying service configuration files
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Writing out command to execute
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: ++ cat /run_command
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + CMD=neutron-ovn-metadata-agent
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + ARGS=
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + sudo kolla_copy_cacerts
Nov 28 09:23:49 np0005538513.localdomain podman[158133]: 2025-11-28 09:23:49.200518493 +0000 UTC m=+0.084914233 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:23:49 np0005538513.localdomain podman[158133]: 2025-11-28 09:23:49.205649563 +0000 UTC m=+0.090045233 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + [[ ! -n '' ]]
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + . kolla_extend_start
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: Running command: 'neutron-ovn-metadata-agent'
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + umask 0022
Nov 28 09:23:49 np0005538513.localdomain ovn_metadata_agent[158125]: + exec neutron-ovn-metadata-agent
Nov 28 09:23:49 np0005538513.localdomain edpm-start-podman-container[158112]: Creating additional drop-in dependency for "ovn_metadata_agent" (ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8)
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:23:49 np0005538513.localdomain systemd-sysv-generator[158204]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:49 np0005538513.localdomain systemd-rc-local-generator[158201]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: tmp-crun.FAfUYS.mount: Deactivated successfully.
Nov 28 09:23:49 np0005538513.localdomain systemd[1]: Started ovn_metadata_agent container.
Nov 28 09:23:49 np0005538513.localdomain sudo[158069]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:50 np0005538513.localdomain sshd[152727]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:23:50 np0005538513.localdomain systemd-logind[764]: Session 51 logged out. Waiting for processes to exit.
Nov 28 09:23:50 np0005538513.localdomain systemd[1]: session-51.scope: Deactivated successfully.
Nov 28 09:23:50 np0005538513.localdomain systemd[1]: session-51.scope: Consumed 31.577s CPU time.
Nov 28 09:23:50 np0005538513.localdomain systemd-logind[764]: Removed session 51.
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.744 158130 INFO neutron.common.config [-] Logging enabled!
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.744 158130 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.842 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name c85299c6-8e38-42c8-8509-2eaaf15c050c (UUID: c85299c6-8e38-42c8-8509-2eaaf15c050c) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.856 158130 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.857 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.857 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.857 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.859 158130 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.862 158130 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.874 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:fc:6c 192.168.0.142'], port_security=['fa:16:3e:f4:fc:6c 192.168.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.142/24', 'neutron:device_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005538513.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '7', 'neutron:security_group_ids': '6d2c5a31-c9e5-413a-bccf-f97c7687bd94 b3c60f08-3369-426b-b744-9cef04caaa7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3122580-f73f-40fa-a838-6bad2ff9da2f, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=09612b07-5142-4b0f-9dab-74bf4403f69f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.874 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'c85299c6-8e38-42c8-8509-2eaaf15c050c'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], external_ids={'neutron:ovn-metadata-id': '68f92086-2b44-5496-b923-e898b18e44d4', 'neutron:ovn-metadata-sb-cfg': '1'}, name=c85299c6-8e38-42c8-8509-2eaaf15c050c, nb_cfg_timestamp=1764321773767, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.875 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 09612b07-5142-4b0f-9dab-74bf4403f69f in datapath 40d5da59-6201-424a-8380-80ecc3d67c7e bound to our chassis on insert
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.875 158130 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fba977a0b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 INFO oslo_service.service [-] Starting 1 workers
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.879 158130 DEBUG oslo_service.service [-] Started child 158228 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.881 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 40d5da59-6201-424a-8380-80ecc3d67c7e
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.883 158130 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp5o39znh6/privsep.sock']
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.883 158228 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-938909'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.914 158228 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.914 158228 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.914 158228 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.917 158228 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.919 158228 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 09:23:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:50.934 158228 INFO eventlet.wsgi.server [-] (158228) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 28 09:23:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41543 DF PROTO=TCP SPT=47128 DPT=9105 SEQ=1499306541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1556C20000000001030307) 
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.522 158130 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.523 158130 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp5o39znh6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.420 158233 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.426 158233 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.430 158233 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.430 158233 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158233
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.527 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fc30ef09-60d6-4096-93ef-f4f2e7dcfede]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.939 158233 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.939 158233 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:23:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:51.939 158233 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.386 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5834f15f-82c4-4260-a200-4368ee9e96ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.388 158130 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpw1frvsfb/privsep.sock']
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.958 158130 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.959 158130 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpw1frvsfb/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.852 158244 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.859 158244 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.863 158244 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.863 158244 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158244
Nov 28 09:23:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:52.962 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[ab96dbf2-acfe-432c-abee-ed751524657a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.369 158244 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.369 158244 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.369 158244 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.824 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[4bca20db-92d9-4951-ac75-93c5bcdc5bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.827 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6d563e-0729-423e-8c23-5929ca2e50ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.847 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[d31dfd4b-d810-4361-880d-827cd6d6bab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.864 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d32f5d-eab5-4e5c-b4c5-1c99b295e0b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40d5da59-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:28:4d:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662048, 'reachable_time': 36277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 158254, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.881 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2c7a7f-a7b4-41b7-8415-5f2b42567c76]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap40d5da59-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662058, 'tstamp': 662058}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap40d5da59-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662060, 'tstamp': 662060}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662064, 'tstamp': 662064}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:4d05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662048, 'tstamp': 662048}], 
['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.940 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[73dbde93-01db-4b03-9d28-bbec95e16c00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.941 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40d5da59-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.947 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40d5da59-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.947 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.948 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap40d5da59-60, col_values=(('external_ids', {'iface-id': '3ff57c88-06c6-4894-984a-80ce116d1456'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.948 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:23:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:53.952 158130 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp42n1wdi2/privsep.sock']
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.534 158130 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.535 158130 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp42n1wdi2/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.442 158264 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.447 158264 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.450 158264 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.451 158264 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158264
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.537 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[76a75132-fcb3-41f8-b8fa-318b80958263]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.974 158264 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.974 158264 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:23:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:54.974 158264 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:23:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41544 DF PROTO=TCP SPT=47128 DPT=9105 SEQ=1499306541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1566830000000001030307) 
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.409 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed73b80-1d33-48af-bf4a-d01dc14a1d28]: (4, ['ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.412 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, column=external_ids, values=({'neutron:ovn-metadata-id': '68f92086-2b44-5496-b923-e898b18e44d4'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.413 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.414 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.431 158130 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.431 158130 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.431 158130 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.433 158130 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.433 158130 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.433 158130 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.459 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.459 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.459 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.477 158130 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.477 158130 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.477 158130 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.477 158130 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.490 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:23:55.490 158130 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:23:56 np0005538513.localdomain sshd[158270]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:23:56 np0005538513.localdomain sshd[158270]: Accepted publickey for zuul from 192.168.122.30 port 57640 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:23:56 np0005538513.localdomain systemd-logind[764]: New session 52 of user zuul.
Nov 28 09:23:57 np0005538513.localdomain systemd[1]: Started Session 52 of User zuul.
Nov 28 09:23:57 np0005538513.localdomain sshd[158270]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:23:57 np0005538513.localdomain python3.9[158363]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:23:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17377 DF PROTO=TCP SPT=34796 DPT=9101 SEQ=678617077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1571890000000001030307) 
Nov 28 09:23:59 np0005538513.localdomain sudo[158457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijimyevpvoqajdpwxqlyrltclfqszigl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321838.5894563-62-137413976252016/AnsiballZ_command.py
Nov 28 09:23:59 np0005538513.localdomain sudo[158457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:59 np0005538513.localdomain python3.9[158459]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:00 np0005538513.localdomain sudo[158457]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17379 DF PROTO=TCP SPT=34796 DPT=9101 SEQ=678617077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB157D830000000001030307) 
Nov 28 09:24:01 np0005538513.localdomain sudo[158562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckbdqsdvkvlovjvqghwzrlyedpmhzhmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321841.0541203-86-119836071272950/AnsiballZ_command.py
Nov 28 09:24:01 np0005538513.localdomain sudo[158562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:01 np0005538513.localdomain python3.9[158564]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:01 np0005538513.localdomain systemd[1]: libpod-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a.scope: Deactivated successfully.
Nov 28 09:24:01 np0005538513.localdomain podman[158565]: 2025-11-28 09:24:01.566576115 +0000 UTC m=+0.070989906 container died 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64)
Nov 28 09:24:01 np0005538513.localdomain podman[158565]: 2025-11-28 09:24:01.612101348 +0000 UTC m=+0.116515129 container cleanup 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:35:22Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044)
Nov 28 09:24:01 np0005538513.localdomain sudo[158562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:01 np0005538513.localdomain podman[158580]: 2025-11-28 09:24:01.65554264 +0000 UTC m=+0.083906459 container remove 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 28 09:24:01 np0005538513.localdomain systemd[1]: libpod-conmon-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a.scope: Deactivated successfully.
Nov 28 09:24:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f-merged.mount: Deactivated successfully.
Nov 28 09:24:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a-userdata-shm.mount: Deactivated successfully.
Nov 28 09:24:02 np0005538513.localdomain sudo[158685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwqpwavsehrzktiaiglrgbhaiqbffaiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321842.1124055-116-151245143566335/AnsiballZ_systemd_service.py
Nov 28 09:24:02 np0005538513.localdomain sudo[158685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:02 np0005538513.localdomain python3.9[158687]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:24:02 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:24:03 np0005538513.localdomain systemd-sysv-generator[158718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:24:03 np0005538513.localdomain systemd-rc-local-generator[158715]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:24:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:03 np0005538513.localdomain sudo[158685]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55235 DF PROTO=TCP SPT=43058 DPT=9102 SEQ=4203539387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1586830000000001030307) 
Nov 28 09:24:04 np0005538513.localdomain python3.9[158813]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:24:04 np0005538513.localdomain network[158830]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:24:04 np0005538513.localdomain network[158831]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:24:04 np0005538513.localdomain network[158832]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:24:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36086 DF PROTO=TCP SPT=50676 DPT=9102 SEQ=2753317606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1591830000000001030307) 
Nov 28 09:24:06 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:08 np0005538513.localdomain sudo[159031]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skytmoczeaawgiqlmjkylawsvaascvck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321847.9034128-173-139338781533091/AnsiballZ_systemd_service.py
Nov 28 09:24:08 np0005538513.localdomain sudo[159031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:08 np0005538513.localdomain python3.9[159033]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:08 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:24:08 np0005538513.localdomain systemd-sysv-generator[159067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:24:08 np0005538513.localdomain systemd-rc-local-generator[159064]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:24:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:08 np0005538513.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Nov 28 09:24:08 np0005538513.localdomain sudo[159031]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19400 DF PROTO=TCP SPT=57738 DPT=9100 SEQ=970466610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB159D820000000001030307) 
Nov 28 09:24:10 np0005538513.localdomain sudo[159162]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtkjqcmbchwlqdjayinhdzuefkbhltwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321848.9382193-173-45637567794276/AnsiballZ_systemd_service.py
Nov 28 09:24:10 np0005538513.localdomain sudo[159162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:10 np0005538513.localdomain python3.9[159164]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:10 np0005538513.localdomain sudo[159162]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:10 np0005538513.localdomain sudo[159255]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-silblfcvkvjcpunskwmxwqqrvwihuhmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321850.6596332-173-230663413060639/AnsiballZ_systemd_service.py
Nov 28 09:24:10 np0005538513.localdomain sudo[159255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:11 np0005538513.localdomain python3.9[159257]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:11 np0005538513.localdomain sudo[159255]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:11 np0005538513.localdomain sudo[159348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgbvvwhogkvzctgftndkqquyywjpcywb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321851.3866425-173-114754804803301/AnsiballZ_systemd_service.py
Nov 28 09:24:11 np0005538513.localdomain sudo[159348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:11 np0005538513.localdomain python3.9[159350]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:11 np0005538513.localdomain sudo[159348]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63320 DF PROTO=TCP SPT=46898 DPT=9100 SEQ=3905351516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15A9C20000000001030307) 
Nov 28 09:24:12 np0005538513.localdomain sudo[159441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htchbnvqyxpdwvzmmbvaximqztbetomv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321852.4100318-173-22265471661035/AnsiballZ_systemd_service.py
Nov 28 09:24:12 np0005538513.localdomain sudo[159441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:12 np0005538513.localdomain python3.9[159443]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:13 np0005538513.localdomain sudo[159441]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:13 np0005538513.localdomain sudo[159534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqeunolvlxnimgjnbtdkrlrmgwufrbem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321853.1488194-173-32730705469367/AnsiballZ_systemd_service.py
Nov 28 09:24:13 np0005538513.localdomain sudo[159534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:13 np0005538513.localdomain python3.9[159536]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:13 np0005538513.localdomain sudo[159534]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:14 np0005538513.localdomain sudo[159627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dscppuqutkbacbxgocvlwwnfuyvzvdbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321853.9758766-173-207100016617913/AnsiballZ_systemd_service.py
Nov 28 09:24:14 np0005538513.localdomain sudo[159627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:14 np0005538513.localdomain python3.9[159629]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:14 np0005538513.localdomain sudo[159627]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:24:15 np0005538513.localdomain podman[159645]: 2025-11-28 09:24:15.858635433 +0000 UTC m=+0.089528935 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:24:15 np0005538513.localdomain podman[159645]: 2025-11-28 09:24:15.943329408 +0000 UTC m=+0.174222880 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:24:15 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:24:16 np0005538513.localdomain sudo[159745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghnoiiuinfuukehfflyjsdfyhdtqzzov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321856.4770865-329-138758027554281/AnsiballZ_file.py
Nov 28 09:24:16 np0005538513.localdomain sudo[159745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:17 np0005538513.localdomain python3.9[159747]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:17 np0005538513.localdomain sudo[159745]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32600 DF PROTO=TCP SPT=56834 DPT=9882 SEQ=1286129200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15BC420000000001030307) 
Nov 28 09:24:17 np0005538513.localdomain sudo[159837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rahkgwsikqwgemackcmoedpmfcmjyfuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321857.2550936-329-71533839563497/AnsiballZ_file.py
Nov 28 09:24:17 np0005538513.localdomain sudo[159837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:17 np0005538513.localdomain python3.9[159839]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:17 np0005538513.localdomain sudo[159837]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:18 np0005538513.localdomain sudo[159929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbeczikqynwtslzgvvidipqijqskjqoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321857.8681614-329-25681642676640/AnsiballZ_file.py
Nov 28 09:24:18 np0005538513.localdomain sudo[159929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:18 np0005538513.localdomain python3.9[159931]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:18 np0005538513.localdomain sudo[159929]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:18 np0005538513.localdomain sudo[160021]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxzlgijbughrkjqbtykvuyahngjvosbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321858.4711242-329-85167043620523/AnsiballZ_file.py
Nov 28 09:24:18 np0005538513.localdomain sudo[160021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:18 np0005538513.localdomain python3.9[160023]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:18 np0005538513.localdomain sudo[160021]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64366 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15C3C30000000001030307) 
Nov 28 09:24:19 np0005538513.localdomain sudo[160113]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgjsihchocnxfihuuzbgicacrntmxiei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321859.1301658-329-176374648441014/AnsiballZ_file.py
Nov 28 09:24:19 np0005538513.localdomain sudo[160113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:24:19 np0005538513.localdomain systemd[1]: tmp-crun.C3no49.mount: Deactivated successfully.
Nov 28 09:24:19 np0005538513.localdomain podman[160116]: 2025-11-28 09:24:19.493044142 +0000 UTC m=+0.087090140 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 09:24:19 np0005538513.localdomain podman[160116]: 2025-11-28 09:24:19.501410585 +0000 UTC m=+0.095456583 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:24:19 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:24:19 np0005538513.localdomain python3.9[160115]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:19 np0005538513.localdomain sudo[160113]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:19 np0005538513.localdomain sudo[160222]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovjzebdbnigxqnuyrmpuctfhdumejtze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321859.737571-329-185580733674954/AnsiballZ_file.py
Nov 28 09:24:19 np0005538513.localdomain sudo[160222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:20 np0005538513.localdomain python3.9[160224]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:20 np0005538513.localdomain sudo[160222]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:20 np0005538513.localdomain sudo[160314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpyoizpkrifvbisbeukkraudcwsstpfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321860.3336546-329-188587957310063/AnsiballZ_file.py
Nov 28 09:24:20 np0005538513.localdomain sudo[160314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:20 np0005538513.localdomain python3.9[160316]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:20 np0005538513.localdomain sudo[160314]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64367 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15CBC20000000001030307) 
Nov 28 09:24:21 np0005538513.localdomain sudo[160406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnpskpbhoejaaeruuroskxghnailzckb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321861.0205388-479-222342836387962/AnsiballZ_file.py
Nov 28 09:24:21 np0005538513.localdomain sudo[160406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:21 np0005538513.localdomain python3.9[160408]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:21 np0005538513.localdomain sudo[160406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:21 np0005538513.localdomain sudo[160498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsjjpdcyujcpfhgleuoebfvpmqzrtoqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321861.6426818-479-197974688389631/AnsiballZ_file.py
Nov 28 09:24:21 np0005538513.localdomain sudo[160498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:22 np0005538513.localdomain python3.9[160500]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:22 np0005538513.localdomain sudo[160498]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:22 np0005538513.localdomain sudo[160590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzbuvdhlbtjbemyitgytokjduqkxyhhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321862.1782207-479-54390916392503/AnsiballZ_file.py
Nov 28 09:24:22 np0005538513.localdomain sudo[160590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:22 np0005538513.localdomain python3.9[160592]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:22 np0005538513.localdomain sudo[160590]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:23 np0005538513.localdomain sudo[160682]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrixdcxvzfxzjxybmhztoizkmbfmmvoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321862.7540557-479-102843915278545/AnsiballZ_file.py
Nov 28 09:24:23 np0005538513.localdomain sudo[160682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:23 np0005538513.localdomain python3.9[160684]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:23 np0005538513.localdomain sudo[160682]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:23 np0005538513.localdomain sudo[160774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuhgypkzeykujcbegriflkzqwmgbowec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321863.365255-479-278391544115805/AnsiballZ_file.py
Nov 28 09:24:23 np0005538513.localdomain sudo[160774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:23 np0005538513.localdomain python3.9[160776]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:23 np0005538513.localdomain sudo[160774]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:24 np0005538513.localdomain sudo[160866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybzykgsfyioiqnazzmkitukzlpfqscvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321863.9661827-479-193172111917263/AnsiballZ_file.py
Nov 28 09:24:24 np0005538513.localdomain sudo[160866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:24 np0005538513.localdomain python3.9[160868]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:24 np0005538513.localdomain sudo[160866]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:24 np0005538513.localdomain sudo[160958]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijpekrcjdzjpgcgjhekbxhvxacxconkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321864.5725355-479-165480271361971/AnsiballZ_file.py
Nov 28 09:24:24 np0005538513.localdomain sudo[160958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:25 np0005538513.localdomain python3.9[160960]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:25 np0005538513.localdomain sudo[160958]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64368 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15DB830000000001030307) 
Nov 28 09:24:25 np0005538513.localdomain sudo[161050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkhqanzhmggfwgytingzqzgebwtrkpyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321865.5858667-632-111377297124227/AnsiballZ_command.py
Nov 28 09:24:25 np0005538513.localdomain sudo[161050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:25 np0005538513.localdomain python3.9[161052]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:26 np0005538513.localdomain sudo[161050]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:26 np0005538513.localdomain python3.9[161144]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:24:27 np0005538513.localdomain sudo[161234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-henorxdomtarvpjmnkrsfmgbeyyvouey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321867.1808643-686-180893460157406/AnsiballZ_systemd_service.py
Nov 28 09:24:27 np0005538513.localdomain sudo[161234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:27 np0005538513.localdomain python3.9[161236]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:24:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:24:27 np0005538513.localdomain systemd-rc-local-generator[161263]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:24:27 np0005538513.localdomain systemd-sysv-generator[161266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:24:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58530 DF PROTO=TCP SPT=33760 DPT=9101 SEQ=3918545985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15E6B90000000001030307) 
Nov 28 09:24:28 np0005538513.localdomain sudo[161234]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:28 np0005538513.localdomain sudo[161362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfjunqtjariexhroyexgcytusysgiuxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321868.3121579-710-84573772588773/AnsiballZ_command.py
Nov 28 09:24:28 np0005538513.localdomain sudo[161362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:28 np0005538513.localdomain python3.9[161364]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:28 np0005538513.localdomain sudo[161362]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:29 np0005538513.localdomain sudo[161455]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cucwhfmadrkqzmeqihaajyflasgyvrxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321868.9415479-710-76419228357793/AnsiballZ_command.py
Nov 28 09:24:29 np0005538513.localdomain sudo[161455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:29 np0005538513.localdomain python3.9[161457]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:29 np0005538513.localdomain sudo[161455]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:29 np0005538513.localdomain sudo[161548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peqtllooqpmbtpioegzqaylcsrvghxrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321869.581744-710-253070774722371/AnsiballZ_command.py
Nov 28 09:24:29 np0005538513.localdomain sudo[161548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:30 np0005538513.localdomain python3.9[161550]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:30 np0005538513.localdomain sudo[161548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:30 np0005538513.localdomain sudo[161641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhljzebiterodpdwmzhpvrojkfbxkogq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321870.187423-710-27454047740580/AnsiballZ_command.py
Nov 28 09:24:30 np0005538513.localdomain sudo[161641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:30 np0005538513.localdomain python3.9[161643]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:30 np0005538513.localdomain sudo[161641]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:31 np0005538513.localdomain sudo[161734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anewamizfcuwowriimnyaiiobuxozpwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321870.8188875-710-146147944607226/AnsiballZ_command.py
Nov 28 09:24:31 np0005538513.localdomain sudo[161734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58532 DF PROTO=TCP SPT=33760 DPT=9101 SEQ=3918545985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15F2C20000000001030307) 
Nov 28 09:24:31 np0005538513.localdomain python3.9[161736]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:31 np0005538513.localdomain sudo[161734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:31 np0005538513.localdomain sudo[161827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiuvowpwwyuivvahqzzgkncdqabhdrbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321871.4390917-710-44410617943841/AnsiballZ_command.py
Nov 28 09:24:31 np0005538513.localdomain sudo[161827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:31 np0005538513.localdomain python3.9[161829]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:31 np0005538513.localdomain sudo[161827]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:32 np0005538513.localdomain sudo[161859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:24:32 np0005538513.localdomain sudo[161859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:24:32 np0005538513.localdomain sudo[161859]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:32 np0005538513.localdomain sudo[161893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:24:32 np0005538513.localdomain sudo[161893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:24:32 np0005538513.localdomain sudo[161951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtoqotzfhpwpnhwzhpgobuihwhjpthhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321871.9991667-710-126801084389674/AnsiballZ_command.py
Nov 28 09:24:32 np0005538513.localdomain sudo[161951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:32 np0005538513.localdomain python3.9[161953]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:32 np0005538513.localdomain sudo[161951]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:32 np0005538513.localdomain sudo[161893]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64369 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15FB820000000001030307) 
Nov 28 09:24:33 np0005538513.localdomain sudo[162000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:24:33 np0005538513.localdomain sudo[162000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:24:33 np0005538513.localdomain sudo[162000]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:35 np0005538513.localdomain sudo[162090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehjunmqoebvvggmooonwojrftpqkmtqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321875.5060327-872-216618070681347/AnsiballZ_getent.py
Nov 28 09:24:35 np0005538513.localdomain sudo[162090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:36 np0005538513.localdomain python3.9[162092]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 28 09:24:36 np0005538513.localdomain sudo[162090]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37519 DF PROTO=TCP SPT=59926 DPT=9100 SEQ=490293883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1607420000000001030307) 
Nov 28 09:24:36 np0005538513.localdomain sudo[162183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmttcmpssztsulrjwxnrdmlxiijdlylq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321876.3248508-896-254248944565899/AnsiballZ_group.py
Nov 28 09:24:36 np0005538513.localdomain sudo[162183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:36 np0005538513.localdomain python3.9[162185]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:24:36 np0005538513.localdomain groupadd[162186]: group added to /etc/group: name=libvirt, GID=42473
Nov 28 09:24:36 np0005538513.localdomain groupadd[162186]: group added to /etc/gshadow: name=libvirt
Nov 28 09:24:36 np0005538513.localdomain groupadd[162186]: new group: name=libvirt, GID=42473
Nov 28 09:24:37 np0005538513.localdomain sudo[162183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:38 np0005538513.localdomain sudo[162281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaomugjrdfnlbpjzmuvwdntbjjcaxsiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321877.674287-920-71722976781467/AnsiballZ_user.py
Nov 28 09:24:38 np0005538513.localdomain sudo[162281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:38 np0005538513.localdomain python3.9[162283]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 09:24:38 np0005538513.localdomain useradd[162285]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Nov 28 09:24:38 np0005538513.localdomain sudo[162281]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:39 np0005538513.localdomain sudo[162381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpomgnmdwrdndrqkzpqwpdrwsqikdwnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321878.971425-953-27083830633402/AnsiballZ_setup.py
Nov 28 09:24:39 np0005538513.localdomain sudo[162381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61335 DF PROTO=TCP SPT=47258 DPT=9100 SEQ=1542535815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1613820000000001030307) 
Nov 28 09:24:39 np0005538513.localdomain python3.9[162383]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:24:39 np0005538513.localdomain sudo[162381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:40 np0005538513.localdomain sudo[162435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yddzafhbtortxicgjzfgfzeduvfgmyxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321878.971425-953-27083830633402/AnsiballZ_dnf.py
Nov 28 09:24:40 np0005538513.localdomain sudo[162435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:40 np0005538513.localdomain python3.9[162437]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:24:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37521 DF PROTO=TCP SPT=59926 DPT=9100 SEQ=490293883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB161F020000000001030307) 
Nov 28 09:24:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:24:46 np0005538513.localdomain podman[162507]: 2025-11-28 09:24:46.859427347 +0000 UTC m=+0.092609495 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:24:46 np0005538513.localdomain podman[162507]: 2025-11-28 09:24:46.898105464 +0000 UTC m=+0.131287662 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 09:24:46 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:24:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36853 DF PROTO=TCP SPT=49260 DPT=9882 SEQ=3662475890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1631820000000001030307) 
Nov 28 09:24:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59358 DF PROTO=TCP SPT=54070 DPT=9105 SEQ=75823109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1639020000000001030307) 
Nov 28 09:24:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:24:49 np0005538513.localdomain systemd[1]: tmp-crun.OvGzIY.mount: Deactivated successfully.
Nov 28 09:24:49 np0005538513.localdomain podman[162537]: 2025-11-28 09:24:49.870511609 +0000 UTC m=+0.107849058 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:24:49 np0005538513.localdomain podman[162537]: 2025-11-28 09:24:49.880360124 +0000 UTC m=+0.117697603 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:24:49 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:24:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:24:50.776 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:24:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:24:50.777 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:24:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:24:50.778 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:24:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59359 DF PROTO=TCP SPT=54070 DPT=9105 SEQ=75823109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1641020000000001030307) 
Nov 28 09:24:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59360 DF PROTO=TCP SPT=54070 DPT=9105 SEQ=75823109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1650C20000000001030307) 
Nov 28 09:24:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28509 DF PROTO=TCP SPT=36002 DPT=9101 SEQ=938269270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB165BEA0000000001030307) 
Nov 28 09:25:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28511 DF PROTO=TCP SPT=36002 DPT=9101 SEQ=938269270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1668020000000001030307) 
Nov 28 09:25:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59478 DF PROTO=TCP SPT=37558 DPT=9102 SEQ=1286343932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1670C20000000001030307) 
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  Converting 2759 SID table entries...
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:05 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52815 DF PROTO=TCP SPT=37692 DPT=9100 SEQ=1592430692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB167C820000000001030307) 
Nov 28 09:25:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63323 DF PROTO=TCP SPT=46898 DPT=9100 SEQ=3905351516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1687830000000001030307) 
Nov 28 09:25:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52817 DF PROTO=TCP SPT=37692 DPT=9100 SEQ=1592430692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1694420000000001030307) 
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:16 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29416 DF PROTO=TCP SPT=43222 DPT=9882 SEQ=4111102986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16A6C30000000001030307) 
Nov 28 09:25:17 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Nov 28 09:25:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:25:17 np0005538513.localdomain podman[163610]: 2025-11-28 09:25:17.85744877 +0000 UTC m=+0.082188948 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 09:25:17 np0005538513.localdomain podman[163610]: 2025-11-28 09:25:17.941868112 +0000 UTC m=+0.166608310 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:25:17 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:25:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46223 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16AE420000000001030307) 
Nov 28 09:25:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:25:20 np0005538513.localdomain systemd[1]: tmp-crun.95slex.mount: Deactivated successfully.
Nov 28 09:25:20 np0005538513.localdomain podman[163635]: 2025-11-28 09:25:20.84336367 +0000 UTC m=+0.084760606 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:25:20 np0005538513.localdomain podman[163635]: 2025-11-28 09:25:20.879581963 +0000 UTC m=+0.120978929 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 09:25:20 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:25:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46224 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16B6430000000001030307) 
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:24 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46225 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16C6030000000001030307) 
Nov 28 09:25:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11502 DF PROTO=TCP SPT=50306 DPT=9101 SEQ=2286596531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16D1190000000001030307) 
Nov 28 09:25:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11504 DF PROTO=TCP SPT=50306 DPT=9101 SEQ=2286596531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16DD020000000001030307) 
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:32 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46226 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16E5830000000001030307) 
Nov 28 09:25:33 np0005538513.localdomain sudo[163671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:25:33 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=22 res=1
Nov 28 09:25:33 np0005538513.localdomain sudo[163671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:25:33 np0005538513.localdomain sudo[163671]: pam_unix(sudo:session): session closed for user root
Nov 28 09:25:33 np0005538513.localdomain sudo[163689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:25:33 np0005538513.localdomain sudo[163689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:25:34 np0005538513.localdomain sudo[163689]: pam_unix(sudo:session): session closed for user root
Nov 28 09:25:35 np0005538513.localdomain sudo[163740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:25:35 np0005538513.localdomain sudo[163740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:25:35 np0005538513.localdomain sudo[163740]: pam_unix(sudo:session): session closed for user root
Nov 28 09:25:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41817 DF PROTO=TCP SPT=42968 DPT=9100 SEQ=483642328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16F1820000000001030307) 
Nov 28 09:25:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37524 DF PROTO=TCP SPT=59926 DPT=9100 SEQ=490293883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16FD820000000001030307) 
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:41 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41819 DF PROTO=TCP SPT=42968 DPT=9100 SEQ=483642328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1709420000000001030307) 
Nov 28 09:25:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5423 DF PROTO=TCP SPT=37314 DPT=9882 SEQ=1941038903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB171BC20000000001030307) 
Nov 28 09:25:48 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Nov 28 09:25:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:25:48 np0005538513.localdomain systemd[1]: tmp-crun.bMFeha.mount: Deactivated successfully.
Nov 28 09:25:48 np0005538513.localdomain podman[163766]: 2025-11-28 09:25:48.853275703 +0000 UTC m=+0.089103333 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:25:48 np0005538513.localdomain podman[163766]: 2025-11-28 09:25:48.930430265 +0000 UTC m=+0.166257925 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 09:25:48 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:25:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18234 DF PROTO=TCP SPT=54654 DPT=9105 SEQ=947938433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1723820000000001030307) 
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:49 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:50 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:25:50 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=24 res=1
Nov 28 09:25:50 np0005538513.localdomain systemd-rc-local-generator[163819]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:25:50 np0005538513.localdomain systemd-sysv-generator[163823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:25:50 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:25:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:25:50.777 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:25:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:25:50.779 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:25:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:25:50.781 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:25:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:25:50 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:25:51 np0005538513.localdomain systemd-rc-local-generator[163875]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:25:51 np0005538513.localdomain systemd-sysv-generator[163878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:25:51 np0005538513.localdomain podman[163838]: 2025-11-28 09:25:51.035091492 +0000 UTC m=+0.094158080 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:25:51 np0005538513.localdomain podman[163838]: 2025-11-28 09:25:51.075406563 +0000 UTC m=+0.134473111 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:25:51 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:25:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18235 DF PROTO=TCP SPT=54654 DPT=9105 SEQ=947938433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB172B820000000001030307) 
Nov 28 09:25:51 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:25:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18236 DF PROTO=TCP SPT=54654 DPT=9105 SEQ=947938433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB173B420000000001030307) 
Nov 28 09:25:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2146 DF PROTO=TCP SPT=44362 DPT=9101 SEQ=87046848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17464A0000000001030307) 
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  Converting 2763 SID table entries...
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:59 np0005538513.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:26:00 np0005538513.localdomain groupadd[163908]: group added to /etc/group: name=clevis, GID=985
Nov 28 09:26:00 np0005538513.localdomain groupadd[163908]: group added to /etc/gshadow: name=clevis
Nov 28 09:26:00 np0005538513.localdomain groupadd[163908]: new group: name=clevis, GID=985
Nov 28 09:26:00 np0005538513.localdomain useradd[163915]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 28 09:26:00 np0005538513.localdomain usermod[163925]: add 'clevis' to group 'tss'
Nov 28 09:26:00 np0005538513.localdomain usermod[163925]: add 'clevis' to shadow group 'tss'
Nov 28 09:26:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2148 DF PROTO=TCP SPT=44362 DPT=9101 SEQ=87046848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1752420000000001030307) 
Nov 28 09:26:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58586 DF PROTO=TCP SPT=53808 DPT=9102 SEQ=2849910390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB175B420000000001030307) 
Nov 28 09:26:03 np0005538513.localdomain groupadd[163947]: group added to /etc/group: name=dnsmasq, GID=984
Nov 28 09:26:03 np0005538513.localdomain groupadd[163947]: group added to /etc/gshadow: name=dnsmasq
Nov 28 09:26:03 np0005538513.localdomain groupadd[163947]: new group: name=dnsmasq, GID=984
Nov 28 09:26:03 np0005538513.localdomain useradd[163954]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 28 09:26:03 np0005538513.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 09:26:03 np0005538513.localdomain dbus-broker-launch[755]: avc:  op=load_policy lsm=selinux seqno=25 res=1
Nov 28 09:26:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16205 DF PROTO=TCP SPT=59800 DPT=9100 SEQ=1526083829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1766C20000000001030307) 
Nov 28 09:26:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58588 DF PROTO=TCP SPT=53808 DPT=9102 SEQ=2849910390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1773020000000001030307) 
Nov 28 09:26:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16207 DF PROTO=TCP SPT=59800 DPT=9100 SEQ=1526083829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB177E830000000001030307) 
Nov 28 09:26:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57711 DF PROTO=TCP SPT=47390 DPT=9882 SEQ=4214922872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1791020000000001030307) 
Nov 28 09:26:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6058 DF PROTO=TCP SPT=43982 DPT=9105 SEQ=1558679513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1798820000000001030307) 
Nov 28 09:26:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:26:19 np0005538513.localdomain systemd[1]: tmp-crun.FO9pth.mount: Deactivated successfully.
Nov 28 09:26:19 np0005538513.localdomain podman[166410]: 2025-11-28 09:26:19.866513566 +0000 UTC m=+0.091962013 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:26:19 np0005538513.localdomain podman[166410]: 2025-11-28 09:26:19.932707538 +0000 UTC m=+0.158155975 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:26:19 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:26:21 np0005538513.localdomain sshd[167326]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:26:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6059 DF PROTO=TCP SPT=43982 DPT=9105 SEQ=1558679513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17A0830000000001030307) 
Nov 28 09:26:21 np0005538513.localdomain sshd[167326]: Invalid user sol from 80.94.92.182 port 51076
Nov 28 09:26:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:26:21 np0005538513.localdomain podman[167796]: 2025-11-28 09:26:21.625074852 +0000 UTC m=+0.081332733 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 09:26:21 np0005538513.localdomain podman[167796]: 2025-11-28 09:26:21.660352255 +0000 UTC m=+0.116610126 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:26:21 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:26:23 np0005538513.localdomain sshd[167326]: Connection closed by invalid user sol 80.94.92.182 port 51076 [preauth]
Nov 28 09:26:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6060 DF PROTO=TCP SPT=43982 DPT=9105 SEQ=1558679513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17B0420000000001030307) 
Nov 28 09:26:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3556 DF PROTO=TCP SPT=35458 DPT=9101 SEQ=665330896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17BB7A0000000001030307) 
Nov 28 09:26:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3558 DF PROTO=TCP SPT=35458 DPT=9101 SEQ=665330896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17C7830000000001030307) 
Nov 28 09:26:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27991 DF PROTO=TCP SPT=35778 DPT=9102 SEQ=1355455547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17D0820000000001030307) 
Nov 28 09:26:35 np0005538513.localdomain sudo[178354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:26:35 np0005538513.localdomain sudo[178354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:26:35 np0005538513.localdomain sudo[178354]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:35 np0005538513.localdomain sudo[178429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:26:35 np0005538513.localdomain sudo[178429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:26:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36223 DF PROTO=TCP SPT=33558 DPT=9102 SEQ=1782435790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17DB830000000001030307) 
Nov 28 09:26:36 np0005538513.localdomain sudo[178429]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:37 np0005538513.localdomain sudo[179496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:26:37 np0005538513.localdomain sudo[179496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:26:37 np0005538513.localdomain sudo[179496]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41822 DF PROTO=TCP SPT=42968 DPT=9100 SEQ=483642328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17E7830000000001030307) 
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Reloading rules
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Collecting garbage unconditionally...
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Finished loading, compiling and executing 5 rules
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Reloading rules
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Collecting garbage unconditionally...
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 09:26:40 np0005538513.localdomain polkitd[1036]: Finished loading, compiling and executing 5 rules
Nov 28 09:26:42 np0005538513.localdomain groupadd[181055]: group added to /etc/group: name=ceph, GID=167
Nov 28 09:26:42 np0005538513.localdomain groupadd[181055]: group added to /etc/gshadow: name=ceph
Nov 28 09:26:42 np0005538513.localdomain groupadd[181055]: new group: name=ceph, GID=167
Nov 28 09:26:42 np0005538513.localdomain useradd[181061]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 28 09:26:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48818 DF PROTO=TCP SPT=47006 DPT=9100 SEQ=2140497574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17F3C20000000001030307) 
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 09:26:45 np0005538513.localdomain sshd[117359]: Received signal 15; terminating.
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: sshd.service: Consumed 1.138s CPU time, read 32.0K from disk, written 0B to disk.
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 09:26:45 np0005538513.localdomain sshd[181730]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:26:45 np0005538513.localdomain sshd[181730]: Server listening on 0.0.0.0 port 22.
Nov 28 09:26:45 np0005538513.localdomain sshd[181730]: Server listening on :: port 22.
Nov 28 09:26:45 np0005538513.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:46 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:47 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:47 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50588 DF PROTO=TCP SPT=50622 DPT=9882 SEQ=2933895119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1806420000000001030307) 
Nov 28 09:26:47 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:47 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:47 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:26:47 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 09:26:47 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:26:48 np0005538513.localdomain systemd-rc-local-generator[181960]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:48 np0005538513.localdomain systemd-sysv-generator[181963]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 09:26:48 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:26:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25237 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB180DC20000000001030307) 
Nov 28 09:26:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:26:50 np0005538513.localdomain podman[185156]: 2025-11-28 09:26:50.702326261 +0000 UTC m=+0.176167188 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 28 09:26:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:26:50.778 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:26:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:26:50.778 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:26:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:26:50.779 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:26:50 np0005538513.localdomain podman[185156]: 2025-11-28 09:26:50.790451621 +0000 UTC m=+0.264292548 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 09:26:50 np0005538513.localdomain sudo[162435]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:50 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:26:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25238 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1815C20000000001030307) 
Nov 28 09:26:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:26:51 np0005538513.localdomain podman[186458]: 2025-11-28 09:26:51.843012487 +0000 UTC m=+0.080521781 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:26:51 np0005538513.localdomain podman[186458]: 2025-11-28 09:26:51.879417995 +0000 UTC m=+0.116927339 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:26:51 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:26:53 np0005538513.localdomain sudo[188147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uljucuqmultxfraolwasyazqotdayluf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322013.3880146-989-73565679411943/AnsiballZ_systemd.py
Nov 28 09:26:53 np0005538513.localdomain sudo[188147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:54 np0005538513.localdomain python3.9[188170]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:26:54 np0005538513.localdomain systemd-sysv-generator[188427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:54 np0005538513.localdomain systemd-rc-local-generator[188420]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:54 np0005538513.localdomain sudo[188147]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:55 np0005538513.localdomain sudo[188873]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxxonzbdqrvdkefgnjmrsnagbtjtyoga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322014.8805306-989-209226992382346/AnsiballZ_systemd.py
Nov 28 09:26:55 np0005538513.localdomain sudo[188873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25239 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1825820000000001030307) 
Nov 28 09:26:55 np0005538513.localdomain python3.9[188895]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:26:55 np0005538513.localdomain systemd-sysv-generator[189125]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:55 np0005538513.localdomain systemd-rc-local-generator[189122]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538513.localdomain sudo[188873]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:56 np0005538513.localdomain sudo[189566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgfwnotqjwdwsucnmkkhtsvjrqqrhaty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322016.028683-989-201184387717272/AnsiballZ_systemd.py
Nov 28 09:26:56 np0005538513.localdomain sudo[189566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:56 np0005538513.localdomain python3.9[189584]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:26:56 np0005538513.localdomain systemd-rc-local-generator[189828]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:56 np0005538513.localdomain systemd-sysv-generator[189832]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain sudo[189566]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:57 np0005538513.localdomain sudo[190243]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kllyywqdodtkuauvljmosefjiakuzjsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322017.2003157-989-33289739504814/AnsiballZ_systemd.py
Nov 28 09:26:57 np0005538513.localdomain sudo[190243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:57 np0005538513.localdomain python3.9[190256]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:26:57 np0005538513.localdomain systemd-sysv-generator[190487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:57 np0005538513.localdomain systemd-rc-local-generator[190482]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4815 DF PROTO=TCP SPT=38690 DPT=9101 SEQ=3405516526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1830A90000000001030307) 
Nov 28 09:26:58 np0005538513.localdomain sudo[190243]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:59 np0005538513.localdomain sudo[191400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anihbsegktzggzpkuxsoketdvbcxczzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322019.410325-1076-169013222597487/AnsiballZ_systemd.py
Nov 28 09:26:59 np0005538513.localdomain sudo[191400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:59 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 09:26:59 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 09:26:59 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Consumed 14.292s CPU time.
Nov 28 09:26:59 np0005538513.localdomain systemd[1]: run-r13ed906980034e2db02f77ee5be26132.service: Deactivated successfully.
Nov 28 09:26:59 np0005538513.localdomain systemd[1]: run-r123d8513e85e4ee8a996436ef10f2026.service: Deactivated successfully.
Nov 28 09:27:00 np0005538513.localdomain python3.9[191418]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:27:00 np0005538513.localdomain systemd-sysv-generator[191460]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:00 np0005538513.localdomain systemd-rc-local-generator[191456]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:00 np0005538513.localdomain sudo[191400]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4817 DF PROTO=TCP SPT=38690 DPT=9101 SEQ=3405516526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB183CC20000000001030307) 
Nov 28 09:27:01 np0005538513.localdomain sudo[191577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzciwwgndntgpufvbjxbivqkfffuajrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322021.3601053-1076-69862178173800/AnsiballZ_systemd.py
Nov 28 09:27:01 np0005538513.localdomain sudo[191577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:01 np0005538513.localdomain python3.9[191579]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:01 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:27:02 np0005538513.localdomain systemd-sysv-generator[191613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:02 np0005538513.localdomain systemd-rc-local-generator[191606]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538513.localdomain sudo[191577]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:02 np0005538513.localdomain sudo[191726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwwpbvbapnyosexbztzfbooipkssftih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322022.4461455-1076-82062188876014/AnsiballZ_systemd.py
Nov 28 09:27:02 np0005538513.localdomain sudo[191726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:03 np0005538513.localdomain python3.9[191728]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:27:03 np0005538513.localdomain systemd-rc-local-generator[191757]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:03 np0005538513.localdomain systemd-sysv-generator[191762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538513.localdomain sudo[191726]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25240 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1845820000000001030307) 
Nov 28 09:27:04 np0005538513.localdomain sudo[191875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvzaigsnafzqnmicwgknvbguqdvzyaav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322023.7473466-1076-59540636730071/AnsiballZ_systemd.py
Nov 28 09:27:04 np0005538513.localdomain sudo[191875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:04 np0005538513.localdomain python3.9[191877]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:04 np0005538513.localdomain sudo[191875]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:04 np0005538513.localdomain sudo[191988]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erwhcuaxdjvkafuclinrojgvnsncqrfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322024.5956397-1076-200922520400079/AnsiballZ_systemd.py
Nov 28 09:27:04 np0005538513.localdomain sudo[191988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:05 np0005538513.localdomain python3.9[191990]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:27:05 np0005538513.localdomain systemd-sysv-generator[192020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:05 np0005538513.localdomain systemd-rc-local-generator[192017]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538513.localdomain sudo[191988]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33036 DF PROTO=TCP SPT=37890 DPT=9100 SEQ=2342632123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1851430000000001030307) 
Nov 28 09:27:09 np0005538513.localdomain sudo[192137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbqjxjxiijwaiwpgxztpyqibpejvyrbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322028.8711998-1184-211085852284476/AnsiballZ_systemd.py
Nov 28 09:27:09 np0005538513.localdomain sudo[192137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:09 np0005538513.localdomain python3.9[192139]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:27:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13890 DF PROTO=TCP SPT=41638 DPT=9102 SEQ=1384361588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB185D420000000001030307) 
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:27:09 np0005538513.localdomain systemd-sysv-generator[192167]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:09 np0005538513.localdomain systemd-rc-local-generator[192163]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538513.localdomain sudo[192137]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:10 np0005538513.localdomain sudo[192285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjixmdvsxkokzijcbbvmmpxzlxewzmnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322030.1124809-1208-1035563633729/AnsiballZ_systemd.py
Nov 28 09:27:10 np0005538513.localdomain sudo[192285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:10 np0005538513.localdomain python3.9[192287]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:10 np0005538513.localdomain sudo[192285]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:11 np0005538513.localdomain sudo[192398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrgzilizujeamnmauystkjqowevqozgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322030.92302-1208-58773297888366/AnsiballZ_systemd.py
Nov 28 09:27:11 np0005538513.localdomain sudo[192398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:11 np0005538513.localdomain python3.9[192400]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:11 np0005538513.localdomain sudo[192398]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:11 np0005538513.localdomain sudo[192511]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fljxhwctloyxzvieopgbfkhhotbefcas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322031.7086232-1208-120186399491184/AnsiballZ_systemd.py
Nov 28 09:27:11 np0005538513.localdomain sudo[192511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:12 np0005538513.localdomain python3.9[192513]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33038 DF PROTO=TCP SPT=37890 DPT=9100 SEQ=2342632123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1869030000000001030307) 
Nov 28 09:27:13 np0005538513.localdomain sudo[192511]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:14 np0005538513.localdomain sudo[192624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kknamgduexvjjmyhlijwdqmqelizbhfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322034.1234908-1208-24651226276188/AnsiballZ_systemd.py
Nov 28 09:27:14 np0005538513.localdomain sudo[192624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:14 np0005538513.localdomain python3.9[192626]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:14 np0005538513.localdomain sudo[192624]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:15 np0005538513.localdomain sudo[192737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htxmqgygoitboafphbkybsvbgobatlhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322034.9036047-1208-160250954759420/AnsiballZ_systemd.py
Nov 28 09:27:15 np0005538513.localdomain sudo[192737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:15 np0005538513.localdomain python3.9[192739]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:15 np0005538513.localdomain sudo[192737]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:16 np0005538513.localdomain sudo[192850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svjuizsurdhgmbjyznqsymgsutgwqfqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322036.3369434-1208-141565259648482/AnsiballZ_systemd.py
Nov 28 09:27:16 np0005538513.localdomain sudo[192850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:16 np0005538513.localdomain python3.9[192852]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:16 np0005538513.localdomain sudo[192850]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43654 DF PROTO=TCP SPT=33186 DPT=9882 SEQ=1702076260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB187B830000000001030307) 
Nov 28 09:27:17 np0005538513.localdomain sudo[192963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpxdfleqzqpkoickflbeylruwxiiggkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322037.1021445-1208-179183940587983/AnsiballZ_systemd.py
Nov 28 09:27:17 np0005538513.localdomain sudo[192963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:17 np0005538513.localdomain python3.9[192965]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:18 np0005538513.localdomain sudo[192963]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31733 DF PROTO=TCP SPT=60190 DPT=9105 SEQ=955319082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1883020000000001030307) 
Nov 28 09:27:19 np0005538513.localdomain sudo[193076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzffmgpnuyicjzfjevatdmpmsmnhhibr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322038.9640758-1208-182299155142881/AnsiballZ_systemd.py
Nov 28 09:27:19 np0005538513.localdomain sudo[193076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:19 np0005538513.localdomain python3.9[193078]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:19 np0005538513.localdomain sudo[193076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:20 np0005538513.localdomain sudo[193189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klpudjfqrbnxhgmwzveppbvdymuufead ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322039.8452072-1208-209934587157347/AnsiballZ_systemd.py
Nov 28 09:27:20 np0005538513.localdomain sudo[193189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:20 np0005538513.localdomain python3.9[193191]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31734 DF PROTO=TCP SPT=60190 DPT=9105 SEQ=955319082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB188B030000000001030307) 
Nov 28 09:27:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:27:21 np0005538513.localdomain sudo[193189]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:21 np0005538513.localdomain podman[193195]: 2025-11-28 09:27:21.585179181 +0000 UTC m=+0.093791437 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:27:21 np0005538513.localdomain podman[193195]: 2025-11-28 09:27:21.650516962 +0000 UTC m=+0.159129208 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 09:27:21 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:27:21 np0005538513.localdomain sudo[193326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inwzomuvuadrukwetrbxcqqmbitnvhpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322041.6693687-1208-186920672943114/AnsiballZ_systemd.py
Nov 28 09:27:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:27:21 np0005538513.localdomain sudo[193326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:22 np0005538513.localdomain podman[193328]: 2025-11-28 09:27:22.047486134 +0000 UTC m=+0.078690121 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 09:27:22 np0005538513.localdomain podman[193328]: 2025-11-28 09:27:22.08747199 +0000 UTC m=+0.118675927 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:27:22 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:27:22 np0005538513.localdomain python3.9[193329]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:22 np0005538513.localdomain sudo[193326]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:22 np0005538513.localdomain sudo[193456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llhviklloqcnwtcnfzpdjqduxxjwwwec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322042.4904184-1208-9561209924700/AnsiballZ_systemd.py
Nov 28 09:27:22 np0005538513.localdomain sudo[193456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:23 np0005538513.localdomain python3.9[193458]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:24 np0005538513.localdomain sudo[193456]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:24 np0005538513.localdomain sudo[193569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcgxykrtalwgqfhryboluvmtopyxlkml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322044.2957082-1208-186724208408590/AnsiballZ_systemd.py
Nov 28 09:27:24 np0005538513.localdomain sudo[193569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:24 np0005538513.localdomain python3.9[193571]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:24 np0005538513.localdomain sudo[193569]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31735 DF PROTO=TCP SPT=60190 DPT=9105 SEQ=955319082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB189AC20000000001030307) 
Nov 28 09:27:25 np0005538513.localdomain sudo[193682]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bshveytlrsgsxvycsfvwjthkbxeyliht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322045.0663407-1208-72588328955083/AnsiballZ_systemd.py
Nov 28 09:27:25 np0005538513.localdomain sudo[193682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:25 np0005538513.localdomain python3.9[193684]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:25 np0005538513.localdomain sudo[193682]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:26 np0005538513.localdomain sudo[193795]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjzrrwyerfwjlumxcxadhbjjaqsaqqxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322045.846782-1208-215167979867720/AnsiballZ_systemd.py
Nov 28 09:27:26 np0005538513.localdomain sudo[193795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:26 np0005538513.localdomain python3.9[193797]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:26 np0005538513.localdomain sudo[193795]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:27 np0005538513.localdomain sudo[193908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxtrhktuzhvidifpknpibziqqeuibmqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322046.9686756-1514-243971618024710/AnsiballZ_file.py
Nov 28 09:27:27 np0005538513.localdomain sudo[193908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:27 np0005538513.localdomain python3.9[193910]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:27 np0005538513.localdomain sudo[193908]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9310 DF PROTO=TCP SPT=39628 DPT=9101 SEQ=1021539114 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18A5DA0000000001030307) 
Nov 28 09:27:28 np0005538513.localdomain sudo[194018]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcrqlhueyzvvkmijzqtwhwnglsqxmpaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322047.6223252-1514-40489295387084/AnsiballZ_file.py
Nov 28 09:27:28 np0005538513.localdomain sudo[194018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:28 np0005538513.localdomain python3.9[194020]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:28 np0005538513.localdomain sudo[194018]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:29 np0005538513.localdomain sudo[194128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyxaudmcuobynypidzknhoeitazraliw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322048.8213437-1514-120259508740851/AnsiballZ_file.py
Nov 28 09:27:29 np0005538513.localdomain sudo[194128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:29 np0005538513.localdomain python3.9[194130]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:29 np0005538513.localdomain sudo[194128]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:29 np0005538513.localdomain sudo[194238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajqxcqzbcfaqupdhjnynzwnujzbggnhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322049.431965-1514-234198895637062/AnsiballZ_file.py
Nov 28 09:27:29 np0005538513.localdomain sudo[194238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:29 np0005538513.localdomain python3.9[194240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:29 np0005538513.localdomain sudo[194238]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:31 np0005538513.localdomain sudo[194348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwofxqgvyhlartskincfusafbjvdjqjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322050.8404813-1514-93307492124653/AnsiballZ_file.py
Nov 28 09:27:31 np0005538513.localdomain sudo[194348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9312 DF PROTO=TCP SPT=39628 DPT=9101 SEQ=1021539114 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18B2030000000001030307) 
Nov 28 09:27:31 np0005538513.localdomain python3.9[194350]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:31 np0005538513.localdomain sudo[194348]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:31 np0005538513.localdomain sudo[194458]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdxeqbhtdupwtwsnfohcomrhdjudefnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322051.476938-1514-28782593580362/AnsiballZ_file.py
Nov 28 09:27:31 np0005538513.localdomain sudo[194458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:31 np0005538513.localdomain python3.9[194460]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:31 np0005538513.localdomain sudo[194458]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:32 np0005538513.localdomain sudo[194568]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrwzubzaswngkmwexjdbfqsfhfdoghha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322052.1793644-1643-204975312148841/AnsiballZ_stat.py
Nov 28 09:27:32 np0005538513.localdomain sudo[194568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:32 np0005538513.localdomain python3.9[194570]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:32 np0005538513.localdomain sudo[194568]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:33 np0005538513.localdomain sudo[194658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elhaizozvwqzxyebpjbnmrxwgzjjzkkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322052.1793644-1643-204975312148841/AnsiballZ_copy.py
Nov 28 09:27:33 np0005538513.localdomain sudo[194658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61812 DF PROTO=TCP SPT=44796 DPT=9102 SEQ=3347864527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18BAC30000000001030307) 
Nov 28 09:27:33 np0005538513.localdomain python3.9[194660]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322052.1793644-1643-204975312148841/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:33 np0005538513.localdomain sudo[194658]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:33 np0005538513.localdomain sudo[194768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryydprpjzhnonucoyhlldpgdprfimrxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322053.6997561-1643-215479033003466/AnsiballZ_stat.py
Nov 28 09:27:33 np0005538513.localdomain sudo[194768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:34 np0005538513.localdomain python3.9[194770]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:34 np0005538513.localdomain sudo[194768]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:34 np0005538513.localdomain sudo[194858]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euustsybstmluqvawktsrjmdronvmllp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322053.6997561-1643-215479033003466/AnsiballZ_copy.py
Nov 28 09:27:34 np0005538513.localdomain sudo[194858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:34 np0005538513.localdomain python3.9[194860]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322053.6997561-1643-215479033003466/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:35 np0005538513.localdomain sudo[194858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:35 np0005538513.localdomain sudo[194968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fngxiifuppooceznvoyurwvjyljkzmjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322055.1242456-1643-37319029699801/AnsiballZ_stat.py
Nov 28 09:27:35 np0005538513.localdomain sudo[194968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:35 np0005538513.localdomain python3.9[194970]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:35 np0005538513.localdomain sudo[194968]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:35 np0005538513.localdomain sudo[195058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amyzlfbikokwnncdcyramasmatpevxsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322055.1242456-1643-37319029699801/AnsiballZ_copy.py
Nov 28 09:27:35 np0005538513.localdomain sudo[195058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:36 np0005538513.localdomain python3.9[195060]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322055.1242456-1643-37319029699801/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:36 np0005538513.localdomain sudo[195058]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1155 DF PROTO=TCP SPT=37064 DPT=9100 SEQ=1146749385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18C6420000000001030307) 
Nov 28 09:27:36 np0005538513.localdomain sudo[195168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rogyxacwdfetfxhenukvrfhoofoailml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322056.2293422-1643-148685320685147/AnsiballZ_stat.py
Nov 28 09:27:36 np0005538513.localdomain sudo[195168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:36 np0005538513.localdomain python3.9[195170]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:36 np0005538513.localdomain sudo[195168]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:37 np0005538513.localdomain sudo[195258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huskyhwsbbgnuioalpjlfazqqlyveanb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322056.2293422-1643-148685320685147/AnsiballZ_copy.py
Nov 28 09:27:37 np0005538513.localdomain sudo[195258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:37 np0005538513.localdomain python3.9[195260]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322056.2293422-1643-148685320685147/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:37 np0005538513.localdomain sudo[195258]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:37 np0005538513.localdomain sudo[195278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:27:37 np0005538513.localdomain sudo[195278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:27:37 np0005538513.localdomain sudo[195278]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:37 np0005538513.localdomain sudo[195323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:27:37 np0005538513.localdomain sudo[195323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:27:37 np0005538513.localdomain sudo[195404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltxpgilobffdgxigmfajucmdijuluhny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322057.3880415-1643-30885881471837/AnsiballZ_stat.py
Nov 28 09:27:37 np0005538513.localdomain sudo[195404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:37 np0005538513.localdomain python3.9[195406]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:37 np0005538513.localdomain sudo[195404]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:38 np0005538513.localdomain sudo[195323]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:38 np0005538513.localdomain sudo[195526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkgixdkgijfklwrdoggzuzlljikkzhsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322057.3880415-1643-30885881471837/AnsiballZ_copy.py
Nov 28 09:27:38 np0005538513.localdomain sudo[195526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:38 np0005538513.localdomain python3.9[195528]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322057.3880415-1643-30885881471837/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:38 np0005538513.localdomain sudo[195526]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:38 np0005538513.localdomain sudo[195600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:27:38 np0005538513.localdomain sudo[195600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:27:38 np0005538513.localdomain sudo[195600]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:38 np0005538513.localdomain sudo[195654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydwbckwrxckjfwbajqgddqvvikbbnclo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322058.5308483-1643-40897026161398/AnsiballZ_stat.py
Nov 28 09:27:38 np0005538513.localdomain sudo[195654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:39 np0005538513.localdomain python3.9[195656]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:39 np0005538513.localdomain sudo[195654]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48821 DF PROTO=TCP SPT=47006 DPT=9100 SEQ=2140497574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18D1820000000001030307) 
Nov 28 09:27:39 np0005538513.localdomain sudo[195744]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grcbqvpcwzrvglbhbrsevehnjmdnrdep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322058.5308483-1643-40897026161398/AnsiballZ_copy.py
Nov 28 09:27:39 np0005538513.localdomain sudo[195744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:39 np0005538513.localdomain python3.9[195746]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322058.5308483-1643-40897026161398/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:39 np0005538513.localdomain sudo[195744]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:40 np0005538513.localdomain sudo[195854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaidxztcjffdqsvbnaryxgyjxlbefnyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322059.7896645-1643-25617959246943/AnsiballZ_stat.py
Nov 28 09:27:40 np0005538513.localdomain sudo[195854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:40 np0005538513.localdomain python3.9[195856]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:40 np0005538513.localdomain sudo[195854]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:41 np0005538513.localdomain sudo[195942]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvwxbydwsjjebddyoppqclqshcwazmly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322059.7896645-1643-25617959246943/AnsiballZ_copy.py
Nov 28 09:27:41 np0005538513.localdomain sudo[195942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:41 np0005538513.localdomain python3.9[195944]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322059.7896645-1643-25617959246943/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:41 np0005538513.localdomain sudo[195942]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:42 np0005538513.localdomain sudo[196052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igatvxhufstwistlwawajnjxwnbmloao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322061.8411763-1643-275230523206812/AnsiballZ_stat.py
Nov 28 09:27:42 np0005538513.localdomain sudo[196052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:42 np0005538513.localdomain python3.9[196054]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:42 np0005538513.localdomain sudo[196052]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1157 DF PROTO=TCP SPT=37064 DPT=9100 SEQ=1146749385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18DE020000000001030307) 
Nov 28 09:27:42 np0005538513.localdomain sudo[196142]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydzgswjncoervqvzhjdpfuemprayjshg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322061.8411763-1643-275230523206812/AnsiballZ_copy.py
Nov 28 09:27:42 np0005538513.localdomain sudo[196142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:42 np0005538513.localdomain python3.9[196144]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322061.8411763-1643-275230523206812/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:42 np0005538513.localdomain sudo[196142]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:43 np0005538513.localdomain sudo[196252]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqhsjoileykfoulnejpucigmoolugtww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322063.3775995-1985-130750488699787/AnsiballZ_file.py
Nov 28 09:27:43 np0005538513.localdomain sudo[196252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:43 np0005538513.localdomain python3.9[196254]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:43 np0005538513.localdomain sudo[196252]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:44 np0005538513.localdomain sudo[196362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcsrwnzxhslfgmlpqjapmffccodgmbco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322064.138897-2009-162666246634343/AnsiballZ_file.py
Nov 28 09:27:44 np0005538513.localdomain sudo[196362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:44 np0005538513.localdomain python3.9[196364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:44 np0005538513.localdomain sudo[196362]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:45 np0005538513.localdomain sudo[196472]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecfkxsixpgznqqgjdpbyukwkbmokyaap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322064.7609224-2009-264122262815860/AnsiballZ_file.py
Nov 28 09:27:45 np0005538513.localdomain sudo[196472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:45 np0005538513.localdomain python3.9[196474]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:45 np0005538513.localdomain sudo[196472]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:45 np0005538513.localdomain sudo[196582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybgucmpqxutsqxwshhbcaetyyufubhyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322065.4231648-2009-244414911256121/AnsiballZ_file.py
Nov 28 09:27:45 np0005538513.localdomain sudo[196582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:45 np0005538513.localdomain python3.9[196584]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:45 np0005538513.localdomain sudo[196582]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:46 np0005538513.localdomain sudo[196692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apiqtgstmmzialxiregtvactdfmcodfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322066.010744-2009-53351671594728/AnsiballZ_file.py
Nov 28 09:27:46 np0005538513.localdomain sudo[196692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:46 np0005538513.localdomain python3.9[196694]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:46 np0005538513.localdomain sudo[196692]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:46 np0005538513.localdomain sudo[196802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbielqlbtbetbaimskyqysbsdjxkmnwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322066.6177652-2009-99765664485627/AnsiballZ_file.py
Nov 28 09:27:46 np0005538513.localdomain sudo[196802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:47 np0005538513.localdomain python3.9[196804]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:47 np0005538513.localdomain sudo[196802]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21379 DF PROTO=TCP SPT=47250 DPT=9882 SEQ=3527134190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18F0820000000001030307) 
Nov 28 09:27:47 np0005538513.localdomain sudo[196912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apdfpfrbaouvzgxnbwyedmflvntaztqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322067.257495-2009-254187610851680/AnsiballZ_file.py
Nov 28 09:27:47 np0005538513.localdomain sudo[196912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:47 np0005538513.localdomain python3.9[196914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:47 np0005538513.localdomain sudo[196912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:48 np0005538513.localdomain sudo[197022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqfabhlebhmtoqdlnarqhisznnjvxwab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322067.8800628-2009-44151520470441/AnsiballZ_file.py
Nov 28 09:27:48 np0005538513.localdomain sudo[197022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:48 np0005538513.localdomain python3.9[197024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:48 np0005538513.localdomain sudo[197022]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:48 np0005538513.localdomain sudo[197132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-totzxezvzvpotcnkxyximndxsxkcmsqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322068.503098-2009-177169309125576/AnsiballZ_file.py
Nov 28 09:27:48 np0005538513.localdomain sudo[197132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:48 np0005538513.localdomain python3.9[197134]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:48 np0005538513.localdomain sudo[197132]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59164 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18F8420000000001030307) 
Nov 28 09:27:49 np0005538513.localdomain sudo[197242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciujaixnhzgikzbmovyfyyavdbqztjqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322069.1233273-2009-239634146252009/AnsiballZ_file.py
Nov 28 09:27:49 np0005538513.localdomain sudo[197242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:49 np0005538513.localdomain python3.9[197244]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:49 np0005538513.localdomain sudo[197242]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:50 np0005538513.localdomain sudo[197352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvbvmaehkxvrhtcferrnzoxeohuzzrux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322069.7800531-2009-64701021758372/AnsiballZ_file.py
Nov 28 09:27:50 np0005538513.localdomain sudo[197352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:50 np0005538513.localdomain python3.9[197354]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:50 np0005538513.localdomain sudo[197352]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:50 np0005538513.localdomain sudo[197462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swxukjigjbhmfntawrbjnwxujskiaypz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322070.4759967-2009-94947382877585/AnsiballZ_file.py
Nov 28 09:27:50 np0005538513.localdomain sudo[197462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:27:50.779 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:27:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:27:50.780 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:27:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:27:50.782 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:27:50 np0005538513.localdomain python3.9[197464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:50 np0005538513.localdomain sudo[197462]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59165 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1900420000000001030307) 
Nov 28 09:27:51 np0005538513.localdomain sudo[197572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bikszyrhrktszujmhgclqctndhqbvpty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322071.0750055-2009-59309254066919/AnsiballZ_file.py
Nov 28 09:27:51 np0005538513.localdomain sudo[197572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:51 np0005538513.localdomain python3.9[197574]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:51 np0005538513.localdomain sudo[197572]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:27:51 np0005538513.localdomain podman[197617]: 2025-11-28 09:27:51.853324343 +0000 UTC m=+0.087123405 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:27:51 np0005538513.localdomain podman[197617]: 2025-11-28 09:27:51.923908838 +0000 UTC m=+0.157707940 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:27:51 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:27:52 np0005538513.localdomain sudo[197707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvulhjftsewktgbfznjlvbxhsdjcwlhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322071.7251482-2009-232010509604994/AnsiballZ_file.py
Nov 28 09:27:52 np0005538513.localdomain sudo[197707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:52 np0005538513.localdomain python3.9[197709]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:52 np0005538513.localdomain sudo[197707]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:52 np0005538513.localdomain sudo[197817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kymmwnbwnvjpitffcxucytrgqffmomic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322072.4140368-2009-219618440203025/AnsiballZ_file.py
Nov 28 09:27:52 np0005538513.localdomain sudo[197817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:27:52 np0005538513.localdomain podman[197820]: 2025-11-28 09:27:52.807753783 +0000 UTC m=+0.072140814 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:27:52 np0005538513.localdomain podman[197820]: 2025-11-28 09:27:52.843310198 +0000 UTC m=+0.107697149 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:27:52 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:27:52 np0005538513.localdomain python3.9[197819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:52 np0005538513.localdomain sudo[197817]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:53 np0005538513.localdomain sudo[197945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjbsidjktrfzcrdpmovrqahdlwrrdapk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322073.3285704-2306-277679658601667/AnsiballZ_stat.py
Nov 28 09:27:53 np0005538513.localdomain sudo[197945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:53 np0005538513.localdomain python3.9[197947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:53 np0005538513.localdomain sudo[197945]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:54 np0005538513.localdomain sudo[198033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eubuyefaagbuaivhtybdlamurewluvre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322073.3285704-2306-277679658601667/AnsiballZ_copy.py
Nov 28 09:27:54 np0005538513.localdomain sudo[198033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:55 np0005538513.localdomain python3.9[198035]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322073.3285704-2306-277679658601667/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:55 np0005538513.localdomain sudo[198033]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59166 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1910020000000001030307) 
Nov 28 09:27:55 np0005538513.localdomain sudo[198143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhztrjxprsalhkelthsrnoprjkqhejyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322075.3073146-2306-273706696229909/AnsiballZ_stat.py
Nov 28 09:27:55 np0005538513.localdomain sudo[198143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:55 np0005538513.localdomain python3.9[198145]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:55 np0005538513.localdomain sudo[198143]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:57 np0005538513.localdomain sudo[198231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aazijblszrhkuftsiwftwyjnqhczjxtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322075.3073146-2306-273706696229909/AnsiballZ_copy.py
Nov 28 09:27:57 np0005538513.localdomain sudo[198231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:57 np0005538513.localdomain python3.9[198233]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322075.3073146-2306-273706696229909/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:57 np0005538513.localdomain sudo[198231]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:57 np0005538513.localdomain sudo[198341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvclivzurmobflzxouqnvyldpkcyvcts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322077.3913124-2306-269691063136303/AnsiballZ_stat.py
Nov 28 09:27:57 np0005538513.localdomain sudo[198341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:57 np0005538513.localdomain python3.9[198343]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:57 np0005538513.localdomain sudo[198341]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8493 DF PROTO=TCP SPT=45770 DPT=9101 SEQ=2243947299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB191B090000000001030307) 
Nov 28 09:27:58 np0005538513.localdomain sudo[198429]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlqoizuhdfkvnbcrlwijdbkcihfpmted ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322077.3913124-2306-269691063136303/AnsiballZ_copy.py
Nov 28 09:27:58 np0005538513.localdomain sudo[198429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:58 np0005538513.localdomain python3.9[198431]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322077.3913124-2306-269691063136303/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:58 np0005538513.localdomain sudo[198429]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:58 np0005538513.localdomain sudo[198539]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzrwgslauenialawnmtprxxjnwxnizeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322078.6479716-2306-144957297635360/AnsiballZ_stat.py
Nov 28 09:27:58 np0005538513.localdomain sudo[198539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:59 np0005538513.localdomain python3.9[198541]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:59 np0005538513.localdomain sudo[198539]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:59 np0005538513.localdomain sudo[198627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vofhujrcunwjrzzgygasllcbcpcmcxnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322078.6479716-2306-144957297635360/AnsiballZ_copy.py
Nov 28 09:27:59 np0005538513.localdomain sudo[198627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:59 np0005538513.localdomain python3.9[198629]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322078.6479716-2306-144957297635360/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:59 np0005538513.localdomain sudo[198627]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:00 np0005538513.localdomain sudo[198737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmzcnokdlbidceicrclkatuomwpqxjcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322079.8203244-2306-92788893361626/AnsiballZ_stat.py
Nov 28 09:28:00 np0005538513.localdomain sudo[198737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:00 np0005538513.localdomain python3.9[198739]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:00 np0005538513.localdomain sudo[198737]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:00 np0005538513.localdomain sudo[198825]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zznngwmueobvnabldaiafhitokdfkqfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322079.8203244-2306-92788893361626/AnsiballZ_copy.py
Nov 28 09:28:00 np0005538513.localdomain sudo[198825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:00 np0005538513.localdomain python3.9[198827]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322079.8203244-2306-92788893361626/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:00 np0005538513.localdomain sudo[198825]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8495 DF PROTO=TCP SPT=45770 DPT=9101 SEQ=2243947299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1927020000000001030307) 
Nov 28 09:28:01 np0005538513.localdomain sudo[198935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdcuojrkcbwyvlujtjuefwgfdnkdqhrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322081.0852919-2306-229161486077059/AnsiballZ_stat.py
Nov 28 09:28:01 np0005538513.localdomain sudo[198935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:01 np0005538513.localdomain python3.9[198937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:01 np0005538513.localdomain sudo[198935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:01 np0005538513.localdomain sudo[199023]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdtrlrdtfcojsqmtycrzyapsporjvigm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322081.0852919-2306-229161486077059/AnsiballZ_copy.py
Nov 28 09:28:01 np0005538513.localdomain sudo[199023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:02 np0005538513.localdomain python3.9[199025]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322081.0852919-2306-229161486077059/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:02 np0005538513.localdomain sudo[199023]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:02 np0005538513.localdomain sudo[199133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlqupwnerbgksnmekpbsnpkphpbtbquc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322082.2459536-2306-41145762995919/AnsiballZ_stat.py
Nov 28 09:28:02 np0005538513.localdomain sudo[199133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:02 np0005538513.localdomain python3.9[199135]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:02 np0005538513.localdomain sudo[199133]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:03 np0005538513.localdomain sudo[199221]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwedmfxhdfivwvkpryqvlwwcyzlcidqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322082.2459536-2306-41145762995919/AnsiballZ_copy.py
Nov 28 09:28:03 np0005538513.localdomain sudo[199221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:03 np0005538513.localdomain python3.9[199223]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322082.2459536-2306-41145762995919/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:03 np0005538513.localdomain sudo[199221]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59167 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB192F820000000001030307) 
Nov 28 09:28:03 np0005538513.localdomain sudo[199331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rftitbraoeqlvycncjxrnfnlauwsnhek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322083.4281278-2306-30012057901279/AnsiballZ_stat.py
Nov 28 09:28:03 np0005538513.localdomain sudo[199331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:03 np0005538513.localdomain python3.9[199333]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:03 np0005538513.localdomain sudo[199331]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:04 np0005538513.localdomain sudo[199419]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kumalddzortvewrkiwgwhnbtqlbayqdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322083.4281278-2306-30012057901279/AnsiballZ_copy.py
Nov 28 09:28:04 np0005538513.localdomain sudo[199419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:04 np0005538513.localdomain python3.9[199421]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322083.4281278-2306-30012057901279/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:04 np0005538513.localdomain sudo[199419]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:04 np0005538513.localdomain sudo[199529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwopedaextniouhtvviujvfqqszlvyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322084.7025983-2306-39272947968528/AnsiballZ_stat.py
Nov 28 09:28:04 np0005538513.localdomain sudo[199529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:05 np0005538513.localdomain python3.9[199531]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:05 np0005538513.localdomain sudo[199529]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:05 np0005538513.localdomain sudo[199617]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwfsaferovdlagigocvknpwilquntaas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322084.7025983-2306-39272947968528/AnsiballZ_copy.py
Nov 28 09:28:05 np0005538513.localdomain sudo[199617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:05 np0005538513.localdomain python3.9[199619]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322084.7025983-2306-39272947968528/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:05 np0005538513.localdomain sudo[199617]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:06 np0005538513.localdomain sudo[199727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehfxesaxumukaacqoevwsltfxgmsbpls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322085.9570284-2306-52237337935485/AnsiballZ_stat.py
Nov 28 09:28:06 np0005538513.localdomain sudo[199727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13893 DF PROTO=TCP SPT=41638 DPT=9102 SEQ=1384361588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB193B820000000001030307) 
Nov 28 09:28:06 np0005538513.localdomain python3.9[199729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:06 np0005538513.localdomain sudo[199727]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:06 np0005538513.localdomain sudo[199815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biesmhqhhmwzvthbtniwnapygyrzcuyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322085.9570284-2306-52237337935485/AnsiballZ_copy.py
Nov 28 09:28:06 np0005538513.localdomain sudo[199815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:06 np0005538513.localdomain python3.9[199817]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322085.9570284-2306-52237337935485/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:06 np0005538513.localdomain sudo[199815]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:07 np0005538513.localdomain sudo[199925]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bejegqndlykbozlsqlvbjdwvrtsoqrjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322087.122525-2306-32809047670558/AnsiballZ_stat.py
Nov 28 09:28:07 np0005538513.localdomain sudo[199925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:07 np0005538513.localdomain python3.9[199927]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:07 np0005538513.localdomain sudo[199925]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:09 np0005538513.localdomain sudo[200013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwtlmhmmryqgzorbwpnlfbdocytpando ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322087.122525-2306-32809047670558/AnsiballZ_copy.py
Nov 28 09:28:09 np0005538513.localdomain sudo[200013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33041 DF PROTO=TCP SPT=37890 DPT=9100 SEQ=2342632123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1947820000000001030307) 
Nov 28 09:28:09 np0005538513.localdomain python3.9[200015]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322087.122525-2306-32809047670558/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:09 np0005538513.localdomain sudo[200013]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:09 np0005538513.localdomain sudo[200123]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqmachbsqslvoltpmqnkylkevjacbkuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322089.681376-2306-266653880391226/AnsiballZ_stat.py
Nov 28 09:28:09 np0005538513.localdomain sudo[200123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:10 np0005538513.localdomain python3.9[200125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:10 np0005538513.localdomain sudo[200123]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:10 np0005538513.localdomain sudo[200211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcmzhwppsqywmkgdirxnfzqqmbatdxwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322089.681376-2306-266653880391226/AnsiballZ_copy.py
Nov 28 09:28:10 np0005538513.localdomain sudo[200211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:10 np0005538513.localdomain python3.9[200213]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322089.681376-2306-266653880391226/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:10 np0005538513.localdomain sudo[200211]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:11 np0005538513.localdomain sudo[200321]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpssttsabqrduockncicgcvtdqjpcxfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322091.4093482-2306-220572158321127/AnsiballZ_stat.py
Nov 28 09:28:11 np0005538513.localdomain sudo[200321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:11 np0005538513.localdomain python3.9[200323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:11 np0005538513.localdomain sudo[200321]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:12 np0005538513.localdomain sudo[200409]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iimsaqfssubsymavgiluoijydlnlyskd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322091.4093482-2306-220572158321127/AnsiballZ_copy.py
Nov 28 09:28:12 np0005538513.localdomain sudo[200409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45370 DF PROTO=TCP SPT=43214 DPT=9100 SEQ=785439338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1953420000000001030307) 
Nov 28 09:28:12 np0005538513.localdomain python3.9[200411]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322091.4093482-2306-220572158321127/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:12 np0005538513.localdomain sudo[200409]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:12 np0005538513.localdomain sudo[200519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjamwvesnqblmpinpuonhezpjbbabyrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322092.648587-2306-23094986687716/AnsiballZ_stat.py
Nov 28 09:28:12 np0005538513.localdomain sudo[200519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:13 np0005538513.localdomain python3.9[200521]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:13 np0005538513.localdomain sudo[200519]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:13 np0005538513.localdomain sudo[200607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khwtgymgwrccqezibpsyidooxxaisgop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322092.648587-2306-23094986687716/AnsiballZ_copy.py
Nov 28 09:28:13 np0005538513.localdomain sudo[200607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:13 np0005538513.localdomain python3.9[200609]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322092.648587-2306-23094986687716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:13 np0005538513.localdomain sudo[200607]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:14 np0005538513.localdomain python3.9[200717]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:15 np0005538513.localdomain sudo[200828]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfavffgftlboiskhxtrzdbzgqorouvke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322094.7294536-2924-210122369751736/AnsiballZ_seboolean.py
Nov 28 09:28:15 np0005538513.localdomain sudo[200828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:15 np0005538513.localdomain python3.9[200830]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 28 09:28:15 np0005538513.localdomain sudo[200828]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:16 np0005538513.localdomain sudo[200938]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqtfezqyhfeszaayvsllcmpiroammqwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322095.9938347-2954-152154345204610/AnsiballZ_systemd.py
Nov 28 09:28:16 np0005538513.localdomain sudo[200938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:16 np0005538513.localdomain python3.9[200940]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:28:16 np0005538513.localdomain systemd-rc-local-generator[200964]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:16 np0005538513.localdomain systemd-sysv-generator[200969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: Starting libvirt logging daemon socket...
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: Starting libvirt logging daemon...
Nov 28 09:28:16 np0005538513.localdomain systemd[1]: Started libvirt logging daemon.
Nov 28 09:28:16 np0005538513.localdomain sudo[200938]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=652 DF PROTO=TCP SPT=60246 DPT=9882 SEQ=806108939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1965C20000000001030307) 
Nov 28 09:28:17 np0005538513.localdomain sudo[201090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuwsvjuqiqybdvsujonsjhmxdnptyhbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322097.125375-2954-83433993190388/AnsiballZ_systemd.py
Nov 28 09:28:17 np0005538513.localdomain sudo[201090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:17 np0005538513.localdomain python3.9[201092]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:28:17 np0005538513.localdomain systemd-sysv-generator[201120]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:17 np0005538513.localdomain systemd-rc-local-generator[201115]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:17 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Started libvirt nodedev daemon.
Nov 28 09:28:18 np0005538513.localdomain sudo[201090]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 09:28:18 np0005538513.localdomain setroubleshoot[201129]: Deleting alert 96d97920-1546-4f45-b9c9-d0d51c7a6a1d, it is allowed in current policy
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Nov 28 09:28:18 np0005538513.localdomain sudo[201272]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysccafqbbescpyrhjjuldcgrljrkxkmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322098.2765362-2954-223559839076582/AnsiballZ_systemd.py
Nov 28 09:28:18 np0005538513.localdomain sudo[201272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:18 np0005538513.localdomain python3.9[201274]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:18 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:28:19 np0005538513.localdomain systemd-rc-local-generator[201300]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:19 np0005538513.localdomain systemd-sysv-generator[201305]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48422 DF PROTO=TCP SPT=44766 DPT=9105 SEQ=3785523750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB196D420000000001030307) 
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Started libvirt proxy daemon.
Nov 28 09:28:19 np0005538513.localdomain sudo[201272]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:19 np0005538513.localdomain setroubleshoot[201129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 7622d5e9-f3a5-42df-957c-0a069946da20
Nov 28 09:28:19 np0005538513.localdomain setroubleshoot[201129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Nov 28 09:28:19 np0005538513.localdomain sudo[201446]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhrwqfmvwpjhpujpltqliqqgetcqwxrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322099.4144197-2954-141042191641014/AnsiballZ_systemd.py
Nov 28 09:28:19 np0005538513.localdomain sudo[201446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:19 np0005538513.localdomain python3.9[201448]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:19 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:28:20 np0005538513.localdomain systemd-rc-local-generator[201469]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:20 np0005538513.localdomain systemd-sysv-generator[201474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 28 09:28:20 np0005538513.localdomain systemd[1]: Started libvirt QEMU daemon.
Nov 28 09:28:20 np0005538513.localdomain sudo[201446]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:20 np0005538513.localdomain sudo[201627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ochcqpdjviwxkrfyobrvyxiwltjxzbqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322100.5037367-2954-8730455117446/AnsiballZ_systemd.py
Nov 28 09:28:20 np0005538513.localdomain sudo[201627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:21 np0005538513.localdomain python3.9[201629]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:28:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48423 DF PROTO=TCP SPT=44766 DPT=9105 SEQ=3785523750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1975420000000001030307) 
Nov 28 09:28:21 np0005538513.localdomain systemd-rc-local-generator[201658]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:21 np0005538513.localdomain systemd-sysv-generator[201662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Starting libvirt secret daemon socket...
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 28 09:28:21 np0005538513.localdomain systemd[1]: Started libvirt secret daemon.
Nov 28 09:28:21 np0005538513.localdomain sudo[201627]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:28:22 np0005538513.localdomain podman[201719]: 2025-11-28 09:28:22.851108702 +0000 UTC m=+0.086672131 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:28:22 np0005538513.localdomain podman[201719]: 2025-11-28 09:28:22.88573679 +0000 UTC m=+0.121300279 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:28:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:28:22 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:28:22 np0005538513.localdomain systemd[1]: tmp-crun.MtYW9m.mount: Deactivated successfully.
Nov 28 09:28:22 np0005538513.localdomain podman[201744]: 2025-11-28 09:28:22.994395968 +0000 UTC m=+0.085211017 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 09:28:23 np0005538513.localdomain podman[201744]: 2025-11-28 09:28:23.024468594 +0000 UTC m=+0.115283633 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:28:23 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:28:23 np0005538513.localdomain sudo[201852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmjzaxxvndjminifggthcmeehdjujyqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322103.059326-3065-225023946969396/AnsiballZ_file.py
Nov 28 09:28:23 np0005538513.localdomain sudo[201852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:23 np0005538513.localdomain python3.9[201854]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:23 np0005538513.localdomain sudo[201852]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:23 np0005538513.localdomain sudo[201962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpsjselngedrmknerzjvwsndybvzchiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322103.7052355-3089-178928200930550/AnsiballZ_find.py
Nov 28 09:28:23 np0005538513.localdomain sudo[201962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:24 np0005538513.localdomain python3.9[201964]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:28:24 np0005538513.localdomain sudo[201962]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48424 DF PROTO=TCP SPT=44766 DPT=9105 SEQ=3785523750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1985030000000001030307) 
Nov 28 09:28:25 np0005538513.localdomain sudo[202072]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goubrvlhqsegwetrqddzgdqewelxpgkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322105.3625119-3113-83345083680385/AnsiballZ_command.py
Nov 28 09:28:25 np0005538513.localdomain sudo[202072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:25 np0005538513.localdomain python3.9[202074]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:25 np0005538513.localdomain sudo[202072]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:26 np0005538513.localdomain python3.9[202186]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:28:27 np0005538513.localdomain python3.9[202294]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:28 np0005538513.localdomain python3.9[202380]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322107.030235-3170-23267115983596/.source.xml follow=False _original_basename=secret.xml.j2 checksum=817431989b0a3ade349fa0105099056ad78b021d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17044 DF PROTO=TCP SPT=47494 DPT=9101 SEQ=3657177445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19903A0000000001030307) 
Nov 28 09:28:28 np0005538513.localdomain sudo[202488]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czsbaiipedxhtodmepbhytkdgnpqwuit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322108.2202573-3215-116860066017337/AnsiballZ_command.py
Nov 28 09:28:28 np0005538513.localdomain sudo[202488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:28 np0005538513.localdomain python3.9[202490]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 2c5417c9-00eb-57d5-a565-ddecbc7995c1
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:28 np0005538513.localdomain polkitd[1036]: Registered Authentication Agent for unix-process:202492:994286 (system bus name :1.2830 [pkttyagent --process 202492 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 28 09:28:28 np0005538513.localdomain polkitd[1036]: Unregistered Authentication Agent for unix-process:202492:994286 (system bus name :1.2830, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 28 09:28:28 np0005538513.localdomain polkitd[1036]: Registered Authentication Agent for unix-process:202491:994286 (system bus name :1.2831 [pkttyagent --process 202491 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 28 09:28:28 np0005538513.localdomain polkitd[1036]: Unregistered Authentication Agent for unix-process:202491:994286 (system bus name :1.2831, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 28 09:28:29 np0005538513.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Nov 28 09:28:29 np0005538513.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 09:28:29 np0005538513.localdomain sudo[202488]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:30 np0005538513.localdomain python3.9[202611]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:31 np0005538513.localdomain sudo[202719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brmwtbznwblktwwvhgdxusrumlzanlxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322110.7821455-3263-166341072001955/AnsiballZ_command.py
Nov 28 09:28:31 np0005538513.localdomain sudo[202719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17046 DF PROTO=TCP SPT=47494 DPT=9101 SEQ=3657177445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB199C420000000001030307) 
Nov 28 09:28:31 np0005538513.localdomain sudo[202719]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:31 np0005538513.localdomain sudo[202830]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-earrwkqvwabwkyxltmgmqmubifxhghvp ; FSID=2c5417c9-00eb-57d5-a565-ddecbc7995c1 KEY=AQD7UylpAAAAABAAFA51EB/tlSHSRoK3+SF42Q== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322111.5021088-3287-85719428339685/AnsiballZ_command.py
Nov 28 09:28:31 np0005538513.localdomain sudo[202830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:32 np0005538513.localdomain polkitd[1036]: Registered Authentication Agent for unix-process:202833:994614 (system bus name :1.2834 [pkttyagent --process 202833 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 28 09:28:32 np0005538513.localdomain polkitd[1036]: Unregistered Authentication Agent for unix-process:202833:994614 (system bus name :1.2834, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 28 09:28:32 np0005538513.localdomain sudo[202830]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:32 np0005538513.localdomain sudo[202946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujbakssosvjswzaibqqlwxhcgukwyqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322112.2667797-3311-59644347211307/AnsiballZ_copy.py
Nov 28 09:28:32 np0005538513.localdomain sudo[202946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:32 np0005538513.localdomain python3.9[202948]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:32 np0005538513.localdomain sudo[202946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:33 np0005538513.localdomain sudo[203056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptknvxcqwlvbwkmkrdlcfdgbyfyvjwtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322113.0027072-3335-188960797998421/AnsiballZ_stat.py
Nov 28 09:28:33 np0005538513.localdomain sudo[203056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25871 DF PROTO=TCP SPT=52464 DPT=9102 SEQ=3433454233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19A5420000000001030307) 
Nov 28 09:28:33 np0005538513.localdomain python3.9[203058]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:33 np0005538513.localdomain sudo[203056]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:33 np0005538513.localdomain sudo[203144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izrgmkdrhdhhnnzueejkqmxyymaotzzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322113.0027072-3335-188960797998421/AnsiballZ_copy.py
Nov 28 09:28:33 np0005538513.localdomain sudo[203144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:34 np0005538513.localdomain python3.9[203146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322113.0027072-3335-188960797998421/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:34 np0005538513.localdomain sudo[203144]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:28:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 09:28:34 np0005538513.localdomain sudo[203254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqqtjjjeepfhoglmlbtrmeowigiromjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322114.4833958-3383-193401447091394/AnsiballZ_file.py
Nov 28 09:28:34 np0005538513.localdomain sudo[203254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:34 np0005538513.localdomain python3.9[203256]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:34 np0005538513.localdomain sudo[203254]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:36 np0005538513.localdomain sudo[203364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahgfetgydxawkpohqllghxjgbcyyabcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322115.1892395-3407-117454912723166/AnsiballZ_stat.py
Nov 28 09:28:36 np0005538513.localdomain sudo[203364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:36 np0005538513.localdomain sshd[203367]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:28:36 np0005538513.localdomain python3.9[203366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:36 np0005538513.localdomain sudo[203364]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57212 DF PROTO=TCP SPT=33054 DPT=9100 SEQ=2612205495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19B0C30000000001030307) 
Nov 28 09:28:36 np0005538513.localdomain sudo[203423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdszlyczechhdmwjfempailsezvbcaze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322115.1892395-3407-117454912723166/AnsiballZ_file.py
Nov 28 09:28:36 np0005538513.localdomain sudo[203423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:36 np0005538513.localdomain python3.9[203425]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:36 np0005538513.localdomain sudo[203423]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:36 np0005538513.localdomain sshd[203367]: Received disconnect from 80.94.93.233 port 29458:11:  [preauth]
Nov 28 09:28:36 np0005538513.localdomain sshd[203367]: Disconnected from authenticating user root 80.94.93.233 port 29458 [preauth]
Nov 28 09:28:37 np0005538513.localdomain sudo[203533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuhslchnqnnekboqzizwfmfdgfidgwkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322116.984516-3443-26321065371403/AnsiballZ_stat.py
Nov 28 09:28:37 np0005538513.localdomain sudo[203533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:37 np0005538513.localdomain python3.9[203535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:37 np0005538513.localdomain sudo[203533]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:38 np0005538513.localdomain sudo[203590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlspgqadnzdnfqxnanypcinmzpogbcwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322116.984516-3443-26321065371403/AnsiballZ_file.py
Nov 28 09:28:38 np0005538513.localdomain sudo[203590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:38 np0005538513.localdomain python3.9[203592]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.i0b36f21 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:38 np0005538513.localdomain sudo[203590]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:38 np0005538513.localdomain sudo[203700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lukjsdgisivavrjasunkftlllsewyfbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322118.3964388-3479-95120217457590/AnsiballZ_stat.py
Nov 28 09:28:38 np0005538513.localdomain sudo[203700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:38 np0005538513.localdomain python3.9[203702]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:38 np0005538513.localdomain sudo[203700]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:39 np0005538513.localdomain sudo[203711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:28:39 np0005538513.localdomain sudo[203711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:28:39 np0005538513.localdomain sudo[203711]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:39 np0005538513.localdomain sudo[203744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:28:39 np0005538513.localdomain sudo[203744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:28:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1160 DF PROTO=TCP SPT=37064 DPT=9100 SEQ=1146749385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19BB820000000001030307) 
Nov 28 09:28:39 np0005538513.localdomain sudo[203793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsodrixxbfyafxxckejfiklgyoghspyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322118.3964388-3479-95120217457590/AnsiballZ_file.py
Nov 28 09:28:39 np0005538513.localdomain sudo[203793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:39 np0005538513.localdomain python3.9[203795]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:39 np0005538513.localdomain sudo[203793]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:28:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.3 total, 600.0 interval
                                                          Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5584391482d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.032       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.3 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 09:28:39 np0005538513.localdomain sudo[203744]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:40 np0005538513.localdomain sudo[203934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqngyflwpqzwiwdayodgyombzylmppmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322119.731244-3518-8142864495003/AnsiballZ_command.py
Nov 28 09:28:40 np0005538513.localdomain sudo[203934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:40 np0005538513.localdomain python3.9[203936]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:40 np0005538513.localdomain sudo[203934]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:40 np0005538513.localdomain sudo[203938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:28:40 np0005538513.localdomain sudo[203938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:28:40 np0005538513.localdomain sudo[203938]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:40 np0005538513.localdomain sudo[204063]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdzrmrtiesdbyefbunfinnyjelvdancp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322120.4941368-3542-139742984776175/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:28:40 np0005538513.localdomain sudo[204063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:41 np0005538513.localdomain python3[204065]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:28:41 np0005538513.localdomain sudo[204063]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:41 np0005538513.localdomain sudo[204173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxpioovehzlpbcyywlgwpsmwwmyzexjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322121.345324-3566-46614013207917/AnsiballZ_stat.py
Nov 28 09:28:41 np0005538513.localdomain sudo[204173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:41 np0005538513.localdomain python3.9[204175]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:41 np0005538513.localdomain sudo[204173]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:42 np0005538513.localdomain sudo[204230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjzhjmxrswqnjgkmbcdujfleijxilldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322121.345324-3566-46614013207917/AnsiballZ_file.py
Nov 28 09:28:42 np0005538513.localdomain sudo[204230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:42 np0005538513.localdomain python3.9[204232]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:42 np0005538513.localdomain sudo[204230]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57214 DF PROTO=TCP SPT=33054 DPT=9100 SEQ=2612205495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19C8830000000001030307) 
Nov 28 09:28:42 np0005538513.localdomain sudo[204340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyyrrksncvodbnblxzwxfbgrjwesbpee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322122.5703113-3602-265751134635162/AnsiballZ_stat.py
Nov 28 09:28:42 np0005538513.localdomain sudo[204340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:43 np0005538513.localdomain python3.9[204342]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:43 np0005538513.localdomain sudo[204340]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:43 np0005538513.localdomain sudo[204397]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imubxutsktasrwvodiwecrtoxbpcrigl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322122.5703113-3602-265751134635162/AnsiballZ_file.py
Nov 28 09:28:43 np0005538513.localdomain sudo[204397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:43 np0005538513.localdomain python3.9[204399]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:43 np0005538513.localdomain sudo[204397]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:44 np0005538513.localdomain sudo[204507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekagbicrfzooowcdyaicxnkjaefekvme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322123.8079145-3638-168950214871612/AnsiballZ_stat.py
Nov 28 09:28:44 np0005538513.localdomain sudo[204507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:44 np0005538513.localdomain python3.9[204509]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:44 np0005538513.localdomain sudo[204507]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:44 np0005538513.localdomain sudo[204564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfqrzkgshubhvlecbhgvoixcnmquatdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322123.8079145-3638-168950214871612/AnsiballZ_file.py
Nov 28 09:28:44 np0005538513.localdomain sudo[204564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:44 np0005538513.localdomain python3.9[204566]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:44 np0005538513.localdomain sudo[204564]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:45 np0005538513.localdomain sudo[204674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrburmgdksmzigvvaxgqfbezyrdrwmya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322125.0373018-3674-126979776832530/AnsiballZ_stat.py
Nov 28 09:28:45 np0005538513.localdomain sudo[204674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:45 np0005538513.localdomain python3.9[204676]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:45 np0005538513.localdomain sudo[204674]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:45 np0005538513.localdomain sudo[204731]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxnotdvghjvhfrgbtapytfnxljtlqngd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322125.0373018-3674-126979776832530/AnsiballZ_file.py
Nov 28 09:28:45 np0005538513.localdomain sudo[204731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:45 np0005538513.localdomain python3.9[204733]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:45 np0005538513.localdomain sudo[204731]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:46 np0005538513.localdomain sudo[204841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlswljiyntdhgycxidtohxhoinbvtkeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322126.3687296-3710-113660550658151/AnsiballZ_stat.py
Nov 28 09:28:46 np0005538513.localdomain sudo[204841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:46 np0005538513.localdomain python3.9[204843]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:46 np0005538513.localdomain sudo[204841]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24756 DF PROTO=TCP SPT=41768 DPT=9882 SEQ=3860430018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19DB020000000001030307) 
Nov 28 09:28:47 np0005538513.localdomain sudo[204931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbsizvenwxzlsagxzqoltbfkctfoljgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322126.3687296-3710-113660550658151/AnsiballZ_copy.py
Nov 28 09:28:47 np0005538513.localdomain sudo[204931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:47 np0005538513.localdomain python3.9[204933]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322126.3687296-3710-113660550658151/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:47 np0005538513.localdomain sudo[204931]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:48 np0005538513.localdomain sudo[205041]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ranmfdxfvtppsvdwiztcmhaivgcjpmia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322127.7896736-3755-49770020011923/AnsiballZ_file.py
Nov 28 09:28:48 np0005538513.localdomain sudo[205041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:48 np0005538513.localdomain python3.9[205043]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:48 np0005538513.localdomain sudo[205041]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12056 DF PROTO=TCP SPT=60146 DPT=9105 SEQ=2127023387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19E2830000000001030307) 
Nov 28 09:28:49 np0005538513.localdomain sudo[205151]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rurqhtugbqatsatujnqhsoebscssrign ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322128.980044-3779-227534640219943/AnsiballZ_command.py
Nov 28 09:28:49 np0005538513.localdomain sudo[205151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:49 np0005538513.localdomain python3.9[205153]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:49 np0005538513.localdomain sudo[205151]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:50 np0005538513.localdomain sudo[205264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrfqsmqvpfddmlsocydlafgswgnsvqtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322129.6755183-3803-238495455215918/AnsiballZ_blockinfile.py
Nov 28 09:28:50 np0005538513.localdomain sudo[205264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:50 np0005538513.localdomain python3.9[205266]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:50 np0005538513.localdomain sudo[205264]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:28:50.781 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:28:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:28:50.782 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:28:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:28:50.783 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:28:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12057 DF PROTO=TCP SPT=60146 DPT=9105 SEQ=2127023387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19EA820000000001030307) 
Nov 28 09:28:51 np0005538513.localdomain sudo[205374]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiofrasfrpuucudhbtpolxvluwjpruwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322131.3093562-3830-272280972055720/AnsiballZ_command.py
Nov 28 09:28:51 np0005538513.localdomain sudo[205374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:51 np0005538513.localdomain python3.9[205376]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:51 np0005538513.localdomain sudo[205374]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:52 np0005538513.localdomain sudo[205485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fylmkjrnwpovwaaxgdydomvcjfuhgdgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322131.9729192-3854-33070941749537/AnsiballZ_stat.py
Nov 28 09:28:52 np0005538513.localdomain sudo[205485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:52 np0005538513.localdomain python3.9[205487]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:28:52 np0005538513.localdomain sudo[205485]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:52 np0005538513.localdomain sudo[205597]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qieggnlsxvbjgdawswftxjqbtkzasxgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322132.6825182-3878-58733666706274/AnsiballZ_command.py
Nov 28 09:28:52 np0005538513.localdomain sudo[205597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:28:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:28:53 np0005538513.localdomain systemd[1]: tmp-crun.QgekrN.mount: Deactivated successfully.
Nov 28 09:28:53 np0005538513.localdomain podman[205600]: 2025-11-28 09:28:53.092697346 +0000 UTC m=+0.100160106 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:28:53 np0005538513.localdomain podman[205600]: 2025-11-28 09:28:53.137867976 +0000 UTC m=+0.145330716 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:28:53 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:28:53 np0005538513.localdomain python3.9[205599]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:53 np0005538513.localdomain sudo[205597]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:53 np0005538513.localdomain podman[205616]: 2025-11-28 09:28:53.232412064 +0000 UTC m=+0.136067204 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:28:53 np0005538513.localdomain podman[205616]: 2025-11-28 09:28:53.26249883 +0000 UTC m=+0.166153960 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 09:28:53 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:28:53 np0005538513.localdomain sudo[205753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmqdcxkxckfhdiujdqzoqqkmbrdxrrqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322133.3852918-3902-139937128928907/AnsiballZ_file.py
Nov 28 09:28:53 np0005538513.localdomain sudo[205753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:53 np0005538513.localdomain python3.9[205755]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:53 np0005538513.localdomain sudo[205753]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:54 np0005538513.localdomain sudo[205863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnnnpuotwpxuxvrtyimnhemsqhakukvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322134.1404881-3926-157309330453287/AnsiballZ_stat.py
Nov 28 09:28:54 np0005538513.localdomain sudo[205863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:54 np0005538513.localdomain python3.9[205865]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:54 np0005538513.localdomain sudo[205863]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:54 np0005538513.localdomain sudo[205951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dinxlbrnhhzlxbbzjnfbwlvluazojhmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322134.1404881-3926-157309330453287/AnsiballZ_copy.py
Nov 28 09:28:54 np0005538513.localdomain sudo[205951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:55 np0005538513.localdomain python3.9[205953]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322134.1404881-3926-157309330453287/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:55 np0005538513.localdomain sudo[205951]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12058 DF PROTO=TCP SPT=60146 DPT=9105 SEQ=2127023387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19FA430000000001030307) 
Nov 28 09:28:56 np0005538513.localdomain sudo[206061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aowsrmflbgassxylranjhuilrfbmzxhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322135.803836-3971-237242952264560/AnsiballZ_stat.py
Nov 28 09:28:56 np0005538513.localdomain sudo[206061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:56 np0005538513.localdomain python3.9[206063]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:56 np0005538513.localdomain sudo[206061]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:56 np0005538513.localdomain sudo[206149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iftmsxbjktabyiqtzitobvngbncbywbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322135.803836-3971-237242952264560/AnsiballZ_copy.py
Nov 28 09:28:56 np0005538513.localdomain sudo[206149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:56 np0005538513.localdomain python3.9[206151]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322135.803836-3971-237242952264560/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:56 np0005538513.localdomain sudo[206149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:57 np0005538513.localdomain sudo[206259]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifynhofncpyvirbblabblqeakobddxan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322137.0955102-4016-177001413400419/AnsiballZ_stat.py
Nov 28 09:28:57 np0005538513.localdomain sudo[206259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:57 np0005538513.localdomain python3.9[206261]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:57 np0005538513.localdomain sudo[206259]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:57 np0005538513.localdomain sudo[206347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbocrmdgifwqndznoxtrpywvvhmaivsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322137.0955102-4016-177001413400419/AnsiballZ_copy.py
Nov 28 09:28:57 np0005538513.localdomain sudo[206347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46199 DF PROTO=TCP SPT=52532 DPT=9101 SEQ=2886924465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A05690000000001030307) 
Nov 28 09:28:58 np0005538513.localdomain python3.9[206349]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322137.0955102-4016-177001413400419/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:58 np0005538513.localdomain sudo[206347]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:58 np0005538513.localdomain sudo[206457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ullfyjfxzxooctlfromzoswcasosqmow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322138.3020892-4061-225476332430747/AnsiballZ_systemd.py
Nov 28 09:28:58 np0005538513.localdomain sudo[206457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:58 np0005538513.localdomain python3.9[206459]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:28:58 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:28:58 np0005538513.localdomain systemd-sysv-generator[206485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:59 np0005538513.localdomain systemd-rc-local-generator[206480]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538513.localdomain systemd[1]: Reached target edpm_libvirt.target.
Nov 28 09:28:59 np0005538513.localdomain sudo[206457]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:00 np0005538513.localdomain sudo[206606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vorkddkxwgutpxszbcwhqckhhkmquiuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322140.1553996-4085-163374551334974/AnsiballZ_systemd.py
Nov 28 09:29:00 np0005538513.localdomain sudo[206606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:00 np0005538513.localdomain python3.9[206608]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:29:00 np0005538513.localdomain systemd-sysv-generator[206636]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:29:00 np0005538513.localdomain systemd-rc-local-generator[206632]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:29:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46201 DF PROTO=TCP SPT=52532 DPT=9101 SEQ=2886924465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A11820000000001030307) 
Nov 28 09:29:01 np0005538513.localdomain systemd-sysv-generator[206676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:29:01 np0005538513.localdomain systemd-rc-local-generator[206670]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538513.localdomain sudo[206606]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:02 np0005538513.localdomain sshd[158270]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:29:02 np0005538513.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Nov 28 09:29:02 np0005538513.localdomain systemd[1]: session-52.scope: Consumed 3min 37.274s CPU time.
Nov 28 09:29:02 np0005538513.localdomain systemd-logind[764]: Session 52 logged out. Waiting for processes to exit.
Nov 28 09:29:02 np0005538513.localdomain systemd-logind[764]: Removed session 52.
Nov 28 09:29:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32858 DF PROTO=TCP SPT=37284 DPT=9102 SEQ=3188745367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A1A420000000001030307) 
Nov 28 09:29:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65121 DF PROTO=TCP SPT=50028 DPT=9102 SEQ=2430455319 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A25820000000001030307) 
Nov 28 09:29:08 np0005538513.localdomain sshd[206699]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:29:08 np0005538513.localdomain sshd[206699]: Accepted publickey for zuul from 192.168.122.30 port 41330 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:29:08 np0005538513.localdomain systemd-logind[764]: New session 53 of user zuul.
Nov 28 09:29:08 np0005538513.localdomain systemd[1]: Started Session 53 of User zuul.
Nov 28 09:29:08 np0005538513.localdomain sshd[206699]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:29:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45373 DF PROTO=TCP SPT=43214 DPT=9100 SEQ=785439338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A31820000000001030307) 
Nov 28 09:29:09 np0005538513.localdomain python3.9[206810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:29:11 np0005538513.localdomain python3.9[206922]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:29:11 np0005538513.localdomain network[206939]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:29:11 np0005538513.localdomain network[206940]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:29:11 np0005538513.localdomain network[206941]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:29:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14522 DF PROTO=TCP SPT=49530 DPT=9100 SEQ=3191443101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A3DC20000000001030307) 
Nov 28 09:29:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:17 np0005538513.localdomain sudo[207171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmwfyljbaxhtdjsxyyruscuqbriwfnqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322156.7818153-101-68744285231564/AnsiballZ_setup.py
Nov 28 09:29:17 np0005538513.localdomain sudo[207171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7421 DF PROTO=TCP SPT=46362 DPT=9882 SEQ=2306459442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A50430000000001030307) 
Nov 28 09:29:17 np0005538513.localdomain python3.9[207173]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:29:17 np0005538513.localdomain sudo[207171]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:18 np0005538513.localdomain sudo[207234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzsyaxaxxpngpsehuonnhomdiymcztig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322156.7818153-101-68744285231564/AnsiballZ_dnf.py
Nov 28 09:29:18 np0005538513.localdomain sudo[207234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:18 np0005538513.localdomain python3.9[207236]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:29:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38734 DF PROTO=TCP SPT=50730 DPT=9105 SEQ=3294590754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A57C20000000001030307) 
Nov 28 09:29:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38735 DF PROTO=TCP SPT=50730 DPT=9105 SEQ=3294590754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A5FC20000000001030307) 
Nov 28 09:29:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:29:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:29:23 np0005538513.localdomain systemd[1]: tmp-crun.KrZMuq.mount: Deactivated successfully.
Nov 28 09:29:23 np0005538513.localdomain podman[207239]: 2025-11-28 09:29:23.858262462 +0000 UTC m=+0.096063149 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:29:23 np0005538513.localdomain podman[207240]: 2025-11-28 09:29:23.90119606 +0000 UTC m=+0.138118929 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 28 09:29:23 np0005538513.localdomain podman[207240]: 2025-11-28 09:29:23.910361777 +0000 UTC m=+0.147284676 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:29:23 np0005538513.localdomain podman[207239]: 2025-11-28 09:29:23.920060402 +0000 UTC m=+0.157861119 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:29:23 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:29:23 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:29:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38736 DF PROTO=TCP SPT=50730 DPT=9105 SEQ=3294590754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A6F820000000001030307) 
Nov 28 09:29:25 np0005538513.localdomain sudo[207234]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:26 np0005538513.localdomain sudo[207394]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpkkixctdxhdxpgwyxmrcabtfaivutzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322165.6565008-137-48759531219565/AnsiballZ_stat.py
Nov 28 09:29:26 np0005538513.localdomain sudo[207394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:26 np0005538513.localdomain python3.9[207396]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:26 np0005538513.localdomain sudo[207394]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:27 np0005538513.localdomain sudo[207506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imkclacoyyjysuawhsjcokcxridmctva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322167.068028-161-39543107287890/AnsiballZ_copy.py
Nov 28 09:29:27 np0005538513.localdomain sudo[207506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:27 np0005538513.localdomain python3.9[207508]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:27 np0005538513.localdomain sudo[207506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48345 DF PROTO=TCP SPT=41214 DPT=9101 SEQ=1171061035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A7A9A0000000001030307) 
Nov 28 09:29:28 np0005538513.localdomain sudo[207616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaxukndyxitfnqbhsrbbfpbwkglxwhlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322167.871052-185-68781404564851/AnsiballZ_command.py
Nov 28 09:29:28 np0005538513.localdomain sudo[207616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:28 np0005538513.localdomain python3.9[207618]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:28 np0005538513.localdomain sudo[207616]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:28 np0005538513.localdomain sudo[207727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-embkhvkzrjrltzrcfxndduugfmrhmxqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322168.6831172-209-197937979794500/AnsiballZ_command.py
Nov 28 09:29:28 np0005538513.localdomain sudo[207727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:29 np0005538513.localdomain python3.9[207729]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:29 np0005538513.localdomain sudo[207727]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:29 np0005538513.localdomain sudo[207838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsrufxfugmcqhrdkkjbxxjjkmparmsuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322169.4014962-233-15598731621457/AnsiballZ_command.py
Nov 28 09:29:29 np0005538513.localdomain sudo[207838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:29 np0005538513.localdomain python3.9[207840]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:29 np0005538513.localdomain sudo[207838]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:30 np0005538513.localdomain sudo[207949]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwvbjxcmiuryftdbtihdvayplhplbqlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322170.2215889-260-147648652427412/AnsiballZ_stat.py
Nov 28 09:29:30 np0005538513.localdomain sudo[207949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:30 np0005538513.localdomain python3.9[207951]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:30 np0005538513.localdomain sudo[207949]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48347 DF PROTO=TCP SPT=41214 DPT=9101 SEQ=1171061035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A86C20000000001030307) 
Nov 28 09:29:31 np0005538513.localdomain sudo[208061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tueualghcyrpusefworrywmuwmfdmape ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322171.1142926-293-190123121472607/AnsiballZ_lineinfile.py
Nov 28 09:29:31 np0005538513.localdomain sudo[208061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:31 np0005538513.localdomain python3.9[208063]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:31 np0005538513.localdomain sudo[208061]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:32 np0005538513.localdomain sudo[208171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzkzphvxzxkwvqtjplshkhqivitexsiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322172.0683098-320-190330336813250/AnsiballZ_systemd_service.py
Nov 28 09:29:32 np0005538513.localdomain sudo[208171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:33 np0005538513.localdomain python3.9[208173]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:29:33 np0005538513.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 28 09:29:33 np0005538513.localdomain sudo[208171]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56791 DF PROTO=TCP SPT=35450 DPT=9102 SEQ=980278398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A8F820000000001030307) 
Nov 28 09:29:34 np0005538513.localdomain sudo[208285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrnpnvypxvxqfvnvecanevbwicvwgxvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322174.3329327-344-123450660826827/AnsiballZ_systemd_service.py
Nov 28 09:29:34 np0005538513.localdomain sudo[208285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:34 np0005538513.localdomain python3.9[208287]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:29:36 np0005538513.localdomain systemd-rc-local-generator[208310]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:29:36 np0005538513.localdomain systemd-sysv-generator[208318]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: Starting Open-iSCSI...
Nov 28 09:29:36 np0005538513.localdomain iscsid[208328]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Nov 28 09:29:36 np0005538513.localdomain iscsid[208328]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Nov 28 09:29:36 np0005538513.localdomain iscsid[208328]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Nov 28 09:29:36 np0005538513.localdomain iscsid[208328]: If using hardware iscsi like qla4xxx this message can be ignored.
Nov 28 09:29:36 np0005538513.localdomain iscsid[208328]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Nov 28 09:29:36 np0005538513.localdomain iscsid[208328]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Nov 28 09:29:36 np0005538513.localdomain iscsid[208328]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: Started Open-iSCSI.
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 28 09:29:36 np0005538513.localdomain systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 28 09:29:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54530 DF PROTO=TCP SPT=43538 DPT=9100 SEQ=1618416508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A9B020000000001030307) 
Nov 28 09:29:36 np0005538513.localdomain sudo[208285]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:37 np0005538513.localdomain sudo[208437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhcgzrbdgnzrwqydzezanrhbbbkkulxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322177.0546927-377-195294778989045/AnsiballZ_service_facts.py
Nov 28 09:29:37 np0005538513.localdomain sudo[208437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:37 np0005538513.localdomain python3.9[208439]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:29:37 np0005538513.localdomain network[208456]: You are using the 'network' service provided by 'network-scripts', which is now deprecated.
Nov 28 09:29:37 np0005538513.localdomain network[208457]: 'network-scripts' will be removed from the distribution in the near future.
Nov 28 09:29:37 np0005538513.localdomain network[208458]: It is advised to switch to 'NetworkManager' for network management.
Nov 28 09:29:38 np0005538513.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 09:29:38 np0005538513.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 09:29:38 np0005538513.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Nov 28 09:29:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56793 DF PROTO=TCP SPT=35450 DPT=9102 SEQ=980278398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AA7420000000001030307) 
Nov 28 09:29:39 np0005538513.localdomain setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 0a93104a-0288-491f-bdb6-6915108f9856
Nov 28 09:29:39 np0005538513.localdomain setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default, then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 To allow this access for now, execute:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Nov 28 09:29:40 np0005538513.localdomain sudo[208593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:29:40 np0005538513.localdomain sudo[208593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:29:40 np0005538513.localdomain sudo[208593]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:40 np0005538513.localdomain sudo[208437]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:40 np0005538513.localdomain sudo[208616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:29:40 np0005538513.localdomain sudo[208616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:29:41 np0005538513.localdomain sudo[208616]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:41 np0005538513.localdomain sudo[208683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:29:41 np0005538513.localdomain sudo[208683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:29:41 np0005538513.localdomain sudo[208683]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7423 DF PROTO=TCP SPT=46362 DPT=9882 SEQ=2306459442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AB1820000000001030307) 
Nov 28 09:29:43 np0005538513.localdomain sudo[208791]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzxdtxmzrqgzmbkhendkbgmwhtjugbjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322183.0101683-407-46325513301485/AnsiballZ_file.py
Nov 28 09:29:43 np0005538513.localdomain sudo[208791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:43 np0005538513.localdomain python3.9[208793]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:29:43 np0005538513.localdomain sudo[208791]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:44 np0005538513.localdomain sudo[208901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkpyysebsqmdghevnjezcurpsxuonovi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322183.9532-431-15277102471081/AnsiballZ_modprobe.py
Nov 28 09:29:44 np0005538513.localdomain sudo[208901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:44 np0005538513.localdomain python3.9[208903]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 09:29:44 np0005538513.localdomain sudo[208901]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:45 np0005538513.localdomain sudo[209015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvadsmgkcvpkdxlfabhzjdcpreczzvhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322184.7983687-455-247933568056599/AnsiballZ_stat.py
Nov 28 09:29:45 np0005538513.localdomain sudo[209015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:45 np0005538513.localdomain python3.9[209017]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:29:45 np0005538513.localdomain sudo[209015]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:45 np0005538513.localdomain sudo[209103]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyngykfmsrgxhvaebhtjbbtzsrluccoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322184.7983687-455-247933568056599/AnsiballZ_copy.py
Nov 28 09:29:45 np0005538513.localdomain sudo[209103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:45 np0005538513.localdomain python3.9[209105]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322184.7983687-455-247933568056599/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:45 np0005538513.localdomain sudo[209103]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:46 np0005538513.localdomain sudo[209213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgtzvpxtvxolytpyqpjulajmsiizfauz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322186.2315893-503-70128894033973/AnsiballZ_lineinfile.py
Nov 28 09:29:46 np0005538513.localdomain sudo[209213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:46 np0005538513.localdomain python3.9[209215]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:46 np0005538513.localdomain sudo[209213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11199 DF PROTO=TCP SPT=45946 DPT=9882 SEQ=968156560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AC5430000000001030307) 
Nov 28 09:29:47 np0005538513.localdomain sudo[209323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktogzeavzsnfyoaauecwepmcospzhilo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322186.937306-527-183754652363946/AnsiballZ_systemd.py
Nov 28 09:29:47 np0005538513.localdomain sudo[209323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:47 np0005538513.localdomain python3.9[209325]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:29:47 np0005538513.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 09:29:47 np0005538513.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 09:29:47 np0005538513.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 09:29:47 np0005538513.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 09:29:47 np0005538513.localdomain systemd-modules-load[209329]: Module 'msr' is built in
Nov 28 09:29:47 np0005538513.localdomain systemd[1]: Finished Load Kernel Modules.
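The modprobe, lineinfile, and systemd-modules-load steps above make the dm-multipath module load persistent across reboots. A sketch of the same configuration done by hand (written under /tmp so it runs unprivileged; the real directory is /etc/modules-load.d):

```shell
# systemd-modules-load.service reads one module name per line from
# *.conf files in /etc/modules-load.d at boot; /tmp stands in here.
d=/tmp/modules-load.d
mkdir -p "$d"
echo dm-multipath > "$d/dm-multipath.conf"
# An immediate (non-persistent) load would be: modprobe dm-multipath
cat "$d/dm-multipath.conf"
```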
Nov 28 09:29:47 np0005538513.localdomain sudo[209323]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:48 np0005538513.localdomain sudo[209437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbygbbamcohyitotnyqytbxytikwxhco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322188.626518-551-14337656932598/AnsiballZ_file.py
Nov 28 09:29:48 np0005538513.localdomain sudo[209437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:49 np0005538513.localdomain python3.9[209439]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:29:49 np0005538513.localdomain sudo[209437]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32545 DF PROTO=TCP SPT=38806 DPT=9105 SEQ=4287295748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1ACD020000000001030307) 
Nov 28 09:29:49 np0005538513.localdomain sudo[209547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwocenqdqbggqoamfgdbioguygdowomc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322189.406827-578-214547495270940/AnsiballZ_stat.py
Nov 28 09:29:49 np0005538513.localdomain sudo[209547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:49 np0005538513.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Nov 28 09:29:49 np0005538513.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 09:29:49 np0005538513.localdomain python3.9[209549]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:49 np0005538513.localdomain sudo[209547]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:29:50.783 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:29:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:29:50.784 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:29:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:29:50.785 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:29:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32546 DF PROTO=TCP SPT=38806 DPT=9105 SEQ=4287295748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AD5020000000001030307) 
Nov 28 09:29:51 np0005538513.localdomain sudo[209657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fznalffyhlsngwafniqcrohudwxfywdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322190.2190523-605-64305468068091/AnsiballZ_stat.py
Nov 28 09:29:51 np0005538513.localdomain sudo[209657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:51 np0005538513.localdomain python3.9[209659]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:51 np0005538513.localdomain sudo[209657]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:52 np0005538513.localdomain sudo[209767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlaldpdxbmnhunbkizrkbgfkxxvhtsvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322191.888861-629-65031329828859/AnsiballZ_stat.py
Nov 28 09:29:52 np0005538513.localdomain sudo[209767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:52 np0005538513.localdomain python3.9[209769]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:29:52 np0005538513.localdomain sudo[209767]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:52 np0005538513.localdomain sudo[209855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgxgpsiendhwdexpgvljhdbtqzorvxbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322191.888861-629-65031329828859/AnsiballZ_copy.py
Nov 28 09:29:52 np0005538513.localdomain sudo[209855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:52 np0005538513.localdomain python3.9[209857]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322191.888861-629-65031329828859/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:52 np0005538513.localdomain sudo[209855]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:53 np0005538513.localdomain sudo[209965]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbgrxghjumwzzudxropzaghewtwudals ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322193.1173892-674-266752839793477/AnsiballZ_command.py
Nov 28 09:29:53 np0005538513.localdomain sudo[209965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:53 np0005538513.localdomain python3.9[209967]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:53 np0005538513.localdomain sudo[209965]: pam_unix(sudo:session): session closed for user root
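The `grep -q '^blacklist\s*{'` check above guards the subsequent lineinfile edit, so a blacklist stanza is only added when one is missing. A standalone sketch of that idempotent edit (demo file under /tmp; the real target is /etc/multipath.conf, and the stanza contents here are illustrative):

```shell
# Append a blacklist stanza only when the file does not already have one.
f=/tmp/multipath.conf
printf 'defaults {\n    user_friendly_names yes\n}\n' > "$f"
if ! grep -q '^blacklist[[:space:]]*{' "$f"; then
    printf 'blacklist {\n}\n' >> "$f"
fi
grep -c '^blacklist' "$f"
```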
Nov 28 09:29:54 np0005538513.localdomain sudo[210076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czepkauoikeayredmclffargngulyehq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322193.8987112-698-102152181528190/AnsiballZ_lineinfile.py
Nov 28 09:29:54 np0005538513.localdomain sudo[210076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:29:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:29:54 np0005538513.localdomain podman[210080]: 2025-11-28 09:29:54.294684594 +0000 UTC m=+0.088431778 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:29:54 np0005538513.localdomain podman[210080]: 2025-11-28 09:29:54.329564112 +0000 UTC m=+0.123311306 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:29:54 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:29:54 np0005538513.localdomain python3.9[210078]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:54 np0005538513.localdomain podman[210079]: 2025-11-28 09:29:54.339420316 +0000 UTC m=+0.133671958 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:29:54 np0005538513.localdomain sudo[210076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:54 np0005538513.localdomain podman[210079]: 2025-11-28 09:29:54.426454035 +0000 UTC m=+0.220705647 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:29:54 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:29:55 np0005538513.localdomain sudo[210229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngikbcomgooxreqtwthudywxmvrkmcud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322194.6527305-722-38157467362298/AnsiballZ_replace.py
Nov 28 09:29:55 np0005538513.localdomain sudo[210229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32547 DF PROTO=TCP SPT=38806 DPT=9105 SEQ=4287295748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AE4C30000000001030307) 
Nov 28 09:29:55 np0005538513.localdomain python3.9[210231]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:55 np0005538513.localdomain sudo[210229]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:55 np0005538513.localdomain sudo[210339]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-towhpbunycmzoqqxbotwwodixrdgmqgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322195.7016442-746-253054688806475/AnsiballZ_replace.py
Nov 28 09:29:55 np0005538513.localdomain sudo[210339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:56 np0005538513.localdomain python3.9[210341]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:56 np0005538513.localdomain sudo[210339]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:56 np0005538513.localdomain sudo[210449]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebllyjvyrsxvkolizmsqgxqykcbisilg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322196.4519365-773-155999876876204/AnsiballZ_lineinfile.py
Nov 28 09:29:56 np0005538513.localdomain sudo[210449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:56 np0005538513.localdomain python3.9[210451]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:56 np0005538513.localdomain sudo[210449]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:57 np0005538513.localdomain sudo[210559]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akxbhuvjrobmjlthpctbtxblonwqhbfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322197.1131377-773-232126335100813/AnsiballZ_lineinfile.py
Nov 28 09:29:57 np0005538513.localdomain sudo[210559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:57 np0005538513.localdomain python3.9[210561]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:57 np0005538513.localdomain sudo[210559]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:57 np0005538513.localdomain sudo[210669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzmbiecimstlesafcxmvjwkoriajwser ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322197.724351-773-83030766307279/AnsiballZ_lineinfile.py
Nov 28 09:29:57 np0005538513.localdomain sudo[210669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61251 DF PROTO=TCP SPT=52818 DPT=9101 SEQ=222613300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AEFCA0000000001030307) 
Nov 28 09:29:58 np0005538513.localdomain python3.9[210671]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:58 np0005538513.localdomain sudo[210669]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:58 np0005538513.localdomain sudo[210779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anlarznpxjgxzpyjjoofaulbtcyfaknv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322198.3318958-773-34301560735114/AnsiballZ_lineinfile.py
Nov 28 09:29:58 np0005538513.localdomain sudo[210779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:58 np0005538513.localdomain python3.9[210781]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:58 np0005538513.localdomain sudo[210779]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:59 np0005538513.localdomain sudo[210889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnvnfqzgrjegvlwzcnhaogfzaobzxqbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322199.0940857-860-135116403155001/AnsiballZ_stat.py
Nov 28 09:29:59 np0005538513.localdomain sudo[210889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:59 np0005538513.localdomain python3.9[210891]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:59 np0005538513.localdomain sudo[210889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:00 np0005538513.localdomain sudo[211001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvyowavcdmqmnoxezwwfxgolwjhjmhmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322199.8239672-884-277354893450339/AnsiballZ_file.py
Nov 28 09:30:00 np0005538513.localdomain sudo[211001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:00 np0005538513.localdomain python3.9[211003]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:00 np0005538513.localdomain sudo[211001]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:00 np0005538513.localdomain sudo[211111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znlnffuhucmpcpwyndqjvfphbxbgjmnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322200.6599627-911-107328328221375/AnsiballZ_file.py
Nov 28 09:30:00 np0005538513.localdomain sudo[211111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:01 np0005538513.localdomain python3.9[211113]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61253 DF PROTO=TCP SPT=52818 DPT=9101 SEQ=222613300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AFBC30000000001030307) 
Nov 28 09:30:01 np0005538513.localdomain sudo[211111]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:01 np0005538513.localdomain sudo[211221]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scxfzmbvzcqjeewyyywxaezwoxlqwuij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322201.4221056-935-27870214813459/AnsiballZ_stat.py
Nov 28 09:30:01 np0005538513.localdomain sudo[211221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:01 np0005538513.localdomain python3.9[211223]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:01 np0005538513.localdomain sudo[211221]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:03 np0005538513.localdomain sudo[211278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uahkqsafqkcuieuovurobzjamgisdmfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322201.4221056-935-27870214813459/AnsiballZ_file.py
Nov 28 09:30:03 np0005538513.localdomain sudo[211278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:03 np0005538513.localdomain python3.9[211280]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:03 np0005538513.localdomain sudo[211278]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14122 DF PROTO=TCP SPT=46484 DPT=9102 SEQ=2017328411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B04C30000000001030307) 
Nov 28 09:30:03 np0005538513.localdomain sudo[211388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrayrctrtczoxyswfigewsiczxyswbnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322203.407964-935-153512850962525/AnsiballZ_stat.py
Nov 28 09:30:03 np0005538513.localdomain sudo[211388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:03 np0005538513.localdomain python3.9[211390]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:03 np0005538513.localdomain sudo[211388]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:04 np0005538513.localdomain sudo[211445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-doejvynkvbfsupdfltdcaakfipmiktzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322203.407964-935-153512850962525/AnsiballZ_file.py
Nov 28 09:30:04 np0005538513.localdomain sudo[211445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:04 np0005538513.localdomain python3.9[211447]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:04 np0005538513.localdomain sudo[211445]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:05 np0005538513.localdomain sudo[211555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivxqsdxjdiitfzdslvferrezjiwssvub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322205.569355-1004-152660268276090/AnsiballZ_file.py
Nov 28 09:30:05 np0005538513.localdomain sudo[211555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:06 np0005538513.localdomain python3.9[211557]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:06 np0005538513.localdomain sudo[211555]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32863 DF PROTO=TCP SPT=37284 DPT=9102 SEQ=3188745367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B0F820000000001030307) 
Nov 28 09:30:06 np0005538513.localdomain sudo[211665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyktrxomzernqgaiwhsljhdtcqeebzsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322206.250341-1028-55473622677846/AnsiballZ_stat.py
Nov 28 09:30:06 np0005538513.localdomain sudo[211665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:06 np0005538513.localdomain python3.9[211667]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:06 np0005538513.localdomain sudo[211665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:06 np0005538513.localdomain sudo[211722]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzqqelpxpudojxfqsomqsdvdgocygpjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322206.250341-1028-55473622677846/AnsiballZ_file.py
Nov 28 09:30:06 np0005538513.localdomain sudo[211722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:07 np0005538513.localdomain python3.9[211724]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:07 np0005538513.localdomain sudo[211722]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:07 np0005538513.localdomain sudo[211832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xobrnsvzfdlmojtvpwqsqfajbjdsduhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322207.489848-1064-139606578693201/AnsiballZ_stat.py
Nov 28 09:30:07 np0005538513.localdomain sudo[211832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:07 np0005538513.localdomain python3.9[211834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:08 np0005538513.localdomain sudo[211832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:08 np0005538513.localdomain sudo[211889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syigziddmuquybnhfzakeuepzikoykfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322207.489848-1064-139606578693201/AnsiballZ_file.py
Nov 28 09:30:08 np0005538513.localdomain sudo[211889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:08 np0005538513.localdomain python3.9[211891]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:08 np0005538513.localdomain sudo[211889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:08 np0005538513.localdomain sudo[211999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuvncyrcornffdupaarwqcgonxztcekr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322208.671699-1100-45128568604171/AnsiballZ_systemd.py
Nov 28 09:30:08 np0005538513.localdomain sudo[211999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14525 DF PROTO=TCP SPT=49530 DPT=9100 SEQ=3191443101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B1B820000000001030307) 
Nov 28 09:30:09 np0005538513.localdomain python3.9[212001]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:09 np0005538513.localdomain systemd-rc-local-generator[212024]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:09 np0005538513.localdomain systemd-sysv-generator[212029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538513.localdomain sudo[211999]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:10 np0005538513.localdomain sudo[212147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gickzbpumoxrekbklajtvztauezhqpyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322209.9202096-1124-20502070259634/AnsiballZ_stat.py
Nov 28 09:30:10 np0005538513.localdomain sudo[212147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:10 np0005538513.localdomain python3.9[212149]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:10 np0005538513.localdomain sudo[212147]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:10 np0005538513.localdomain sudo[212204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmhogwxiwmjacvgqtycsbtalqemkfvyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322209.9202096-1124-20502070259634/AnsiballZ_file.py
Nov 28 09:30:10 np0005538513.localdomain sudo[212204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:10 np0005538513.localdomain python3.9[212206]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:10 np0005538513.localdomain sudo[212204]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:11 np0005538513.localdomain sudo[212314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tajsfnpbloohhplsqvmrgokqibhggesu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322211.2172875-1160-64906120888221/AnsiballZ_stat.py
Nov 28 09:30:11 np0005538513.localdomain sudo[212314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:11 np0005538513.localdomain python3.9[212316]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:11 np0005538513.localdomain sudo[212314]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:11 np0005538513.localdomain sudo[212371]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdccqlnnrfurwfrptpvhmdrxdeeankma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322211.2172875-1160-64906120888221/AnsiballZ_file.py
Nov 28 09:30:11 np0005538513.localdomain sudo[212371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:12 np0005538513.localdomain python3.9[212373]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:12 np0005538513.localdomain sudo[212371]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40315 DF PROTO=TCP SPT=42796 DPT=9100 SEQ=2395171770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B28020000000001030307) 
Nov 28 09:30:12 np0005538513.localdomain sudo[212481]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbqgkejrvsboyqonexjeggrtowgeykgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322212.3568084-1196-137034592119408/AnsiballZ_systemd.py
Nov 28 09:30:12 np0005538513.localdomain sudo[212481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:12 np0005538513.localdomain python3.9[212483]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:30:12 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:13 np0005538513.localdomain systemd-rc-local-generator[212504]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:13 np0005538513.localdomain systemd-sysv-generator[212509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:30:13 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:30:13 np0005538513.localdomain sudo[212481]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:14 np0005538513.localdomain sudo[212632]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhjratiupxhiozcmcklzyjpupstgyynz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322213.8081598-1226-32352821557791/AnsiballZ_file.py
Nov 28 09:30:14 np0005538513.localdomain sudo[212632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:14 np0005538513.localdomain python3.9[212634]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:14 np0005538513.localdomain sudo[212632]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:14 np0005538513.localdomain sudo[212742]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrryokpdfqcnbjemjorpyxxbovqhtilg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322214.5086439-1250-82951937879058/AnsiballZ_stat.py
Nov 28 09:30:14 np0005538513.localdomain sudo[212742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:14 np0005538513.localdomain python3.9[212744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:14 np0005538513.localdomain sudo[212742]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:15 np0005538513.localdomain sudo[212830]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnnlioziqhsozhvldmhmqnjeqtpwfajx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322214.5086439-1250-82951937879058/AnsiballZ_copy.py
Nov 28 09:30:15 np0005538513.localdomain sudo[212830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:15 np0005538513.localdomain python3.9[212832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322214.5086439-1250-82951937879058/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:15 np0005538513.localdomain sudo[212830]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:16 np0005538513.localdomain sudo[212940]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkppzodvbwsfmbaqkivksqhemeagxuvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322216.44722-1301-66132904548676/AnsiballZ_file.py
Nov 28 09:30:16 np0005538513.localdomain sudo[212940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:16 np0005538513.localdomain python3.9[212942]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:16 np0005538513.localdomain sudo[212940]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59288 DF PROTO=TCP SPT=48552 DPT=9882 SEQ=2146318426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B3A830000000001030307) 
Nov 28 09:30:17 np0005538513.localdomain sudo[213050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybvvuaszkfxmvybudcgtkwuxlnhzvibu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322217.1988795-1325-97036648301814/AnsiballZ_stat.py
Nov 28 09:30:17 np0005538513.localdomain sudo[213050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:17 np0005538513.localdomain python3.9[213052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:17 np0005538513.localdomain sudo[213050]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:18 np0005538513.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 28 09:30:18 np0005538513.localdomain sudo[213139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttkwixjwlxcgwnlchoctpxakhaygrmlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322217.1988795-1325-97036648301814/AnsiballZ_copy.py
Nov 28 09:30:18 np0005538513.localdomain sudo[213139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:19 np0005538513.localdomain python3.9[213141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322217.1988795-1325-97036648301814/.source.json _original_basename=._qlnhfg7 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:19 np0005538513.localdomain sudo[213139]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60076 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B42020000000001030307) 
Nov 28 09:30:19 np0005538513.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 09:30:19 np0005538513.localdomain sudo[213250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygswqicosfgphcpdjlichbzwhgcklvbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322219.2616317-1370-32568909202457/AnsiballZ_file.py
Nov 28 09:30:19 np0005538513.localdomain sudo[213250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:19 np0005538513.localdomain python3.9[213252]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:19 np0005538513.localdomain sudo[213250]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:20 np0005538513.localdomain sudo[213360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngnuvdixvqagqwttwooaeswmkmzgqjiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322220.0115564-1394-127483327524026/AnsiballZ_stat.py
Nov 28 09:30:20 np0005538513.localdomain sudo[213360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:20 np0005538513.localdomain sudo[213360]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:20 np0005538513.localdomain sudo[213448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pimpdnhihkccwbybcciuowqnxtiiisdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322220.0115564-1394-127483327524026/AnsiballZ_copy.py
Nov 28 09:30:20 np0005538513.localdomain sudo[213448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:21 np0005538513.localdomain sudo[213448]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60077 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B4A020000000001030307) 
Nov 28 09:30:22 np0005538513.localdomain sudo[213558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxhcxhpfnuhchnjmyghwtennbrassiyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322221.5032232-1445-237439471497637/AnsiballZ_container_config_data.py
Nov 28 09:30:22 np0005538513.localdomain sudo[213558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:22 np0005538513.localdomain python3.9[213560]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 28 09:30:22 np0005538513.localdomain sudo[213558]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:23 np0005538513.localdomain sudo[213668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osjvviggonueysexhzodyqglkotudxud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322222.6273856-1472-19057907387361/AnsiballZ_container_config_hash.py
Nov 28 09:30:23 np0005538513.localdomain sudo[213668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:23 np0005538513.localdomain python3.9[213670]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:30:23 np0005538513.localdomain sudo[213668]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:24 np0005538513.localdomain sudo[213778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbhpspnixvlzulzzfjvawgyneptijqgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322223.6281002-1499-241861666653998/AnsiballZ_podman_container_info.py
Nov 28 09:30:24 np0005538513.localdomain sudo[213778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:24 np0005538513.localdomain python3.9[213780]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:30:24 np0005538513.localdomain sudo[213778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:30:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:30:24 np0005538513.localdomain podman[213809]: 2025-11-28 09:30:24.846507941 +0000 UTC m=+0.083662652 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:30:24 np0005538513.localdomain systemd[1]: tmp-crun.dgdKyr.mount: Deactivated successfully.
Nov 28 09:30:24 np0005538513.localdomain podman[213809]: 2025-11-28 09:30:24.909594024 +0000 UTC m=+0.146748735 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:30:24 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:30:24 np0005538513.localdomain podman[213812]: 2025-11-28 09:30:24.91003562 +0000 UTC m=+0.146592240 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:30:24 np0005538513.localdomain podman[213812]: 2025-11-28 09:30:24.991826535 +0000 UTC m=+0.228383125 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:30:25 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:30:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60078 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B59C20000000001030307) 
Nov 28 09:30:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38726 DF PROTO=TCP SPT=58704 DPT=9101 SEQ=4233587836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B64F90000000001030307) 
Nov 28 09:30:28 np0005538513.localdomain sudo[213958]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktikifqseuenequcfhwfovhrzvebognt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322228.063885-1538-270744047095731/AnsiballZ_edpm_container_manage.py
Nov 28 09:30:28 np0005538513.localdomain sudo[213958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:28 np0005538513.localdomain python3[213960]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:30:30 np0005538513.localdomain podman[213975]: 2025-11-28 09:30:28.903235733 +0000 UTC m=+0.045067445 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:30:31 np0005538513.localdomain podman[214022]: 
Nov 28 09:30:31 np0005538513.localdomain podman[214022]: 2025-11-28 09:30:31.030770503 +0000 UTC m=+0.062994942 container create 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:30:31 np0005538513.localdomain podman[214022]: 2025-11-28 09:30:31.002607219 +0000 UTC m=+0.034831708 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:30:31 np0005538513.localdomain python3[213960]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:30:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38728 DF PROTO=TCP SPT=58704 DPT=9101 SEQ=4233587836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B71020000000001030307) 
Nov 28 09:30:31 np0005538513.localdomain sudo[213958]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:31 np0005538513.localdomain sudo[214168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prngmpyltqoaoccohegktelcgrvnqblo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322231.5050657-1562-80576184349134/AnsiballZ_stat.py
Nov 28 09:30:31 np0005538513.localdomain sudo[214168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:31 np0005538513.localdomain python3.9[214170]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:30:32 np0005538513.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 09:30:33 np0005538513.localdomain sudo[214168]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60079 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B79820000000001030307) 
Nov 28 09:30:33 np0005538513.localdomain sudo[214281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpwkdtsoekcqulaushmsbggatskfcyrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322233.32783-1589-157339491086871/AnsiballZ_file.py
Nov 28 09:30:33 np0005538513.localdomain sudo[214281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:33 np0005538513.localdomain python3.9[214283]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:33 np0005538513.localdomain sudo[214281]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:34 np0005538513.localdomain sudo[214336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqkirdhfdiyevazljmuegwwtwkvimejt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322233.32783-1589-157339491086871/AnsiballZ_stat.py
Nov 28 09:30:35 np0005538513.localdomain sudo[214336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:35 np0005538513.localdomain python3.9[214338]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:30:35 np0005538513.localdomain sudo[214336]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:35 np0005538513.localdomain sudo[214445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oayxokjbpuyqtyiqypfhaogjkkgskdan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322235.4866197-1589-1687990589434/AnsiballZ_copy.py
Nov 28 09:30:35 np0005538513.localdomain sudo[214445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:36 np0005538513.localdomain python3.9[214447]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322235.4866197-1589-1687990589434/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:36 np0005538513.localdomain sudo[214445]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:36 np0005538513.localdomain sudo[214500]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzkhzeulxcryeonpvmwvkkpibwyhudke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322235.4866197-1589-1687990589434/AnsiballZ_systemd.py
Nov 28 09:30:36 np0005538513.localdomain sudo[214500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15130 DF PROTO=TCP SPT=57402 DPT=9100 SEQ=617379117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B85820000000001030307) 
Nov 28 09:30:36 np0005538513.localdomain python3.9[214502]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:36 np0005538513.localdomain systemd-sysv-generator[214528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:36 np0005538513.localdomain systemd-rc-local-generator[214524]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538513.localdomain sudo[214500]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:37 np0005538513.localdomain sudo[214590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcxtkifwbswrwnobjlqjbkfkucwbpphc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322235.4866197-1589-1687990589434/AnsiballZ_systemd.py
Nov 28 09:30:37 np0005538513.localdomain sudo[214590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:37 np0005538513.localdomain python3.9[214592]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:37 np0005538513.localdomain systemd-rc-local-generator[214619]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:37 np0005538513.localdomain systemd-sysv-generator[214622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538513.localdomain systemd[1]: Starting multipathd container...
Nov 28 09:30:38 np0005538513.localdomain systemd[1]: tmp-crun.5NbCSk.mount: Deactivated successfully.
Nov 28 09:30:38 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:30:38 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:38 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:30:38 np0005538513.localdomain podman[214633]: 2025-11-28 09:30:38.10729225 +0000 UTC m=+0.159318344 container init 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + sudo -E kolla_set_configs
Nov 28 09:30:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:30:38 np0005538513.localdomain podman[214633]: 2025-11-28 09:30:38.146183538 +0000 UTC m=+0.198209622 container start 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:30:38 np0005538513.localdomain podman[214633]: multipathd
Nov 28 09:30:38 np0005538513.localdomain systemd[1]: Started multipathd container.
Nov 28 09:30:38 np0005538513.localdomain sudo[214652]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:30:38 np0005538513.localdomain sudo[214652]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:38 np0005538513.localdomain sudo[214652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:38 np0005538513.localdomain sudo[214590]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:38 np0005538513.localdomain podman[214653]: 2025-11-28 09:30:38.21439741 +0000 UTC m=+0.063170876 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: INFO:__main__:Validating config file
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: INFO:__main__:Writing out command to execute
Nov 28 09:30:38 np0005538513.localdomain sudo[214652]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: ++ cat /run_command
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + CMD='/usr/sbin/multipathd -d'
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + ARGS=
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + sudo kolla_copy_cacerts
Nov 28 09:30:38 np0005538513.localdomain podman[214653]: 2025-11-28 09:30:38.240432259 +0000 UTC m=+0.089205795 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 09:30:38 np0005538513.localdomain podman[214653]: unhealthy
Nov 28 09:30:38 np0005538513.localdomain sudo[214678]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:30:38 np0005538513.localdomain sudo[214678]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:38 np0005538513.localdomain sudo[214678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:38 np0005538513.localdomain sudo[214678]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + [[ ! -n '' ]]
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + . kolla_extend_start
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: Running command: '/usr/sbin/multipathd -d'
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + umask 0022
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: + exec /usr/sbin/multipathd -d
Nov 28 09:30:38 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:30:38 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Failed with result 'exit-code'.
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: 10072.434832 | --------start up--------
Nov 28 09:30:38 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 28 09:30:38 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:30:38 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: 10072.434902 | read /etc/multipath.conf
Nov 28 09:30:38 np0005538513.localdomain multipathd[214646]: 10072.438780 | path checkers start up
Nov 28 09:30:38 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:30:38 np0005538513.localdomain python3.9[214795]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:30:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54535 DF PROTO=TCP SPT=43538 DPT=9100 SEQ=1618416508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B91820000000001030307) 
Nov 28 09:30:39 np0005538513.localdomain sudo[214905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhbxtuhjfixxkjzvbzahvmnkrstkpdya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322239.5517163-1697-204812051408458/AnsiballZ_command.py
Nov 28 09:30:39 np0005538513.localdomain sudo[214905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:40 np0005538513.localdomain python3.9[214907]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:30:40 np0005538513.localdomain sudo[214905]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:40 np0005538513.localdomain sudo[215027]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyxoaibctsglfumlzycjjajjftnqulvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322240.3698738-1721-260463941829350/AnsiballZ_systemd.py
Nov 28 09:30:40 np0005538513.localdomain sudo[215027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:40 np0005538513.localdomain python3.9[215029]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:30:41 np0005538513.localdomain systemd[1]: Stopping multipathd container...
Nov 28 09:30:42 np0005538513.localdomain multipathd[214646]: 10076.243414 | exit (signal)
Nov 28 09:30:42 np0005538513.localdomain multipathd[214646]: 10076.243814 | --------shut down-------
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: tmp-crun.OAqVUv.mount: Deactivated successfully.
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: libpod-7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.scope: Deactivated successfully.
Nov 28 09:30:42 np0005538513.localdomain podman[215033]: 2025-11-28 09:30:42.102911839 +0000 UTC m=+0.103043870 container died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true)
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.timer: Deactivated successfully.
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:30:42 np0005538513.localdomain sudo[215045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:30:42 np0005538513.localdomain sudo[215045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:30:42 np0005538513.localdomain sudo[215045]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:42 np0005538513.localdomain sudo[215079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:30:42 np0005538513.localdomain sudo[215079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:30:42 np0005538513.localdomain podman[215033]: 2025-11-28 09:30:42.278739338 +0000 UTC m=+0.278871379 container cleanup 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:30:42 np0005538513.localdomain podman[215033]: multipathd
Nov 28 09:30:42 np0005538513.localdomain podman[215097]: 2025-11-28 09:30:42.348232695 +0000 UTC m=+0.044890078 container cleanup 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:30:42 np0005538513.localdomain podman[215097]: multipathd
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: Stopped multipathd container.
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: Starting multipathd container...
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:30:42 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:42 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15132 DF PROTO=TCP SPT=57402 DPT=9100 SEQ=617379117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B9D420000000001030307) 
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:30:42 np0005538513.localdomain podman[215109]: 2025-11-28 09:30:42.484801293 +0000 UTC m=+0.109427212 container init 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + sudo -E kolla_set_configs
Nov 28 09:30:42 np0005538513.localdomain sudo[215134]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:30:42 np0005538513.localdomain sudo[215134]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:30:42 np0005538513.localdomain sudo[215134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:42 np0005538513.localdomain podman[215109]: 2025-11-28 09:30:42.524621344 +0000 UTC m=+0.149247263 container start 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:30:42 np0005538513.localdomain podman[215109]: multipathd
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: Started multipathd container.
Nov 28 09:30:42 np0005538513.localdomain sudo[215027]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: INFO:__main__:Validating config file
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: INFO:__main__:Writing out command to execute
Nov 28 09:30:42 np0005538513.localdomain sudo[215134]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: ++ cat /run_command
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + CMD='/usr/sbin/multipathd -d'
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + ARGS=
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + sudo kolla_copy_cacerts
Nov 28 09:30:42 np0005538513.localdomain podman[215139]: 2025-11-28 09:30:42.587549381 +0000 UTC m=+0.061958234 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 09:30:42 np0005538513.localdomain sudo[215161]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:30:42 np0005538513.localdomain sudo[215161]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:42 np0005538513.localdomain sudo[215161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:42 np0005538513.localdomain podman[215139]: 2025-11-28 09:30:42.595308002 +0000 UTC m=+0.069716855 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:30:42 np0005538513.localdomain sudo[215161]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + [[ ! -n '' ]]
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + . kolla_extend_start
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: Running command: '/usr/sbin/multipathd -d'
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + umask 0022
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: + exec /usr/sbin/multipathd -d
Nov 28 09:30:42 np0005538513.localdomain podman[215139]: unhealthy
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: 10076.779974 | --------start up--------
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: 10076.779994 | read /etc/multipath.conf
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:30:42 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Failed with result 'exit-code'.
Nov 28 09:30:42 np0005538513.localdomain multipathd[215121]: 10076.783212 | path checkers start up
Nov 28 09:30:42 np0005538513.localdomain sudo[215079]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:43 np0005538513.localdomain sudo[215300]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdbqgaoqlglhypzcihdztwhxoyfvytnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322243.686404-1745-25852260098952/AnsiballZ_file.py
Nov 28 09:30:43 np0005538513.localdomain sudo[215300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:44 np0005538513.localdomain python3.9[215302]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:44 np0005538513.localdomain sudo[215300]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:45 np0005538513.localdomain sudo[215410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnrtidokgptdlswxbzrctqjgwncearvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322244.7494316-1781-111296783967809/AnsiballZ_file.py
Nov 28 09:30:45 np0005538513.localdomain sudo[215410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:45 np0005538513.localdomain python3.9[215412]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:30:45 np0005538513.localdomain sudo[215410]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:45 np0005538513.localdomain sudo[215430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:30:45 np0005538513.localdomain sudo[215430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:30:45 np0005538513.localdomain sudo[215430]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:45 np0005538513.localdomain sudo[215538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtvuquljcanlzujvrrtuliedjkaxpzbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322245.6827495-1805-14006722050554/AnsiballZ_modprobe.py
Nov 28 09:30:45 np0005538513.localdomain sudo[215538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:46 np0005538513.localdomain python3.9[215540]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 28 09:30:46 np0005538513.localdomain sudo[215538]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:46 np0005538513.localdomain sudo[215656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pguhwkirxfyzpnfdzcjzuwfukqwncrax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322246.4281995-1829-121453552736592/AnsiballZ_stat.py
Nov 28 09:30:46 np0005538513.localdomain sudo[215656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:46 np0005538513.localdomain python3.9[215658]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:46 np0005538513.localdomain sudo[215656]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34905 DF PROTO=TCP SPT=48764 DPT=9882 SEQ=1475787364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BAFC30000000001030307) 
Nov 28 09:30:47 np0005538513.localdomain sudo[215744]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlulicqqmxzqivavyufyelsxdbkbrqye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322246.4281995-1829-121453552736592/AnsiballZ_copy.py
Nov 28 09:30:47 np0005538513.localdomain sudo[215744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:47 np0005538513.localdomain python3.9[215746]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322246.4281995-1829-121453552736592/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:47 np0005538513.localdomain sudo[215744]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:47 np0005538513.localdomain sudo[215854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omlsbvgmlgeecyjxrrfijpaskpocyxip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322247.6864266-1877-230453620780481/AnsiballZ_lineinfile.py
Nov 28 09:30:47 np0005538513.localdomain sudo[215854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:48 np0005538513.localdomain python3.9[215856]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:48 np0005538513.localdomain sudo[215854]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:48 np0005538513.localdomain sudo[215964]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-busguyvmriukrmglsxfucckyqjsxxffy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322248.3641338-1901-170761614619795/AnsiballZ_systemd.py
Nov 28 09:30:48 np0005538513.localdomain sudo[215964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:48 np0005538513.localdomain python3.9[215966]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:30:48 np0005538513.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 09:30:48 np0005538513.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 09:30:48 np0005538513.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 09:30:48 np0005538513.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 09:30:48 np0005538513.localdomain systemd-modules-load[215970]: Module 'msr' is built in
Nov 28 09:30:48 np0005538513.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 28 09:30:49 np0005538513.localdomain sudo[215964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1827 DF PROTO=TCP SPT=44680 DPT=9105 SEQ=2909593796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BB7420000000001030307) 
Nov 28 09:30:50 np0005538513.localdomain sudo[216078]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pawpezpeytsulycsmqqjvckquzudtfse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322249.3585393-1925-83506657046546/AnsiballZ_dnf.py
Nov 28 09:30:50 np0005538513.localdomain sudo[216078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:50 np0005538513.localdomain python3.9[216080]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:30:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:30:50.784 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:30:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:30:50.785 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:30:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:30:50.787 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:30:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1828 DF PROTO=TCP SPT=44680 DPT=9105 SEQ=2909593796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BBF420000000001030307) 
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:54 np0005538513.localdomain systemd-sysv-generator[216115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:54 np0005538513.localdomain systemd-rc-local-generator[216112]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:54 np0005538513.localdomain systemd-rc-local-generator[216149]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:54 np0005538513.localdomain systemd-sysv-generator[216152]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538513.localdomain systemd-logind[764]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 09:30:54 np0005538513.localdomain systemd-logind[764]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 09:30:54 np0005538513.localdomain lvm[216207]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 09:30:54 np0005538513.localdomain lvm[216207]: VG ceph_vg0 finished
Nov 28 09:30:54 np0005538513.localdomain lvm[216205]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 09:30:54 np0005538513.localdomain lvm[216205]: VG ceph_vg1 finished
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 09:30:54 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:55 np0005538513.localdomain podman[216223]: 2025-11-28 09:30:55.036196142 +0000 UTC m=+0.087990147 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:30:55 np0005538513.localdomain systemd-rc-local-generator[216268]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:55 np0005538513.localdomain systemd-sysv-generator[216271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:55 np0005538513.localdomain podman[216223]: 2025-11-28 09:30:55.113296888 +0000 UTC m=+0.165090893 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1829 DF PROTO=TCP SPT=44680 DPT=9105 SEQ=2909593796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BCF020000000001030307) 
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: tmp-crun.0XK5ma.mount: Deactivated successfully.
Nov 28 09:30:55 np0005538513.localdomain podman[216608]: 2025-11-28 09:30:55.435101392 +0000 UTC m=+0.093733698 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 09:30:55 np0005538513.localdomain podman[216608]: 2025-11-28 09:30:55.46936172 +0000 UTC m=+0.127994026 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:30:55 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:30:56 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 09:30:56 np0005538513.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 09:30:56 np0005538513.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.255s CPU time.
Nov 28 09:30:56 np0005538513.localdomain systemd[1]: run-rbba2a76fc91d4edead9817eb37db3b7c.service: Deactivated successfully.
Nov 28 09:30:56 np0005538513.localdomain sudo[216078]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:57 np0005538513.localdomain python3.9[217538]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:30:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32477 DF PROTO=TCP SPT=49288 DPT=9101 SEQ=2341876394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BDA290000000001030307) 
Nov 28 09:30:58 np0005538513.localdomain sudo[217650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbfummfbgqrvhzhsjhrxuppeokaapgmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322257.9958832-1977-178172442623787/AnsiballZ_file.py
Nov 28 09:30:58 np0005538513.localdomain sudo[217650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:58 np0005538513.localdomain python3.9[217652]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:58 np0005538513.localdomain sudo[217650]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:59 np0005538513.localdomain sudo[217760]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjmkieqnjuwynezzufwlhdxgptpbnich ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322259.1502745-2010-99333341361441/AnsiballZ_systemd_service.py
Nov 28 09:30:59 np0005538513.localdomain sudo[217760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:59 np0005538513.localdomain python3.9[217762]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:30:59 np0005538513.localdomain systemd-sysv-generator[217790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:59 np0005538513.localdomain systemd-rc-local-generator[217784]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:00 np0005538513.localdomain sudo[217760]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:00 np0005538513.localdomain python3.9[217906]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:31:00 np0005538513.localdomain network[217923]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:31:00 np0005538513.localdomain network[217924]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:31:00 np0005538513.localdomain network[217925]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:31:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32479 DF PROTO=TCP SPT=49288 DPT=9101 SEQ=2341876394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BE6420000000001030307) 
Nov 28 09:31:01 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:31:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57399 DF PROTO=TCP SPT=41920 DPT=9102 SEQ=4180756437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BEF030000000001030307) 
Nov 28 09:31:05 np0005538513.localdomain sudo[218158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sddcfkjphpedrzitpxttjzlljbcegtny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322265.1218958-2067-193663765947133/AnsiballZ_systemd_service.py
Nov 28 09:31:05 np0005538513.localdomain sudo[218158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:05 np0005538513.localdomain python3.9[218160]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:05 np0005538513.localdomain sudo[218158]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:06 np0005538513.localdomain sudo[218269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbvbvefhgjbheauhruqgzbkfmplepifa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322265.8870084-2067-46906312967510/AnsiballZ_systemd_service.py
Nov 28 09:31:06 np0005538513.localdomain sudo[218269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49676 DF PROTO=TCP SPT=38158 DPT=9100 SEQ=3893746115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BFAC20000000001030307) 
Nov 28 09:31:06 np0005538513.localdomain python3.9[218271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:06 np0005538513.localdomain sudo[218269]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:06 np0005538513.localdomain sudo[218380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqlgyenjwfhvwusyoiwscabwmsmxralx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322266.6549325-2067-10171116898704/AnsiballZ_systemd_service.py
Nov 28 09:31:06 np0005538513.localdomain sudo[218380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:07 np0005538513.localdomain python3.9[218382]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:07 np0005538513.localdomain sudo[218380]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:07 np0005538513.localdomain sudo[218491]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoykekaxjnyynveusyzlyzcrajzaimag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322267.4221704-2067-146192335094908/AnsiballZ_systemd_service.py
Nov 28 09:31:07 np0005538513.localdomain sudo[218491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:07 np0005538513.localdomain python3.9[218493]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:08 np0005538513.localdomain sudo[218491]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:09 np0005538513.localdomain sudo[218602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjuzssdqauwbbvanuszhhwkoiztwhdyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322268.193434-2067-141980453154522/AnsiballZ_systemd_service.py
Nov 28 09:31:09 np0005538513.localdomain sudo[218602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40318 DF PROTO=TCP SPT=42796 DPT=9100 SEQ=2395171770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C05830000000001030307) 
Nov 28 09:31:09 np0005538513.localdomain python3.9[218604]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:09 np0005538513.localdomain sudo[218602]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:09 np0005538513.localdomain sudo[218713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypqtccgggvruellyicqqzmskibocabby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322269.4915888-2067-134541203675521/AnsiballZ_systemd_service.py
Nov 28 09:31:09 np0005538513.localdomain sudo[218713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:10 np0005538513.localdomain python3.9[218715]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:10 np0005538513.localdomain sudo[218713]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:10 np0005538513.localdomain sudo[218824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrgalbfocbkgnyelskoctygemmkefiqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322270.273729-2067-4741054207905/AnsiballZ_systemd_service.py
Nov 28 09:31:10 np0005538513.localdomain sudo[218824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:10 np0005538513.localdomain python3.9[218826]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:10 np0005538513.localdomain sudo[218824]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:11 np0005538513.localdomain sudo[218935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujtopilqhdhyarmzvjnsopxsrqdiyibo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322271.2478416-2067-255321003079416/AnsiballZ_systemd_service.py
Nov 28 09:31:11 np0005538513.localdomain sudo[218935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:11 np0005538513.localdomain python3.9[218937]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:11 np0005538513.localdomain sudo[218935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49678 DF PROTO=TCP SPT=38158 DPT=9100 SEQ=3893746115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C12820000000001030307) 
Nov 28 09:31:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:31:12 np0005538513.localdomain podman[218956]: 2025-11-28 09:31:12.859173046 +0000 UTC m=+0.092218969 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:31:12 np0005538513.localdomain podman[218956]: 2025-11-28 09:31:12.875426782 +0000 UTC m=+0.108472685 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 28 09:31:12 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:31:14 np0005538513.localdomain sudo[219066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwdlpcimotoycplqqrujeiifchievvvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322273.8318336-2244-101986254246001/AnsiballZ_file.py
Nov 28 09:31:14 np0005538513.localdomain sudo[219066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:14 np0005538513.localdomain python3.9[219068]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:14 np0005538513.localdomain sudo[219066]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:14 np0005538513.localdomain sudo[219176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-locgavqilvmnxnosrzrqafhvgqlschza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322274.4462178-2244-34531483918300/AnsiballZ_file.py
Nov 28 09:31:14 np0005538513.localdomain sudo[219176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:14 np0005538513.localdomain python3.9[219178]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:14 np0005538513.localdomain sudo[219176]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:15 np0005538513.localdomain sudo[219286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-robjxbkbdncrirtdhnruvlwxvuxxskjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322275.235609-2244-87441262941204/AnsiballZ_file.py
Nov 28 09:31:15 np0005538513.localdomain sudo[219286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:15 np0005538513.localdomain python3.9[219288]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:15 np0005538513.localdomain sudo[219286]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:16 np0005538513.localdomain sudo[219396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxkrbhrlhawjazeqtmxkxcsrztiuspoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322275.8408403-2244-135972937057754/AnsiballZ_file.py
Nov 28 09:31:16 np0005538513.localdomain sudo[219396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:16 np0005538513.localdomain python3.9[219398]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:16 np0005538513.localdomain sudo[219396]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:16 np0005538513.localdomain sudo[219506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uipmfccrmcilpieqkudarrkrslkwtkvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322276.4300396-2244-99875294054100/AnsiballZ_file.py
Nov 28 09:31:16 np0005538513.localdomain sudo[219506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:16 np0005538513.localdomain python3.9[219508]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:16 np0005538513.localdomain sudo[219506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:17 np0005538513.localdomain sudo[219616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibshxyfxtlwurudhrgsxhfflodtanlcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322276.9672248-2244-137256733462419/AnsiballZ_file.py
Nov 28 09:31:17 np0005538513.localdomain sudo[219616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10347 DF PROTO=TCP SPT=43626 DPT=9882 SEQ=2467435932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C25020000000001030307) 
Nov 28 09:31:17 np0005538513.localdomain python3.9[219618]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:17 np0005538513.localdomain sudo[219616]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:17 np0005538513.localdomain sudo[219726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anbefzvkvgxbwkcnyxqufghtplneggyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322277.536503-2244-129265816759803/AnsiballZ_file.py
Nov 28 09:31:17 np0005538513.localdomain sudo[219726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:18 np0005538513.localdomain python3.9[219728]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:18 np0005538513.localdomain sudo[219726]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:18 np0005538513.localdomain sudo[219836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djjvzebpwahcxqzxsnzlkwgkozdekenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322278.1391788-2244-168328007630351/AnsiballZ_file.py
Nov 28 09:31:18 np0005538513.localdomain sudo[219836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:18 np0005538513.localdomain python3.9[219838]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:18 np0005538513.localdomain sudo[219836]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9103 DF PROTO=TCP SPT=33126 DPT=9105 SEQ=4190552098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C2C820000000001030307) 
Nov 28 09:31:19 np0005538513.localdomain sudo[219946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjceohmjfcuxtlctrevlkmuvnmfxbfnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322279.0247786-2415-234507787366570/AnsiballZ_file.py
Nov 28 09:31:19 np0005538513.localdomain sudo[219946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:19 np0005538513.localdomain python3.9[219948]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:19 np0005538513.localdomain sudo[219946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:19 np0005538513.localdomain sudo[220056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkszdaltyqupkjyskswmhxaluefujlhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322279.6089463-2415-17821081981666/AnsiballZ_file.py
Nov 28 09:31:19 np0005538513.localdomain sudo[220056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:20 np0005538513.localdomain python3.9[220058]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:20 np0005538513.localdomain sudo[220056]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:20 np0005538513.localdomain sudo[220166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jruvekcdjiwzptdwuninwkeqssrgagqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322280.2241766-2415-101069201801172/AnsiballZ_file.py
Nov 28 09:31:20 np0005538513.localdomain sudo[220166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:20 np0005538513.localdomain python3.9[220168]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:20 np0005538513.localdomain sudo[220166]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9104 DF PROTO=TCP SPT=33126 DPT=9105 SEQ=4190552098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C34820000000001030307) 
Nov 28 09:31:21 np0005538513.localdomain sudo[220276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzxbuhcaenafmscetiklqeamomlyidnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322281.5163343-2415-216904094286119/AnsiballZ_file.py
Nov 28 09:31:21 np0005538513.localdomain sudo[220276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:21 np0005538513.localdomain python3.9[220278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:21 np0005538513.localdomain sudo[220276]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:22 np0005538513.localdomain sudo[220386]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klqunodanlpadysxzqaxijmbkxbaushw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322282.112487-2415-162306280353976/AnsiballZ_file.py
Nov 28 09:31:22 np0005538513.localdomain sudo[220386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:22 np0005538513.localdomain python3.9[220388]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:22 np0005538513.localdomain sudo[220386]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:22 np0005538513.localdomain sudo[220496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-minywqbpjhbjeyptpfykpfcuxojwebeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322282.7023833-2415-123477497080385/AnsiballZ_file.py
Nov 28 09:31:22 np0005538513.localdomain sudo[220496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:23 np0005538513.localdomain python3.9[220498]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:23 np0005538513.localdomain sudo[220496]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:24 np0005538513.localdomain sudo[220606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkqhwdzoxcirwjethnbexhxawlijdnbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322283.7326674-2415-125523050316640/AnsiballZ_file.py
Nov 28 09:31:24 np0005538513.localdomain sudo[220606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:24 np0005538513.localdomain python3.9[220608]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:24 np0005538513.localdomain sudo[220606]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:24 np0005538513.localdomain sudo[220716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncrxzbobxtnxgdusqqrlhtzxeoavalko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322284.422747-2415-118193870250376/AnsiballZ_file.py
Nov 28 09:31:24 np0005538513.localdomain sudo[220716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:24 np0005538513.localdomain python3.9[220718]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:24 np0005538513.localdomain sudo[220716]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9105 DF PROTO=TCP SPT=33126 DPT=9105 SEQ=4190552098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C44420000000001030307) 
Nov 28 09:31:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:31:25 np0005538513.localdomain sudo[220826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gglbcepabiwckrimmdjvaxvwkzqxtrpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322285.4909284-2589-245874071327944/AnsiballZ_command.py
Nov 28 09:31:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:31:25 np0005538513.localdomain sudo[220826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:25 np0005538513.localdomain podman[220828]: 2025-11-28 09:31:25.83062986 +0000 UTC m=+0.072655817 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:31:25 np0005538513.localdomain podman[220828]: 2025-11-28 09:31:25.864401939 +0000 UTC m=+0.106427896 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:31:25 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:31:25 np0005538513.localdomain podman[220829]: 2025-11-28 09:31:25.950008084 +0000 UTC m=+0.184571724 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:31:25 np0005538513.localdomain python3.9[220835]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:25 np0005538513.localdomain podman[220829]: 2025-11-28 09:31:25.97822858 +0000 UTC m=+0.212792290 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:31:25 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:31:26 np0005538513.localdomain sudo[220826]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:26 np0005538513.localdomain python3.9[220980]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:31:27 np0005538513.localdomain sudo[221088]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nohkkcnwsztxwbwhivogfmziowufjfdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322287.2583597-2643-176010452440965/AnsiballZ_systemd_service.py
Nov 28 09:31:27 np0005538513.localdomain sudo[221088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:27 np0005538513.localdomain python3.9[221090]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:31:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:31:27 np0005538513.localdomain systemd-rc-local-generator[221116]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:31:27 np0005538513.localdomain systemd-sysv-generator[221119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:31:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51913 DF PROTO=TCP SPT=52198 DPT=9101 SEQ=2425600492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C4F5A0000000001030307) 
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538513.localdomain sudo[221088]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:28 np0005538513.localdomain sudo[221234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwdfkkmmsouqeqteqavuvtznxnduuykg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322288.431572-2667-132527560058606/AnsiballZ_command.py
Nov 28 09:31:28 np0005538513.localdomain sudo[221234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:28 np0005538513.localdomain python3.9[221236]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:28 np0005538513.localdomain sudo[221234]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:29 np0005538513.localdomain sudo[221345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqionzqyezzfkswvsfthaeygsznpposa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322289.0288908-2667-112799158903611/AnsiballZ_command.py
Nov 28 09:31:29 np0005538513.localdomain sudo[221345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:29 np0005538513.localdomain python3.9[221347]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:29 np0005538513.localdomain sudo[221345]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:29 np0005538513.localdomain sudo[221456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axmmsjpyqrbpsnkdksslzdeatyllsbhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322289.6280792-2667-105823888390196/AnsiballZ_command.py
Nov 28 09:31:29 np0005538513.localdomain sudo[221456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:30 np0005538513.localdomain python3.9[221458]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:30 np0005538513.localdomain sudo[221456]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:30 np0005538513.localdomain sudo[221567]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unixwfqaynmmoteskxnzjxhospkguqdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322290.2285852-2667-87476675013916/AnsiballZ_command.py
Nov 28 09:31:30 np0005538513.localdomain sudo[221567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:30 np0005538513.localdomain python3.9[221569]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:30 np0005538513.localdomain sudo[221567]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51915 DF PROTO=TCP SPT=52198 DPT=9101 SEQ=2425600492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C5B420000000001030307) 
Nov 28 09:31:31 np0005538513.localdomain sudo[221678]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrswmhmyvbofgzwdiumxtmegwsaffkvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322290.8338032-2667-66533233268852/AnsiballZ_command.py
Nov 28 09:31:31 np0005538513.localdomain sudo[221678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:31 np0005538513.localdomain python3.9[221680]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:31 np0005538513.localdomain sudo[221678]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:31 np0005538513.localdomain sudo[221789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccewalbmckgavauocoqctfoahvkrgpxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322291.5017645-2667-89055169513648/AnsiballZ_command.py
Nov 28 09:31:31 np0005538513.localdomain sudo[221789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:31 np0005538513.localdomain python3.9[221791]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:31 np0005538513.localdomain sudo[221789]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:32 np0005538513.localdomain sudo[221900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxapcahaugygwjsoahswasavparpijio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322292.2383354-2667-23963108308581/AnsiballZ_command.py
Nov 28 09:31:32 np0005538513.localdomain sudo[221900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:32 np0005538513.localdomain python3.9[221902]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:32 np0005538513.localdomain sudo[221900]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:33 np0005538513.localdomain sudo[222011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddjffplyelwzoypetzaqfxtmojxezmet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322292.8533678-2667-180124896752085/AnsiballZ_command.py
Nov 28 09:31:33 np0005538513.localdomain sudo[222011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:33 np0005538513.localdomain python3.9[222013]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:33 np0005538513.localdomain sudo[222011]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60885 DF PROTO=TCP SPT=53580 DPT=9102 SEQ=1058156247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C64430000000001030307) 
Nov 28 09:31:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30628 DF PROTO=TCP SPT=43092 DPT=9102 SEQ=1955098286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C6F830000000001030307) 
Nov 28 09:31:37 np0005538513.localdomain sudo[222122]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jezkkiuvdjfrzgjksekbajdrbpqdedlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322297.4582267-2874-40362282147430/AnsiballZ_file.py
Nov 28 09:31:37 np0005538513.localdomain sudo[222122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:37 np0005538513.localdomain python3.9[222124]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:37 np0005538513.localdomain sudo[222122]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:38 np0005538513.localdomain sudo[222232]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dahynxpevfwgqpsylgjgxmirvjplmrxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322298.0727096-2874-97048018918873/AnsiballZ_file.py
Nov 28 09:31:38 np0005538513.localdomain sudo[222232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:38 np0005538513.localdomain python3.9[222234]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:38 np0005538513.localdomain sudo[222232]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:39 np0005538513.localdomain sudo[222342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mratlyxogmeutiygdzkmlsdtutozscuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322298.7582283-2874-212450718111535/AnsiballZ_file.py
Nov 28 09:31:39 np0005538513.localdomain sudo[222342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:39 np0005538513.localdomain python3.9[222344]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:39 np0005538513.localdomain sudo[222342]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15135 DF PROTO=TCP SPT=57402 DPT=9100 SEQ=617379117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C7B820000000001030307) 
Nov 28 09:31:39 np0005538513.localdomain sudo[222452]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rakmqfvvzjparzawpxkhhhcckkrrekpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322299.4063253-2940-4462219618722/AnsiballZ_file.py
Nov 28 09:31:39 np0005538513.localdomain sudo[222452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:39 np0005538513.localdomain python3.9[222454]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:39 np0005538513.localdomain sudo[222452]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:40 np0005538513.localdomain sudo[222562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szzupoginsvsdwkoyxmnsplxiotcncdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322300.042873-2940-124049397169136/AnsiballZ_file.py
Nov 28 09:31:40 np0005538513.localdomain sudo[222562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:40 np0005538513.localdomain python3.9[222564]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:40 np0005538513.localdomain sudo[222562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:40 np0005538513.localdomain sudo[222672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvurwpvqcwqksorutegltlqfarznuvef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322300.6800678-2940-79865425695350/AnsiballZ_file.py
Nov 28 09:31:40 np0005538513.localdomain sudo[222672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:41 np0005538513.localdomain python3.9[222674]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:41 np0005538513.localdomain sudo[222672]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:41 np0005538513.localdomain sudo[222782]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmuywpzblssiqqdszhpwilorwvmvmbzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322301.282578-2940-6886804577799/AnsiballZ_file.py
Nov 28 09:31:41 np0005538513.localdomain sudo[222782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:41 np0005538513.localdomain python3.9[222784]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:41 np0005538513.localdomain sudo[222782]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:42 np0005538513.localdomain sudo[222892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbagrbtszzrpxfmpnjjnxpdndumrskhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322301.8969612-2940-80911601189094/AnsiballZ_file.py
Nov 28 09:31:42 np0005538513.localdomain sudo[222892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:42 np0005538513.localdomain python3.9[222894]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:42 np0005538513.localdomain sudo[222892]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61555 DF PROTO=TCP SPT=50872 DPT=9100 SEQ=1856325133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C87830000000001030307) 
Nov 28 09:31:42 np0005538513.localdomain sudo[223002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgxurszbypoukpjtlxgatkvzybqvycpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322302.4780743-2940-64473327305730/AnsiballZ_file.py
Nov 28 09:31:42 np0005538513.localdomain sudo[223002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:42 np0005538513.localdomain python3.9[223004]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:42 np0005538513.localdomain sudo[223002]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:43 np0005538513.localdomain sudo[223112]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyuzjnxralodzpaegiqzlcidgdmxeqwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322303.0611718-2940-213760007637007/AnsiballZ_file.py
Nov 28 09:31:43 np0005538513.localdomain sudo[223112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:31:43 np0005538513.localdomain systemd[1]: tmp-crun.pd6aJa.mount: Deactivated successfully.
Nov 28 09:31:43 np0005538513.localdomain podman[223115]: 2025-11-28 09:31:43.468747741 +0000 UTC m=+0.097850645 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:31:43 np0005538513.localdomain podman[223115]: 2025-11-28 09:31:43.484366213 +0000 UTC m=+0.113469167 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 09:31:43 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:31:43 np0005538513.localdomain python3.9[223114]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:43 np0005538513.localdomain sudo[223112]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:45 np0005538513.localdomain sudo[223149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:31:45 np0005538513.localdomain sudo[223149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:45 np0005538513.localdomain sudo[223149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:45 np0005538513.localdomain sudo[223167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:31:45 np0005538513.localdomain sudo[223167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:46 np0005538513.localdomain sudo[223167]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:46 np0005538513.localdomain sudo[223207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:31:46 np0005538513.localdomain sudo[223207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:46 np0005538513.localdomain sudo[223207]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:46 np0005538513.localdomain sudo[223225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:31:46 np0005538513.localdomain sudo[223225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:47 np0005538513.localdomain sudo[223225]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53072 DF PROTO=TCP SPT=60710 DPT=9882 SEQ=2672581527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C9A030000000001030307) 
Nov 28 09:31:47 np0005538513.localdomain sudo[223275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:31:47 np0005538513.localdomain sudo[223275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:47 np0005538513.localdomain sudo[223275]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35722 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CA1C20000000001030307) 
Nov 28 09:31:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:31:50.785 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:31:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:31:50.786 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:31:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:31:50.787 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:31:50 np0005538513.localdomain sudo[223384]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbrahcvugubcrktlcijmwbmssqltewfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322310.452644-3265-178364711993030/AnsiballZ_getent.py
Nov 28 09:31:50 np0005538513.localdomain sudo[223384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:51 np0005538513.localdomain python3.9[223386]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 28 09:31:51 np0005538513.localdomain sudo[223384]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35723 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CA9C30000000001030307) 
Nov 28 09:31:51 np0005538513.localdomain sudo[223495]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owunslibzipojcbtrjiosgzxxbtudwop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322311.3302243-3289-80112040718674/AnsiballZ_group.py
Nov 28 09:31:51 np0005538513.localdomain sudo[223495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:51 np0005538513.localdomain python3.9[223497]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:31:51 np0005538513.localdomain groupadd[223498]: group added to /etc/group: name=nova, GID=42436
Nov 28 09:31:51 np0005538513.localdomain groupadd[223498]: group added to /etc/gshadow: name=nova
Nov 28 09:31:52 np0005538513.localdomain groupadd[223498]: new group: name=nova, GID=42436
Nov 28 09:31:52 np0005538513.localdomain sudo[223495]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:52 np0005538513.localdomain sudo[223611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncyrtdctlbbteuztxsqofbrkmxzjbuxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322312.2895594-3313-126365950873547/AnsiballZ_user.py
Nov 28 09:31:52 np0005538513.localdomain sudo[223611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:52 np0005538513.localdomain python3.9[223613]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 09:31:53 np0005538513.localdomain useradd[223615]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Nov 28 09:31:53 np0005538513.localdomain useradd[223615]: add 'nova' to group 'libvirt'
Nov 28 09:31:53 np0005538513.localdomain useradd[223615]: add 'nova' to shadow group 'libvirt'
Nov 28 09:31:53 np0005538513.localdomain sudo[223611]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:54 np0005538513.localdomain sshd[223639]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:31:54 np0005538513.localdomain sshd[223639]: Accepted publickey for zuul from 192.168.122.30 port 45530 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:31:54 np0005538513.localdomain systemd-logind[764]: New session 54 of user zuul.
Nov 28 09:31:54 np0005538513.localdomain systemd[1]: Started Session 54 of User zuul.
Nov 28 09:31:54 np0005538513.localdomain sshd[223639]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:31:54 np0005538513.localdomain sshd[223642]: Received disconnect from 192.168.122.30 port 45530:11: disconnected by user
Nov 28 09:31:54 np0005538513.localdomain sshd[223642]: Disconnected from user zuul 192.168.122.30 port 45530
Nov 28 09:31:54 np0005538513.localdomain sshd[223639]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:31:54 np0005538513.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Nov 28 09:31:54 np0005538513.localdomain systemd-logind[764]: Session 54 logged out. Waiting for processes to exit.
Nov 28 09:31:54 np0005538513.localdomain systemd-logind[764]: Removed session 54.
Nov 28 09:31:54 np0005538513.localdomain python3.9[223750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35724 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CB9820000000001030307) 
Nov 28 09:31:55 np0005538513.localdomain python3.9[223836]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322314.5339255-3388-134022272078712/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:56 np0005538513.localdomain python3.9[223944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:56 np0005538513.localdomain python3.9[223999]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:31:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:31:56 np0005538513.localdomain podman[224034]: 2025-11-28 09:31:56.854779178 +0000 UTC m=+0.081694004 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:31:56 np0005538513.localdomain podman[224034]: 2025-11-28 09:31:56.865227639 +0000 UTC m=+0.092142485 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:31:56 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:31:56 np0005538513.localdomain podman[224033]: 2025-11-28 09:31:56.957973589 +0000 UTC m=+0.185128616 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:31:57 np0005538513.localdomain podman[224033]: 2025-11-28 09:31:57.055423328 +0000 UTC m=+0.282578355 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:31:57 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:31:57 np0005538513.localdomain python3.9[224147]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:57 np0005538513.localdomain python3.9[224233]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322316.7633526-3388-32646983963191/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50495 DF PROTO=TCP SPT=56108 DPT=9101 SEQ=244356425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CC48A0000000001030307) 
Nov 28 09:31:58 np0005538513.localdomain python3.9[224341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:58 np0005538513.localdomain python3.9[224427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322317.9220626-3388-117429173261899/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=534005c01c7af821d962fad87e973f668cecbdc9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:59 np0005538513.localdomain python3.9[224535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:01 np0005538513.localdomain python3.9[224621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322319.1117601-3388-256889393724881/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50497 DF PROTO=TCP SPT=56108 DPT=9101 SEQ=244356425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CD0820000000001030307) 
Nov 28 09:32:01 np0005538513.localdomain python3.9[224729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:02 np0005538513.localdomain python3.9[224815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322321.25727-3388-240308374367014/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35725 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CD9820000000001030307) 
Nov 28 09:32:03 np0005538513.localdomain sudo[224923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkkpdlimidwyoebrwuvvdqbheovfkwym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322323.4434018-3637-266235740471908/AnsiballZ_file.py
Nov 28 09:32:03 np0005538513.localdomain sudo[224923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:03 np0005538513.localdomain python3.9[224925]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:03 np0005538513.localdomain sudo[224923]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:04 np0005538513.localdomain sudo[225033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyxkbpqjrxihnxjjaewnsaeedhlhwpvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322324.5180025-3661-10250723621855/AnsiballZ_copy.py
Nov 28 09:32:04 np0005538513.localdomain sudo[225033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:04 np0005538513.localdomain python3.9[225035]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:04 np0005538513.localdomain sudo[225033]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:05 np0005538513.localdomain sudo[225143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwsmympraamispkrwsfgjrycbzzdfcds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322325.256334-3685-69049999200913/AnsiballZ_stat.py
Nov 28 09:32:05 np0005538513.localdomain sudo[225143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:05 np0005538513.localdomain python3.9[225145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:05 np0005538513.localdomain sudo[225143]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:06 np0005538513.localdomain sudo[225255]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqygbyjprmbealmyopqchoynkmfmxhwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322326.0331848-3712-63686499052654/AnsiballZ_file.py
Nov 28 09:32:06 np0005538513.localdomain sudo[225255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32372 DF PROTO=TCP SPT=44028 DPT=9100 SEQ=3257820138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CE5020000000001030307) 
Nov 28 09:32:06 np0005538513.localdomain python3.9[225257]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:06 np0005538513.localdomain sudo[225255]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:07 np0005538513.localdomain python3.9[225365]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:07 np0005538513.localdomain python3.9[225475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:08 np0005538513.localdomain python3.9[225561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322327.497679-3763-213543367310891/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:09 np0005538513.localdomain python3.9[225669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54205 DF PROTO=TCP SPT=34742 DPT=9102 SEQ=829525828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CF1430000000001030307) 
Nov 28 09:32:09 np0005538513.localdomain python3.9[225755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322328.8099177-3808-180727474085391/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:10 np0005538513.localdomain sudo[225863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yemmzpbukbqhajrmxwlsrcynbswguowt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322330.22284-3859-246803883415397/AnsiballZ_container_config_data.py
Nov 28 09:32:10 np0005538513.localdomain sudo[225863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:10 np0005538513.localdomain python3.9[225865]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 28 09:32:10 np0005538513.localdomain sudo[225863]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:11 np0005538513.localdomain sudo[225973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urmowfsxvtrzagsdqaqxqnqqzlcrlotg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322331.059144-3886-100447398045403/AnsiballZ_container_config_hash.py
Nov 28 09:32:11 np0005538513.localdomain sudo[225973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:11 np0005538513.localdomain python3.9[225975]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:32:11 np0005538513.localdomain sudo[225973]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:12 np0005538513.localdomain sudo[226083]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydzvjvgpsodgyrgespemrxzjxulxbojw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322331.9022748-3916-271975516251870/AnsiballZ_edpm_container_manage.py
Nov 28 09:32:12 np0005538513.localdomain sudo[226083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:12 np0005538513.localdomain python3[226085]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:32:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32374 DF PROTO=TCP SPT=44028 DPT=9100 SEQ=3257820138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CFCC20000000001030307) 
Nov 28 09:32:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:32:13 np0005538513.localdomain podman[226113]: 2025-11-28 09:32:13.866659643 +0000 UTC m=+0.099119765 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:32:13 np0005538513.localdomain podman[226113]: 2025-11-28 09:32:13.878490119 +0000 UTC m=+0.110950221 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:32:13 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:32:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32162 DF PROTO=TCP SPT=46022 DPT=9882 SEQ=3945402709 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D0F420000000001030307) 
Nov 28 09:32:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24295 DF PROTO=TCP SPT=39316 DPT=9105 SEQ=366948878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D16C20000000001030307) 
Nov 28 09:32:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24296 DF PROTO=TCP SPT=39316 DPT=9105 SEQ=366948878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D1EC20000000001030307) 
Nov 28 09:32:22 np0005538513.localdomain podman[226100]: 2025-11-28 09:32:12.584122143 +0000 UTC m=+0.048020215 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:22 np0005538513.localdomain podman[226181]: 
Nov 28 09:32:23 np0005538513.localdomain podman[226181]: 2025-11-28 09:32:23.053778194 +0000 UTC m=+0.124617755 container create f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:32:23 np0005538513.localdomain podman[226181]: 2025-11-28 09:32:22.965682059 +0000 UTC m=+0.036521660 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:23 np0005538513.localdomain python3[226085]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 28 09:32:23 np0005538513.localdomain sudo[226083]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:23 np0005538513.localdomain sudo[226326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzmqfjpxnxkdmwbawygnkviqaenxfnav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322343.6327877-3940-127517642619406/AnsiballZ_stat.py
Nov 28 09:32:23 np0005538513.localdomain sudo[226326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:24 np0005538513.localdomain python3.9[226328]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:24 np0005538513.localdomain sudo[226326]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:25 np0005538513.localdomain sudo[226438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aglkyzwkbdulgwguuujavwxlgwgjxwqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322344.818702-3976-86520279364915/AnsiballZ_container_config_data.py
Nov 28 09:32:25 np0005538513.localdomain sudo[226438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24297 DF PROTO=TCP SPT=39316 DPT=9105 SEQ=366948878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D2E830000000001030307) 
Nov 28 09:32:25 np0005538513.localdomain python3.9[226440]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 28 09:32:25 np0005538513.localdomain sudo[226438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:25 np0005538513.localdomain sudo[226548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdzzbmjymjtlyxxvpmfbbueohsfjjwjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322345.713026-4003-174805384257832/AnsiballZ_container_config_hash.py
Nov 28 09:32:25 np0005538513.localdomain sudo[226548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:26 np0005538513.localdomain python3.9[226550]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:32:26 np0005538513.localdomain sudo[226548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:26 np0005538513.localdomain sudo[226658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwjdmswbcngysszdoxhynjznunodwsom ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322346.604875-4033-137649274484026/AnsiballZ_edpm_container_manage.py
Nov 28 09:32:26 np0005538513.localdomain sudo[226658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:27 np0005538513.localdomain python3[226660]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:32:27 np0005538513.localdomain python3[226660]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",
                                                                    "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:36:07.10279245Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211782527,
                                                                    "VirtualSize": 1211782527,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",
                                                                              "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.237322707Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.688296939Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:21.069367201Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:46.989417927Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:54.535170465Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:24.828469773Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.089054875Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.610811813Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.099939071Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.100032994Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:14.509959241Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:32:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:32:27 np0005538513.localdomain podman[226710]: 2025-11-28 09:32:27.523179155 +0000 UTC m=+0.111435376 container remove c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 09:32:27 np0005538513.localdomain python3[226660]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Nov 28 09:32:27 np0005538513.localdomain podman[226722]: 2025-11-28 09:32:27.580919737 +0000 UTC m=+0.090717129 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:32:27 np0005538513.localdomain podman[226723]: 2025-11-28 09:32:27.651123724 +0000 UTC m=+0.160354328 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:32:27 np0005538513.localdomain podman[226741]: 
Nov 28 09:32:27 np0005538513.localdomain podman[226741]: 2025-11-28 09:32:27.669737565 +0000 UTC m=+0.126502245 container create 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 09:32:27 np0005538513.localdomain podman[226741]: 2025-11-28 09:32:27.593429324 +0000 UTC m=+0.050194054 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:27 np0005538513.localdomain podman[226722]: 2025-11-28 09:32:27.674370862 +0000 UTC m=+0.184168224 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:32:27 np0005538513.localdomain python3[226660]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 28 09:32:27 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:32:27 np0005538513.localdomain podman[226723]: 2025-11-28 09:32:27.689761321 +0000 UTC m=+0.198991905 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:32:27 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:32:27 np0005538513.localdomain sudo[226658]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36869 DF PROTO=TCP SPT=59624 DPT=9101 SEQ=2115230041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D39BA0000000001030307) 
Nov 28 09:32:28 np0005538513.localdomain sudo[226910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpegdjhpiywinmfpwzpcqltappfcwzah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322348.1732469-4057-281041904540059/AnsiballZ_stat.py
Nov 28 09:32:28 np0005538513.localdomain sudo[226910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:28 np0005538513.localdomain python3.9[226912]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:28 np0005538513.localdomain sudo[226910]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:29 np0005538513.localdomain sudo[227022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiykbozpfasrjqbgmfcjjwbsapouxrpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.0421069-4084-150032266027276/AnsiballZ_file.py
Nov 28 09:32:29 np0005538513.localdomain sudo[227022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:29 np0005538513.localdomain python3.9[227024]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:29 np0005538513.localdomain sudo[227022]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:29 np0005538513.localdomain sudo[227131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhlhpmvesprazwztttzefznlmxdqqjxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.6011608-4084-267492742123445/AnsiballZ_copy.py
Nov 28 09:32:29 np0005538513.localdomain sudo[227131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:30 np0005538513.localdomain python3.9[227133]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322349.6011608-4084-267492742123445/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:30 np0005538513.localdomain sudo[227131]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:30 np0005538513.localdomain sudo[227186]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxrmgdmlhkpbabxfjjznwpoiqagpvcuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.6011608-4084-267492742123445/AnsiballZ_systemd.py
Nov 28 09:32:30 np0005538513.localdomain sudo[227186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:30 np0005538513.localdomain python3.9[227188]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:32:30 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:32:30 np0005538513.localdomain systemd-rc-local-generator[227214]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:32:30 np0005538513.localdomain systemd-sysv-generator[227218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:32:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:32:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36871 DF PROTO=TCP SPT=59624 DPT=9101 SEQ=2115230041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D45C30000000001030307) 
Nov 28 09:32:31 np0005538513.localdomain sudo[227186]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:31 np0005538513.localdomain sudo[227276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pchrdiuyffecuaqwxudadiqfpyzgcadc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.6011608-4084-267492742123445/AnsiballZ_systemd.py
Nov 28 09:32:31 np0005538513.localdomain sudo[227276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:31 np0005538513.localdomain python3.9[227278]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:32:31 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:32:31 np0005538513.localdomain systemd-rc-local-generator[227305]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:32:31 np0005538513.localdomain systemd-sysv-generator[227310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: Starting nova_compute container...
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:32:32 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538513.localdomain podman[227319]: 2025-11-28 09:32:32.346936079 +0000 UTC m=+0.126438723 container init 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible)
Nov 28 09:32:32 np0005538513.localdomain podman[227319]: 2025-11-28 09:32:32.355758878 +0000 UTC m=+0.135261512 container start 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:32:32 np0005538513.localdomain podman[227319]: nova_compute
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + sudo -E kolla_set_configs
Nov 28 09:32:32 np0005538513.localdomain systemd[1]: Started nova_compute container.
Nov 28 09:32:32 np0005538513.localdomain sudo[227276]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Validating config file
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying service configuration files
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Deleting /etc/ceph
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Creating directory /etc/ceph
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Writing out command to execute
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: ++ cat /run_command
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + CMD=nova-compute
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + ARGS=
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + sudo kolla_copy_cacerts
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + [[ ! -n '' ]]
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + . kolla_extend_start
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: Running command: 'nova-compute'
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + umask 0022
Nov 28 09:32:32 np0005538513.localdomain nova_compute[227332]: + exec nova-compute
Nov 28 09:32:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23238 DF PROTO=TCP SPT=43890 DPT=9102 SEQ=1949467689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D4EC20000000001030307) 
Nov 28 09:32:33 np0005538513.localdomain python3.9[227452]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.032 227336 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.032 227336 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.032 227336 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.033 227336 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.146 227336 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.153 227336 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.154 227336 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.617 227336 INFO nova.virt.driver [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.901 227336 INFO nova.compute.provider_config [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.913 227336 WARNING nova.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.913 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.914 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.914 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.916 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.916 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.916 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.916 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console_host                   = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.925 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.925 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.925 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.933 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.933 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.933 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.953 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.953 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.953 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.017 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.017 227336 WARNING oslo_config.cfg [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: and ``live_migration_inbound_addr`` respectively.
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: ).  Its value may be silently ignored in the future.
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.017 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.017 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_secret_uuid        = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.070 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.070 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.071 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.071 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.071 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.072 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.072 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.072 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.073 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.073 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.073 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.075 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.075 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.075 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.076 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.076 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.076 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.077 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.077 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.077 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.079 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.079 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.079 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.080 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.080 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.080 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.081 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.081 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.081 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.082 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.082 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.082 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.083 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.083 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.083 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.084 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.084 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.084 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.085 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.085 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.085 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.087 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.087 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.087 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.089 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.089 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.089 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.090 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.090 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.090 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.092 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.092 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.092 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.093 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.093 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.093 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.094 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.094 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.094 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.095 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.095 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.095 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.097 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.097 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.097 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.098 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.098 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.098 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.099 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.099 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.099 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.100 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.100 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.100 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.101 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.101 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.101 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.103 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.103 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.103 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.104 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.104 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.104 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.106 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.106 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.106 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.107 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.107 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.107 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.109 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.109 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.109 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.110 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.110 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.110 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.112 227336 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.156 227336 INFO nova.virt.node [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.157 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.158 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.158 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.159 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.172 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd52e32dcd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.175 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd52e32dcd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.177 227336 INFO nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Connection event '1' reason 'None'
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.198 227336 DEBUG nova.virt.libvirt.volume.mount [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.202 227336 INFO nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <host>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <uuid>eb468aed-e0e9-4528-988f-9267a3530b7a</uuid>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <arch>x86_64</arch>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model>EPYC-Rome-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <vendor>AMD</vendor>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <microcode version='16777317'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <signature family='23' model='49' stepping='0'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='x2apic'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='tsc-deadline'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='osxsave'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='hypervisor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='tsc_adjust'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='spec-ctrl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='stibp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='arch-capabilities'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='cmp_legacy'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='topoext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='virt-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='lbrv'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='tsc-scale'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='vmcb-clean'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='pause-filter'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='pfthreshold'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='svme-addr-chk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='rdctl-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='mds-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature name='pschange-mc-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <pages unit='KiB' size='4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <pages unit='KiB' size='2048'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <pages unit='KiB' size='1048576'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <power_management>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <suspend_mem/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <suspend_disk/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <suspend_hybrid/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </power_management>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <iommu support='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <migration_features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <live/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <uri_transports>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <uri_transport>tcp</uri_transport>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <uri_transport>rdma</uri_transport>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </uri_transports>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </migration_features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <topology>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <cells num='1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <cell id='0'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           <memory unit='KiB'>16116612</memory>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           <distances>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <sibling id='0' value='10'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           </distances>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           <cpus num='8'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:           </cpus>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         </cell>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </cells>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </topology>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <cache>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </cache>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <secmodel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model>selinux</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <doi>0</doi>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </secmodel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <secmodel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model>dac</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <doi>0</doi>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </secmodel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </host>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <guest>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <os_type>hvm</os_type>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <arch name='i686'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <wordsize>32</wordsize>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <domain type='qemu'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <domain type='kvm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </arch>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <pae/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <nonpae/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <apic default='on' toggle='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <cpuselection/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <deviceboot/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <externalSnapshot/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </guest>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <guest>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <os_type>hvm</os_type>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <arch name='x86_64'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <wordsize>64</wordsize>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <domain type='qemu'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <domain type='kvm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </arch>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <apic default='on' toggle='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <cpuselection/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <deviceboot/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <externalSnapshot/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </guest>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: </capabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.215 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.233 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: <domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <domain>kvm</domain>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <arch>i686</arch>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <vcpu max='1024'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <iothreads supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <os supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='firmware'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <loader supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>rom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pflash</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='readonly'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>yes</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='secure'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </loader>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </os>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='maximumMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <vendor>AMD</vendor>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='succor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='custom' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-128'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-256'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-512'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <memoryBacking supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='sourceType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>anonymous</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>memfd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </memoryBacking>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <disk supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='diskDevice'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>disk</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cdrom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>floppy</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>lun</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>fdc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>sata</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </disk>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <graphics supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vnc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egl-headless</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </graphics>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <video supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='modelType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vga</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cirrus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>none</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>bochs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ramfb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </video>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hostdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='mode'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>subsystem</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='startupPolicy'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>mandatory</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>requisite</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>optional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='subsysType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pci</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='capsType'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='pciBackend'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hostdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <rng supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>random</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </rng>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <filesystem supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='driverType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>path</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>handle</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtiofs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </filesystem>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <tpm supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-tis</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-crb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emulator</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>external</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendVersion'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>2.0</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </tpm>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <redirdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </redirdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <channel supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </channel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <crypto supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </crypto>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <interface supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>passt</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </interface>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <panic supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>isa</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>hyperv</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </panic>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <console supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>null</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dev</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pipe</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stdio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>udp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tcp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu-vdagent</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </console>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <gic supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <genid supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backup supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <async-teardown supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <ps2 supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sev supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sgx supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hyperv supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='features'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>relaxed</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vapic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>spinlocks</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vpindex</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>runtime</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>synic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stimer</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reset</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vendor_id</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>frequencies</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reenlightenment</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tlbflush</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ipi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>avic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emsr_bitmap</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>xmm_input</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hyperv>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <launchSecurity supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='sectype'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tdx</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </launchSecurity>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: </domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.240 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: <domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <domain>kvm</domain>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <arch>i686</arch>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <vcpu max='240'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <iothreads supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <os supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='firmware'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <loader supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>rom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pflash</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='readonly'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>yes</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='secure'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </loader>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </os>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='maximumMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <vendor>AMD</vendor>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='succor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='custom' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-128'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-256'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-512'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <memoryBacking supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='sourceType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>anonymous</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>memfd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </memoryBacking>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <disk supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='diskDevice'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>disk</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cdrom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>floppy</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>lun</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ide</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>fdc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>sata</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </disk>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <graphics supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vnc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egl-headless</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </graphics>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <video supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='modelType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vga</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cirrus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>none</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>bochs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ramfb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </video>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hostdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='mode'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>subsystem</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='startupPolicy'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>mandatory</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>requisite</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>optional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='subsysType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pci</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='capsType'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='pciBackend'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hostdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <rng supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>random</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </rng>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <filesystem supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='driverType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>path</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>handle</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtiofs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </filesystem>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <tpm supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-tis</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-crb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emulator</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>external</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendVersion'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>2.0</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </tpm>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <redirdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </redirdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <channel supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </channel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <crypto supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </crypto>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <interface supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>passt</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </interface>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <panic supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>isa</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>hyperv</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </panic>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <console supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>null</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dev</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pipe</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stdio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>udp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tcp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu-vdagent</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </console>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <gic supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <genid supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backup supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <async-teardown supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <ps2 supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sev supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sgx supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hyperv supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='features'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>relaxed</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vapic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>spinlocks</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vpindex</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>runtime</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>synic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stimer</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reset</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vendor_id</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>frequencies</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reenlightenment</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tlbflush</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ipi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>avic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emsr_bitmap</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>xmm_input</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hyperv>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <launchSecurity supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='sectype'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tdx</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </launchSecurity>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: </domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.286 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.290 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: <domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <domain>kvm</domain>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <arch>x86_64</arch>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <vcpu max='1024'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <iothreads supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <os supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='firmware'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>efi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <loader supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>rom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pflash</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='readonly'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>yes</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='secure'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>yes</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </loader>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </os>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='maximumMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <vendor>AMD</vendor>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='succor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='custom' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-128'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-256'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-512'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <memoryBacking supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='sourceType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>anonymous</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>memfd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </memoryBacking>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <disk supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='diskDevice'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>disk</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cdrom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>floppy</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>lun</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>fdc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>sata</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </disk>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <graphics supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vnc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egl-headless</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </graphics>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <video supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='modelType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vga</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cirrus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>none</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>bochs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ramfb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </video>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hostdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='mode'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>subsystem</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='startupPolicy'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>mandatory</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>requisite</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>optional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='subsysType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pci</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='capsType'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='pciBackend'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hostdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <rng supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>random</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </rng>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <filesystem supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='driverType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>path</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>handle</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtiofs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </filesystem>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <tpm supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-tis</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-crb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emulator</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>external</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendVersion'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>2.0</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </tpm>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <redirdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </redirdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <channel supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </channel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <crypto supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </crypto>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <interface supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>passt</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </interface>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <panic supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>isa</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>hyperv</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </panic>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <console supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>null</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dev</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pipe</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stdio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>udp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tcp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu-vdagent</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </console>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <gic supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <genid supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backup supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <async-teardown supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <ps2 supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sev supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sgx supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hyperv supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='features'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>relaxed</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vapic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>spinlocks</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vpindex</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>runtime</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>synic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stimer</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reset</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vendor_id</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>frequencies</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reenlightenment</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tlbflush</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ipi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>avic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emsr_bitmap</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>xmm_input</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hyperv>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <launchSecurity supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='sectype'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tdx</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </launchSecurity>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: </domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.344 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: <domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <domain>kvm</domain>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <arch>x86_64</arch>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <vcpu max='240'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <iothreads supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <os supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='firmware'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <loader supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>rom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pflash</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='readonly'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>yes</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='secure'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>no</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </loader>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </os>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='maximumMigratable'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>on</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>off</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <vendor>AMD</vendor>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='succor'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <mode name='custom' supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Denverton-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='auto-ibrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amd-psfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='stibp-always-on'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='EPYC-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-128'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-256'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx10-512'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='prefetchiti'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Haswell-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512er'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512pf'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fma4'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tbm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xop'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='amx-tile'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-bf16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-fp16'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bitalg'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrc'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fzrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='la57'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='taa-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xfd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ifma'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cmpccxadd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fbsdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='fsrs'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ibrs-all'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mcdt-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pbrsb-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='psdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='serialize'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vaes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='hle'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='rtm'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512bw'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512cd'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512dq'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512f'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='avx512vl'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='invpcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pcid'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='pku'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='mpx'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='core-capability'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='split-lock-detect'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='cldemote'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='erms'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='gfni'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdir64b'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='movdiri'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='xsaves'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='athlon-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo'>
Nov 28 09:32:35 np0005538513.localdomain python3.9[227564]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='core2duo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='coreduo-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='n270-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='ss'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <blockers model='phenom-v1'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnow'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <feature name='3dnowext'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </blockers>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </mode>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </cpu>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <memoryBacking supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <enum name='sourceType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>anonymous</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <value>memfd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </memoryBacking>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <disk supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='diskDevice'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>disk</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cdrom</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>floppy</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>lun</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ide</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>fdc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>sata</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </disk>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <graphics supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vnc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egl-headless</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </graphics>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <video supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='modelType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vga</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>cirrus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>none</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>bochs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ramfb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </video>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hostdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='mode'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>subsystem</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='startupPolicy'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>mandatory</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>requisite</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>optional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='subsysType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pci</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>scsi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='capsType'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='pciBackend'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hostdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <rng supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtio-non-transitional</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>random</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>egd</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </rng>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <filesystem supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='driverType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>path</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>handle</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>virtiofs</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </filesystem>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <tpm supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-tis</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tpm-crb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emulator</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>external</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendVersion'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>2.0</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </tpm>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <redirdev supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='bus'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>usb</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </redirdev>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <channel supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </channel>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <crypto supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendModel'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>builtin</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </crypto>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <interface supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='backendType'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>default</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>passt</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </interface>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <panic supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='model'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>isa</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>hyperv</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </panic>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <console supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='type'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>null</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vc</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pty</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dev</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>file</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>pipe</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stdio</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>udp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tcp</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>unix</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>qemu-vdagent</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>dbus</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </console>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </devices>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   <features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <gic supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <genid supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <backup supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <async-teardown supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <ps2 supported='yes'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sev supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <sgx supported='no'/>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <hyperv supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='features'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>relaxed</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vapic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>spinlocks</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vpindex</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>runtime</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>synic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>stimer</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reset</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>vendor_id</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>frequencies</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>reenlightenment</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tlbflush</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>ipi</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>avic</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>emsr_bitmap</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>xmm_input</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </defaults>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </hyperv>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     <launchSecurity supported='yes'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       <enum name='sectype'>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:         <value>tdx</value>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:       </enum>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:     </launchSecurity>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:   </features>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: </domainCapabilities>
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.395 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.395 227336 INFO nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Secure Boot support detected
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.397 227336 INFO nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.398 227336 INFO nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.418 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.482 227336 INFO nova.virt.node [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.501 227336 DEBUG nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Verified node 35fead26-0bad-4950-b646-987079d58a17 matches my host np0005538513.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.539 227336 DEBUG nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.543 227336 DEBUG nova.virt.libvirt.vif [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005538513.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T08:33:07Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.543 227336 DEBUG nova.network.os_vif_util [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.544 227336 DEBUG nova.network.os_vif_util [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.545 227336 DEBUG os_vif [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.613 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.613 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.613 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.615 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.618 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.633 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.633 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.634 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:32:35 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:35.641 227336 INFO oslo.privsep.daemon [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp70qswwnh/privsep.sock']
Nov 28 09:32:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60890 DF PROTO=TCP SPT=53580 DPT=9102 SEQ=1058156247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D59820000000001030307) 
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.251 227336 INFO oslo.privsep.daemon [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.146 227849 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.149 227849 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.152 227849 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.152 227849 INFO oslo.privsep.daemon [-] privsep daemon running as pid 227849
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.535 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.536 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.536 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.537 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.538 227336 INFO os_vif [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.538 227336 DEBUG nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.542 227336 DEBUG nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.542 227336 INFO nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.905 227336 INFO nova.service [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating service version for nova-compute on np0005538513.localdomain from 57 to 66
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.933 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.933 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.934 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.934 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:32:36 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:36.935 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.389 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.455 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.455 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:32:37 np0005538513.localdomain systemd[1]: Started libvirt nodedev daemon.
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.836 227336 WARNING nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.837 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12907MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.838 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.838 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.974 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.975 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.975 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:32:37 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:37.990 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.046 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.047 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.060 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.083 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.112 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.581 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.589 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.590 227336 INFO nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] kernel doesn't support AMD SEV
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.592 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.592 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.656 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.687 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updated inventory for provider 35fead26-0bad-4950-b646-987079d58a17 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.688 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating resource provider 35fead26-0bad-4950-b646-987079d58a17 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.688 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.783 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating resource provider 35fead26-0bad-4950-b646-987079d58a17 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.816 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.817 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.817 227336 DEBUG nova.service [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.875 227336 DEBUG nova.service [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 28 09:32:38 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:38.876 227336 DEBUG nova.servicegroup.drivers.db [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] DB_Driver: join new ServiceGroup member np0005538513.localdomain to the compute group, service = <Service: host=np0005538513.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 28 09:32:39 np0005538513.localdomain python3.9[228005]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61558 DF PROTO=TCP SPT=50872 DPT=9100 SEQ=1856325133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D65820000000001030307) 
Nov 28 09:32:39 np0005538513.localdomain sudo[228135]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fitfdufgzimuvlqroxwrloovugvxydde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322359.3836932-4264-148370805211180/AnsiballZ_podman_container.py
Nov 28 09:32:39 np0005538513.localdomain sudo[228135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:40 np0005538513.localdomain python3.9[228137]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:32:40 np0005538513.localdomain sudo[228135]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:40 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 115.0 (383 of 333 items), suggesting rotation.
Nov 28 09:32:40 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:32:40 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:32:40 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:32:40 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:40.648 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:41 np0005538513.localdomain sudo[228268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhzwygbluzrwpymwodcaqbibkbyrvlae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322360.8828378-4288-245251440747917/AnsiballZ_systemd.py
Nov 28 09:32:41 np0005538513.localdomain sudo[228268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:41 np0005538513.localdomain python3.9[228270]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:32:41 np0005538513.localdomain systemd[1]: Stopping nova_compute container...
Nov 28 09:32:41 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:41.639 227336 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Nov 28 09:32:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41358 DF PROTO=TCP SPT=57320 DPT=9100 SEQ=1150725942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D72020000000001030307) 
Nov 28 09:32:42 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:42.994 227336 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 28 09:32:42 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:42.996 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:32:42 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:42.997 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:32:42 np0005538513.localdomain nova_compute[227332]: 2025-11-28 09:32:42.997 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:32:43 np0005538513.localdomain virtqemud[201490]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 28 09:32:43 np0005538513.localdomain virtqemud[201490]: hostname: np0005538513.localdomain
Nov 28 09:32:43 np0005538513.localdomain virtqemud[201490]: End of file while reading data: Input/output error
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Deactivated successfully.
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Consumed 4.822s CPU time.
Nov 28 09:32:43 np0005538513.localdomain podman[228274]: 2025-11-28 09:32:43.415610696 +0000 UTC m=+1.856776621 container died 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true)
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: tmp-crun.KudeZ4.mount: Deactivated successfully.
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: tmp-crun.86Ixpm.mount: Deactivated successfully.
Nov 28 09:32:43 np0005538513.localdomain podman[228274]: 2025-11-28 09:32:43.483977036 +0000 UTC m=+1.925142941 container cleanup 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 28 09:32:43 np0005538513.localdomain podman[228274]: nova_compute
Nov 28 09:32:43 np0005538513.localdomain podman[228315]: error opening file `/run/crun/11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf/status`: No such file or directory
Nov 28 09:32:43 np0005538513.localdomain podman[228304]: 2025-11-28 09:32:43.590401593 +0000 UTC m=+0.070891881 container cleanup 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:32:43 np0005538513.localdomain podman[228304]: nova_compute
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: Stopped nova_compute container.
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: Starting nova_compute container...
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:32:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538513.localdomain podman[228317]: 2025-11-28 09:32:43.743165199 +0000 UTC m=+0.116657572 container init 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:32:43 np0005538513.localdomain podman[228317]: 2025-11-28 09:32:43.751888105 +0000 UTC m=+0.125380478 container start 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:32:43 np0005538513.localdomain podman[228317]: nova_compute
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + sudo -E kolla_set_configs
Nov 28 09:32:43 np0005538513.localdomain systemd[1]: Started nova_compute container.
Nov 28 09:32:43 np0005538513.localdomain sudo[228268]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Validating config file
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying service configuration files
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /etc/ceph
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Creating directory /etc/ceph
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Writing out command to execute
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: ++ cat /run_command
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + CMD=nova-compute
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + ARGS=
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + sudo kolla_copy_cacerts
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + [[ ! -n '' ]]
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + . kolla_extend_start
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: Running command: 'nova-compute'
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + umask 0022
Nov 28 09:32:43 np0005538513.localdomain nova_compute[228333]: + exec nova-compute
Nov 28 09:32:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:32:44 np0005538513.localdomain podman[228362]: 2025-11-28 09:32:44.350911091 +0000 UTC m=+0.085848544 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:32:44 np0005538513.localdomain podman[228362]: 2025-11-28 09:32:44.386878922 +0000 UTC m=+0.121816405 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 09:32:44 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:32:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:45.536 228337 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:45.536 228337 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:45.536 228337 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:45.536 228337 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 09:32:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:45.650 228337 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:45.672 228337 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:45.672 228337 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.102 228337 INFO nova.virt.driver [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.215 228337 INFO nova.compute.provider_config [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.223 228337 WARNING nova.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.223 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.223 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.223 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console_host                   = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.289 228337 WARNING oslo_config.cfg [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: and ``live_migration_inbound_addr`` respectively.
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: ).  Its value may be silently ignored in the future.
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_secret_uuid        = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.348 228337 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.360 228337 INFO nova.virt.node [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.361 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.362 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.362 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.362 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.372 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4fa720b0a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.375 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4fa720b0a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.375 228337 INFO nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Connection event '1' reason 'None'
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.380 228337 INFO nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <host>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <uuid>eb468aed-e0e9-4528-988f-9267a3530b7a</uuid>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <arch>x86_64</arch>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model>EPYC-Rome-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <vendor>AMD</vendor>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <microcode version='16777317'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <signature family='23' model='49' stepping='0'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='x2apic'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='tsc-deadline'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='osxsave'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='hypervisor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='tsc_adjust'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='spec-ctrl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='stibp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='arch-capabilities'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='cmp_legacy'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='topoext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='virt-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='lbrv'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='tsc-scale'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='vmcb-clean'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='pause-filter'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='pfthreshold'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='svme-addr-chk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='rdctl-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='mds-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature name='pschange-mc-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <pages unit='KiB' size='4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <pages unit='KiB' size='2048'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <pages unit='KiB' size='1048576'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <power_management>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <suspend_mem/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <suspend_disk/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <suspend_hybrid/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </power_management>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <iommu support='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <migration_features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <live/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <uri_transports>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <uri_transport>tcp</uri_transport>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <uri_transport>rdma</uri_transport>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </uri_transports>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </migration_features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <topology>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <cells num='1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <cell id='0'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           <memory unit='KiB'>16116612</memory>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           <distances>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <sibling id='0' value='10'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           </distances>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           <cpus num='8'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:           </cpus>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         </cell>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </cells>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </topology>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <cache>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </cache>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <secmodel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model>selinux</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <doi>0</doi>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </secmodel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <secmodel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model>dac</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <doi>0</doi>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </secmodel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </host>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <guest>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <os_type>hvm</os_type>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <arch name='i686'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <wordsize>32</wordsize>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <domain type='qemu'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <domain type='kvm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </arch>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <pae/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <nonpae/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <apic default='on' toggle='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <cpuselection/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <deviceboot/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <externalSnapshot/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </guest>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <guest>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <os_type>hvm</os_type>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <arch name='x86_64'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <wordsize>64</wordsize>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <domain type='qemu'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <domain type='kvm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </arch>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <apic default='on' toggle='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <cpuselection/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <deviceboot/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <externalSnapshot/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </guest>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: </capabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.386 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.389 228337 DEBUG nova.virt.libvirt.volume.mount [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.390 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: <domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <domain>kvm</domain>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <arch>i686</arch>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <vcpu max='1024'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <iothreads supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <os supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='firmware'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <loader supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>rom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pflash</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='readonly'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>yes</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='secure'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </loader>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </os>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='maximumMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <vendor>AMD</vendor>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='succor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='custom' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-128'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-256'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-512'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <memoryBacking supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='sourceType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>anonymous</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>memfd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </memoryBacking>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <disk supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='diskDevice'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>disk</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cdrom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>floppy</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>lun</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>fdc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>sata</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </disk>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <graphics supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vnc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egl-headless</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </graphics>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <video supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='modelType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vga</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cirrus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>none</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>bochs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ramfb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </video>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hostdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='mode'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>subsystem</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='startupPolicy'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>mandatory</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>requisite</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>optional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='subsysType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pci</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='capsType'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='pciBackend'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hostdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <rng supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>random</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </rng>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <filesystem supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='driverType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>path</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>handle</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtiofs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </filesystem>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <tpm supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-tis</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-crb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emulator</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>external</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendVersion'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>2.0</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </tpm>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <redirdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </redirdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <channel supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </channel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <crypto supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </crypto>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <interface supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>passt</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </interface>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <panic supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>isa</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>hyperv</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </panic>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <console supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>null</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dev</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pipe</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stdio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>udp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tcp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu-vdagent</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </console>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <gic supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <genid supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backup supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <async-teardown supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <ps2 supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sev supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sgx supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hyperv supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='features'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>relaxed</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vapic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>spinlocks</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vpindex</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>runtime</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>synic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stimer</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reset</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vendor_id</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>frequencies</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reenlightenment</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tlbflush</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ipi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>avic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emsr_bitmap</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>xmm_input</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hyperv>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <launchSecurity supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='sectype'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tdx</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </launchSecurity>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: </domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.396 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: <domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <domain>kvm</domain>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <arch>i686</arch>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <vcpu max='240'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <iothreads supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <os supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='firmware'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <loader supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>rom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pflash</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='readonly'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>yes</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='secure'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </loader>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </os>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='maximumMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <vendor>AMD</vendor>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='succor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='custom' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-128'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-256'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-512'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <memoryBacking supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='sourceType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>anonymous</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>memfd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </memoryBacking>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <disk supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='diskDevice'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>disk</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cdrom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>floppy</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>lun</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ide</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>fdc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>sata</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </disk>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <graphics supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vnc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egl-headless</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </graphics>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <video supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='modelType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vga</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cirrus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>none</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>bochs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ramfb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </video>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hostdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='mode'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>subsystem</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='startupPolicy'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>mandatory</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>requisite</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>optional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='subsysType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pci</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='capsType'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='pciBackend'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hostdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <rng supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>random</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </rng>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <filesystem supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='driverType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>path</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>handle</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtiofs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </filesystem>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <tpm supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-tis</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-crb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emulator</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>external</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendVersion'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>2.0</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </tpm>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <redirdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </redirdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <channel supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </channel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <crypto supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </crypto>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <interface supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>passt</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </interface>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <panic supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>isa</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>hyperv</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </panic>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <console supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>null</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dev</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pipe</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stdio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>udp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tcp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu-vdagent</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </console>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <gic supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <genid supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backup supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <async-teardown supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <ps2 supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sev supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sgx supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hyperv supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='features'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>relaxed</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vapic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>spinlocks</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vpindex</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>runtime</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>synic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stimer</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reset</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vendor_id</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>frequencies</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reenlightenment</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tlbflush</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ipi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>avic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emsr_bitmap</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>xmm_input</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hyperv>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <launchSecurity supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='sectype'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tdx</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </launchSecurity>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: </domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.425 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.430 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: <domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <domain>kvm</domain>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <arch>x86_64</arch>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <vcpu max='1024'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <iothreads supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <os supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='firmware'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>efi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <loader supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>rom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pflash</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='readonly'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>yes</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='secure'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>yes</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </loader>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </os>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='maximumMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <vendor>AMD</vendor>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='succor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='custom' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-128'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-256'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-512'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <memoryBacking supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='sourceType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>anonymous</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>memfd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </memoryBacking>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <disk supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='diskDevice'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>disk</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cdrom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>floppy</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>lun</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>fdc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>sata</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </disk>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <graphics supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vnc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egl-headless</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </graphics>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <video supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='modelType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vga</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cirrus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>none</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>bochs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ramfb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </video>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hostdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='mode'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>subsystem</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='startupPolicy'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>mandatory</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>requisite</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>optional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='subsysType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pci</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='capsType'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='pciBackend'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hostdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <rng supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>random</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </rng>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <filesystem supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='driverType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>path</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>handle</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtiofs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </filesystem>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <tpm supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-tis</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-crb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emulator</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>external</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendVersion'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>2.0</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </tpm>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <redirdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </redirdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <channel supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </channel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <crypto supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </crypto>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <interface supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>passt</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </interface>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <panic supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>isa</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>hyperv</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </panic>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <console supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>null</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dev</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pipe</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stdio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>udp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tcp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu-vdagent</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </console>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <gic supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <genid supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backup supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <async-teardown supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <ps2 supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sev supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sgx supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hyperv supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='features'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>relaxed</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vapic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>spinlocks</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vpindex</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>runtime</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>synic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stimer</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reset</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vendor_id</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>frequencies</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reenlightenment</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tlbflush</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ipi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>avic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emsr_bitmap</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>xmm_input</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hyperv>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <launchSecurity supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='sectype'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tdx</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </launchSecurity>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: </domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.477 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: <domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <domain>kvm</domain>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <arch>x86_64</arch>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <vcpu max='240'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <iothreads supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <os supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='firmware'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <loader supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>rom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pflash</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='readonly'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>yes</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='secure'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>no</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </loader>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </os>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='maximumMigratable'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>on</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>off</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <vendor>AMD</vendor>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='succor'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <mode name='custom' supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Denverton-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='auto-ibrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amd-psfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='stibp-always-on'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='EPYC-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-128'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-256'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx10-512'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='prefetchiti'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Haswell-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512er'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512pf'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fma4'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tbm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xop'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='amx-tile'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-bf16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-fp16'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bitalg'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrc'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fzrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='la57'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='taa-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xfd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ifma'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cmpccxadd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fbsdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='fsrs'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ibrs-all'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mcdt-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pbrsb-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='psdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='serialize'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vaes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='hle'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='rtm'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512bw'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512cd'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512dq'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512f'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='avx512vl'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='invpcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pcid'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='pku'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='mpx'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='core-capability'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='split-lock-detect'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='cldemote'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='erms'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='gfni'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdir64b'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='movdiri'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='xsaves'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='athlon-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='core2duo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='coreduo-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='n270-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='ss'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <blockers model='phenom-v1'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnow'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <feature name='3dnowext'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </blockers>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </mode>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </cpu>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <memoryBacking supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <enum name='sourceType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>anonymous</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <value>memfd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </memoryBacking>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <disk supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='diskDevice'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>disk</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cdrom</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>floppy</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>lun</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ide</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>fdc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>sata</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </disk>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <graphics supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vnc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egl-headless</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </graphics>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <video supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='modelType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vga</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>cirrus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>none</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>bochs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ramfb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </video>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hostdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='mode'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>subsystem</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='startupPolicy'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>mandatory</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>requisite</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>optional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='subsysType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pci</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>scsi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='capsType'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='pciBackend'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hostdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <rng supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtio-non-transitional</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>random</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>egd</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </rng>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <filesystem supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='driverType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>path</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>handle</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>virtiofs</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </filesystem>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <tpm supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-tis</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tpm-crb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emulator</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>external</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendVersion'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>2.0</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </tpm>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <redirdev supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='bus'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>usb</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </redirdev>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <channel supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </channel>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <crypto supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendModel'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>builtin</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </crypto>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <interface supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='backendType'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>default</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>passt</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </interface>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <panic supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='model'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>isa</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>hyperv</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </panic>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <console supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='type'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>null</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vc</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pty</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dev</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>file</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>pipe</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stdio</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>udp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tcp</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>unix</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>qemu-vdagent</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>dbus</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </console>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </devices>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   <features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <gic supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <genid supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <backup supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <async-teardown supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <ps2 supported='yes'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sev supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <sgx supported='no'/>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <hyperv supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='features'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>relaxed</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vapic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>spinlocks</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vpindex</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>runtime</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>synic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>stimer</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reset</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>vendor_id</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>frequencies</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>reenlightenment</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tlbflush</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>ipi</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>avic</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>emsr_bitmap</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>xmm_input</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </defaults>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </hyperv>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     <launchSecurity supported='yes'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       <enum name='sectype'>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:         <value>tdx</value>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:       </enum>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:     </launchSecurity>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:   </features>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: </domainCapabilities>
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.544 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.544 228337 INFO nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Secure Boot support detected
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.546 228337 INFO nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.547 228337 INFO nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.559 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.604 228337 INFO nova.virt.node [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.621 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Verified node 35fead26-0bad-4950-b646-987079d58a17 matches my host np0005538513.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.651 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.656 228337 DEBUG nova.virt.libvirt.vif [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005538513.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T08:33:07Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.656 228337 DEBUG nova.network.os_vif_util [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.657 228337 DEBUG nova.network.os_vif_util [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.658 228337 DEBUG os_vif [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.701 228337 DEBUG ovsdbapp.backend.ovs_idl [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.701 228337 DEBUG ovsdbapp.backend.ovs_idl [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.701 228337 DEBUG ovsdbapp.backend.ovs_idl [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.702 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.702 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.702 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.703 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.704 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.706 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.722 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.722 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.723 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:32:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:46.724 228337 INFO oslo.privsep.daemon [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpyle7c7tk/privsep.sock']
Nov 28 09:32:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29574 DF PROTO=TCP SPT=59928 DPT=9882 SEQ=3280948079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D84820000000001030307) 
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.261 228337 INFO oslo.privsep.daemon [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.171 228411 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.174 228411 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.176 228411 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.176 228411 INFO oslo.privsep.daemon [-] privsep daemon running as pid 228411
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.533 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.534 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.534 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.536 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.536 228337 INFO os_vif [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.537 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.540 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.541 228337 INFO nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.603 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.604 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.604 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.604 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:32:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:47.605 228337 DEBUG oslo_concurrency.processutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:47 np0005538513.localdomain sudo[228435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:32:47 np0005538513.localdomain sudo[228435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:32:47 np0005538513.localdomain sudo[228435]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:47 np0005538513.localdomain sudo[228469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:32:47 np0005538513.localdomain sudo[228469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.073 228337 DEBUG oslo_concurrency.processutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.130 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.131 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.286 228337 WARNING nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.287 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12887MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.287 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.287 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.440 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.441 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.441 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.493 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.513 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.514 228337 DEBUG nova.compute.provider_tree [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.536 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:32:48 np0005538513.localdomain sudo[228469]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.560 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_F16C,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.606 228337 DEBUG oslo_concurrency.processutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:48.705 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:48 np0005538513.localdomain sudo[228614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbrlnkoxhqgxkdinwxrqbsmvyufvevuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322367.9116304-4315-265480238478439/AnsiballZ_podman_container.py
Nov 28 09:32:48 np0005538513.localdomain sudo[228614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.056 228337 DEBUG oslo_concurrency.processutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.060 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.061 228337 INFO nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] kernel doesn't support AMD SEV
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.061 228337 DEBUG nova.compute.provider_tree [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.062 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.076 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.093 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.093 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.093 228337 DEBUG nova.service [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 28 09:32:49 np0005538513.localdomain python3.9[228616]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.112 228337 DEBUG nova.service [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 28 09:32:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:49.112 228337 DEBUG nova.servicegroup.drivers.db [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] DB_Driver: join new ServiceGroup member np0005538513.localdomain to the compute group, service = <Service: host=np0005538513.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 28 09:32:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58028 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D8C020000000001030307) 
Nov 28 09:32:49 np0005538513.localdomain sudo[228621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:32:49 np0005538513.localdomain sudo[228621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:32:49 np0005538513.localdomain sudo[228621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:49 np0005538513.localdomain systemd[1]: Started libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope.
Nov 28 09:32:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:32:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:49 np0005538513.localdomain podman[228661]: 2025-11-28 09:32:49.334167235 +0000 UTC m=+0.129887731 container init f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:32:49 np0005538513.localdomain podman[228661]: 2025-11-28 09:32:49.343632995 +0000 UTC m=+0.139353501 container start f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:32:49 np0005538513.localdomain python3.9[228616]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Applying nova statedir ownership
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 already 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/console.log
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/469bc4441baff9216df986857f9ff45dbf25965a8d2f755a6449ac2645cb7191
Nov 28 09:32:49 np0005538513.localdomain nova_compute_init[228682]: INFO:nova_statedir:Nova statedir ownership complete
Nov 28 09:32:49 np0005538513.localdomain systemd[1]: libpod-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully.
Nov 28 09:32:49 np0005538513.localdomain podman[228697]: 2025-11-28 09:32:49.490559097 +0000 UTC m=+0.068631909 container died f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:32:49 np0005538513.localdomain sudo[228614]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:49 np0005538513.localdomain podman[228697]: 2025-11-28 09:32:49.51965475 +0000 UTC m=+0.097727522 container cleanup f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:32:49 np0005538513.localdomain systemd[1]: libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully.
Nov 28 09:32:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a-merged.mount: Deactivated successfully.
Nov 28 09:32:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967-userdata-shm.mount: Deactivated successfully.
Nov 28 09:32:50 np0005538513.localdomain sshd[206699]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:32:50 np0005538513.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Nov 28 09:32:50 np0005538513.localdomain systemd[1]: session-53.scope: Consumed 2min 11.205s CPU time.
Nov 28 09:32:50 np0005538513.localdomain systemd-logind[764]: Session 53 logged out. Waiting for processes to exit.
Nov 28 09:32:50 np0005538513.localdomain systemd-logind[764]: Removed session 53.
Nov 28 09:32:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:32:50.811 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:32:50.812 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:32:50.813 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58029 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D94020000000001030307) 
Nov 28 09:32:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:51.706 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:52.114 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:32:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:52.135 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 09:32:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:52.136 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:52.137 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:52.137 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:32:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:52.201 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:53.706 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58030 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DA3C20000000001030307) 
Nov 28 09:32:56 np0005538513.localdomain sshd[228737]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:32:56 np0005538513.localdomain sshd[228737]: Accepted publickey for zuul from 192.168.122.30 port 37702 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:32:56 np0005538513.localdomain systemd-logind[764]: New session 55 of user zuul.
Nov 28 09:32:56 np0005538513.localdomain systemd[1]: Started Session 55 of User zuul.
Nov 28 09:32:56 np0005538513.localdomain sshd[228737]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:32:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:56.708 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:57 np0005538513.localdomain python3.9[228848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:32:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:32:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:32:57 np0005538513.localdomain podman[228854]: 2025-11-28 09:32:57.842304984 +0000 UTC m=+0.077131028 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:32:57 np0005538513.localdomain podman[228854]: 2025-11-28 09:32:57.847905361 +0000 UTC m=+0.082731415 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 09:32:57 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:32:57 np0005538513.localdomain systemd[1]: tmp-crun.LunVHZ.mount: Deactivated successfully.
Nov 28 09:32:57 np0005538513.localdomain podman[228853]: 2025-11-28 09:32:57.899763067 +0000 UTC m=+0.134160628 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:32:57 np0005538513.localdomain podman[228853]: 2025-11-28 09:32:57.962472426 +0000 UTC m=+0.196869977 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:32:57 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:32:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28993 DF PROTO=TCP SPT=56164 DPT=9101 SEQ=2028527426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DAEEA0000000001030307) 
Nov 28 09:32:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:32:58.741 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:32:58 np0005538513.localdomain sudo[229004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqbtogzwkeoggwxxxjucmasizwqxzpmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322378.2789774-68-50854194740764/AnsiballZ_systemd_service.py
Nov 28 09:32:58 np0005538513.localdomain sudo[229004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:59 np0005538513.localdomain python3.9[229006]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:32:59 np0005538513.localdomain systemd-sysv-generator[229033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:32:59 np0005538513.localdomain systemd-rc-local-generator[229028]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538513.localdomain sudo[229004]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:00 np0005538513.localdomain python3.9[229150]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:33:00 np0005538513.localdomain network[229167]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:33:00 np0005538513.localdomain network[229168]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:33:00 np0005538513.localdomain network[229169]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:33:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28995 DF PROTO=TCP SPT=56164 DPT=9101 SEQ=2028527426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DBB020000000001030307) 
Nov 28 09:33:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:01.710 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:02 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58031 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DC3820000000001030307) 
Nov 28 09:33:03 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:03.744 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65005 DF PROTO=TCP SPT=60840 DPT=9100 SEQ=2329009262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DCF820000000001030307) 
Nov 28 09:33:06 np0005538513.localdomain sudo[229402]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhxpwtedgsdrjsirxggeihmstgrmwxju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322386.345958-125-143781470040944/AnsiballZ_systemd_service.py
Nov 28 09:33:06 np0005538513.localdomain sudo[229402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:06.713 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:06 np0005538513.localdomain python3.9[229404]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:33:07 np0005538513.localdomain sudo[229402]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:07 np0005538513.localdomain sudo[229513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdjxopnmzwqxjlvnvcybihobocthfput ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322387.5417829-155-263896666363632/AnsiballZ_file.py
Nov 28 09:33:07 np0005538513.localdomain sudo[229513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:08 np0005538513.localdomain python3.9[229515]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:08 np0005538513.localdomain sudo[229513]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:08 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Nov 28 09:33:08 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:33:08 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:33:08 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:33:08 np0005538513.localdomain sudo[229624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjynghocaxzdjmalzkwpqnpbowmcfujj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322388.4254782-179-55447886674093/AnsiballZ_file.py
Nov 28 09:33:08 np0005538513.localdomain sudo[229624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:08 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:08.774 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:08 np0005538513.localdomain python3.9[229626]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:08 np0005538513.localdomain sudo[229624]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45682 DF PROTO=TCP SPT=34304 DPT=9102 SEQ=594967161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DDB820000000001030307) 
Nov 28 09:33:09 np0005538513.localdomain sudo[229734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stedkmuaqrdzqhntqpciqrbjiwgayctb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322389.3932881-206-275783045216813/AnsiballZ_command.py
Nov 28 09:33:09 np0005538513.localdomain sudo[229734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:10 np0005538513.localdomain python3.9[229736]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:33:10 np0005538513.localdomain sudo[229734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:10 np0005538513.localdomain python3.9[229846]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:33:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:11.715 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:12 np0005538513.localdomain sudo[229954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhlgejpwnihjpzfrvhnrwywwfuxwnqyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322391.7777998-260-120963487324391/AnsiballZ_systemd_service.py
Nov 28 09:33:12 np0005538513.localdomain sudo[229954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:12 np0005538513.localdomain python3.9[229956]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:33:12 np0005538513.localdomain systemd-rc-local-generator[229978]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:12 np0005538513.localdomain systemd-sysv-generator[229986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65007 DF PROTO=TCP SPT=60840 DPT=9100 SEQ=2329009262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DE7420000000001030307) 
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538513.localdomain sudo[229954]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:13 np0005538513.localdomain sudo[230100]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdztqqyqescbqeiwalnnvoorxdsmtltl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322392.8749275-284-198907667688161/AnsiballZ_command.py
Nov 28 09:33:13 np0005538513.localdomain sudo[230100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:13 np0005538513.localdomain python3.9[230102]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:33:13 np0005538513.localdomain sudo[230100]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:13 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:13.776 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:33:14 np0005538513.localdomain podman[230159]: 2025-11-28 09:33:14.856363098 +0000 UTC m=+0.086087460 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:33:14 np0005538513.localdomain podman[230159]: 2025-11-28 09:33:14.873368823 +0000 UTC m=+0.103093175 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:33:14 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:33:14 np0005538513.localdomain sudo[230231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wefzbnbicsigeoqghjxabvevtpyibufx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322394.7072282-311-5115348923379/AnsiballZ_file.py
Nov 28 09:33:14 np0005538513.localdomain sudo[230231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:15 np0005538513.localdomain python3.9[230233]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:15 np0005538513.localdomain sudo[230231]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:15 np0005538513.localdomain python3.9[230341]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:16.717 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:16 np0005538513.localdomain python3.9[230451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46252 DF PROTO=TCP SPT=52650 DPT=9882 SEQ=4185442292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DF9C20000000001030307) 
Nov 28 09:33:17 np0005538513.localdomain python3.9[230537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322396.2342167-359-263547322514989/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=8310b0590be84763ce46965bb976fd9ca6a7668a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:18 np0005538513.localdomain sudo[230645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsuvmepktqkpgwdbjsempzijgcbamqpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322397.6002119-404-139971587325580/AnsiballZ_group.py
Nov 28 09:33:18 np0005538513.localdomain sudo[230645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:18 np0005538513.localdomain python3.9[230647]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 28 09:33:18 np0005538513.localdomain sudo[230645]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:18 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:18.802 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17290 DF PROTO=TCP SPT=54360 DPT=9105 SEQ=937547911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E01420000000001030307) 
Nov 28 09:33:19 np0005538513.localdomain sudo[230755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdfdfjcnamcukhaqfwlpbzhmrxvnteie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322398.7748516-437-95968122592880/AnsiballZ_getent.py
Nov 28 09:33:19 np0005538513.localdomain sudo[230755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:19 np0005538513.localdomain python3.9[230757]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 28 09:33:19 np0005538513.localdomain sudo[230755]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:19 np0005538513.localdomain sudo[230866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gycsguoezrlkknfheyoocqlujwqsbszr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322399.6585007-461-4344038193578/AnsiballZ_group.py
Nov 28 09:33:19 np0005538513.localdomain sudo[230866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:20 np0005538513.localdomain python3.9[230868]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:33:20 np0005538513.localdomain groupadd[230869]: group added to /etc/group: name=ceilometer, GID=42405
Nov 28 09:33:20 np0005538513.localdomain groupadd[230869]: group added to /etc/gshadow: name=ceilometer
Nov 28 09:33:20 np0005538513.localdomain groupadd[230869]: new group: name=ceilometer, GID=42405
Nov 28 09:33:20 np0005538513.localdomain sudo[230866]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:20 np0005538513.localdomain sudo[230982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lorzroxqyqvgltgbgllogrycjapfzaia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322400.4579377-485-116966825420401/AnsiballZ_user.py
Nov 28 09:33:20 np0005538513.localdomain sudo[230982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:21 np0005538513.localdomain python3.9[230984]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 09:33:21 np0005538513.localdomain useradd[230986]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Nov 28 09:33:21 np0005538513.localdomain useradd[230986]: add 'ceilometer' to group 'libvirt'
Nov 28 09:33:21 np0005538513.localdomain useradd[230986]: add 'ceilometer' to shadow group 'libvirt'
Nov 28 09:33:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17291 DF PROTO=TCP SPT=54360 DPT=9105 SEQ=937547911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E09420000000001030307) 
Nov 28 09:33:21 np0005538513.localdomain sudo[230982]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:21.720 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:22 np0005538513.localdomain python3.9[231100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:23 np0005538513.localdomain python3.9[231186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322402.117302-563-252611136264212/.source.conf _original_basename=ceilometer.conf follow=False checksum=e4f5a0d8a335534158f72dc0bd2ff76fd1e29e2d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:23 np0005538513.localdomain python3.9[231294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:23 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:23.802 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:24 np0005538513.localdomain python3.9[231380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322403.1893637-563-89601618094617/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:25 np0005538513.localdomain python3.9[231488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17292 DF PROTO=TCP SPT=54360 DPT=9105 SEQ=937547911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E19020000000001030307) 
Nov 28 09:33:25 np0005538513.localdomain python3.9[231574]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322404.7822988-563-39065417668188/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:26.721 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:27 np0005538513.localdomain python3.9[231682]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:27 np0005538513.localdomain python3.9[231790]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29020 DF PROTO=TCP SPT=55544 DPT=9101 SEQ=474957933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E24190000000001030307) 
Nov 28 09:33:28 np0005538513.localdomain python3.9[231898]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:33:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:33:28 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:28.805 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:28 np0005538513.localdomain systemd[1]: tmp-crun.yo6INa.mount: Deactivated successfully.
Nov 28 09:33:28 np0005538513.localdomain podman[231986]: 2025-11-28 09:33:28.856820617 +0000 UTC m=+0.087540986 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:33:28 np0005538513.localdomain python3.9[231984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322407.9481335-740-232584814813974/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:28 np0005538513.localdomain systemd[1]: tmp-crun.znjtNQ.mount: Deactivated successfully.
Nov 28 09:33:28 np0005538513.localdomain podman[231985]: 2025-11-28 09:33:28.89870101 +0000 UTC m=+0.132985613 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:33:28 np0005538513.localdomain podman[231986]: 2025-11-28 09:33:28.923807244 +0000 UTC m=+0.154527633 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:33:28 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:33:28 np0005538513.localdomain podman[231985]: 2025-11-28 09:33:28.959682524 +0000 UTC m=+0.193967117 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller)
Nov 28 09:33:28 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:33:29 np0005538513.localdomain python3.9[232134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:29 np0005538513.localdomain python3.9[232189]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:30 np0005538513.localdomain python3.9[232297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:30 np0005538513.localdomain python3.9[232383]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322410.011794-740-228076457598327/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29022 DF PROTO=TCP SPT=55544 DPT=9101 SEQ=474957933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E30020000000001030307) 
Nov 28 09:33:31 np0005538513.localdomain python3.9[232491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:31 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:31.723 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:32 np0005538513.localdomain python3.9[232577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322411.097895-740-139062728913990/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:32 np0005538513.localdomain python3.9[232685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:33 np0005538513.localdomain python3.9[232771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322412.237855-740-54978732213164/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38450 DF PROTO=TCP SPT=55732 DPT=9102 SEQ=3705141826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E39020000000001030307) 
Nov 28 09:33:33 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:33.810 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:33 np0005538513.localdomain python3.9[232879]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:34 np0005538513.localdomain python3.9[232965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322413.362747-740-149523037075264/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:35 np0005538513.localdomain python3.9[233073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:35 np0005538513.localdomain python3.9[233159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322414.5653374-740-189288447202918/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:36 np0005538513.localdomain python3.9[233267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44169 DF PROTO=TCP SPT=58168 DPT=9100 SEQ=32968573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E44820000000001030307) 
Nov 28 09:33:36 np0005538513.localdomain python3.9[233353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322415.6777573-740-233639674667536/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:36 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:36.725 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:37 np0005538513.localdomain python3.9[233461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:38 np0005538513.localdomain python3.9[233547]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322416.7693648-740-48363449612220/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:38 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:38.842 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:38 np0005538513.localdomain python3.9[233655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41361 DF PROTO=TCP SPT=57320 DPT=9100 SEQ=1150725942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E4F820000000001030307) 
Nov 28 09:33:39 np0005538513.localdomain python3.9[233741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322418.5314248-740-19092222534056/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:40 np0005538513.localdomain python3.9[233849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:41 np0005538513.localdomain python3.9[233935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322420.1067162-740-32014311779095/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:41 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:41.727 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44171 DF PROTO=TCP SPT=58168 DPT=9100 SEQ=32968573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E5C420000000001030307) 
Nov 28 09:33:42 np0005538513.localdomain sudo[234043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtudunmhmpujjwfjnkwcxgpxobcdtmfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322422.2509456-1205-80822468695964/AnsiballZ_file.py
Nov 28 09:33:42 np0005538513.localdomain sudo[234043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:42 np0005538513.localdomain python3.9[234045]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:42 np0005538513.localdomain sudo[234043]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:43 np0005538513.localdomain sudo[234153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fygwsconjvjsmliqnygmjtjcmyskqeji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322423.0030057-1229-464393077198/AnsiballZ_systemd_service.py
Nov 28 09:33:43 np0005538513.localdomain sudo[234153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:43 np0005538513.localdomain python3.9[234155]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:33:43 np0005538513.localdomain systemd-rc-local-generator[234182]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:43 np0005538513.localdomain systemd-sysv-generator[234188]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:43.842 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:43 np0005538513.localdomain systemd[1]: Listening on Podman API Socket.
Nov 28 09:33:43 np0005538513.localdomain sudo[234153]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:44 np0005538513.localdomain sudo[234303]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnxhwvzrieulfhkjcqrzafzinjwmfrvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/AnsiballZ_stat.py
Nov 28 09:33:44 np0005538513.localdomain sudo[234303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:44 np0005538513.localdomain python3.9[234305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:44 np0005538513.localdomain sudo[234303]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:45 np0005538513.localdomain sudo[234391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nherxwhjwjzamybpyrllpxylsjltaxvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/AnsiballZ_copy.py
Nov 28 09:33:45 np0005538513.localdomain sudo[234391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:33:45 np0005538513.localdomain systemd[1]: tmp-crun.CFaR2m.mount: Deactivated successfully.
Nov 28 09:33:45 np0005538513.localdomain podman[234394]: 2025-11-28 09:33:45.270148116 +0000 UTC m=+0.098383454 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:33:45 np0005538513.localdomain podman[234394]: 2025-11-28 09:33:45.281482959 +0000 UTC m=+0.109718347 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:33:45 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:33:45 np0005538513.localdomain python3.9[234393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:45 np0005538513.localdomain sudo[234391]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:45 np0005538513.localdomain sudo[234467]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmqtjoamymmydcjlfqgsucxjdobzyavi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/AnsiballZ_stat.py
Nov 28 09:33:45 np0005538513.localdomain sudo[234467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:45.777 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:45.778 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:45.778 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:33:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:45.778 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:33:45 np0005538513.localdomain python3.9[234469]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:45 np0005538513.localdomain sudo[234467]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:46 np0005538513.localdomain sudo[234555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyvdrbmpodbakmiomghcqyzxvavvchmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/AnsiballZ_copy.py
Nov 28 09:33:46 np0005538513.localdomain sudo[234555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:46 np0005538513.localdomain python3.9[234557]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:46 np0005538513.localdomain sudo[234555]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:46.730 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65128 DF PROTO=TCP SPT=58924 DPT=9882 SEQ=2916956063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E6EC20000000001030307) 
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.232 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.232 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.232 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.233 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:33:47 np0005538513.localdomain sudo[234665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikpbldvlwukabtzluztijkwipmvucczu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322427.0137582-1340-248998740481223/AnsiballZ_container_config_data.py
Nov 28 09:33:47 np0005538513.localdomain sudo[234665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:47 np0005538513.localdomain python3.9[234667]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 28 09:33:47 np0005538513.localdomain sudo[234665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.722 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.744 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.744 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.745 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.745 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.746 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.746 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.747 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.747 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.748 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.748 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.765 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.765 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.766 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.766 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:33:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:47.767 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.210 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.272 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.273 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:33:48 np0005538513.localdomain sudo[234797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlimlcasvxupxvwngsioduufqjkhsbve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322427.9900374-1367-223702988896174/AnsiballZ_container_config_hash.py
Nov 28 09:33:48 np0005538513.localdomain sudo[234797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.496 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.498 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12917MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.498 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.499 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.588 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.589 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.589 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:33:48 np0005538513.localdomain python3.9[234799]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.657 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:33:48 np0005538513.localdomain sudo[234797]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:48.871 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22867 DF PROTO=TCP SPT=36458 DPT=9105 SEQ=3545525059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E76830000000001030307) 
Nov 28 09:33:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:49.158 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:33:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:49.165 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:33:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:49.211 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:33:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:49.213 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:33:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:49.214 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:33:49 np0005538513.localdomain sudo[234877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:33:49 np0005538513.localdomain sudo[234877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:49 np0005538513.localdomain sudo[234877]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:49 np0005538513.localdomain sudo[234902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:33:49 np0005538513.localdomain sudo[234902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:50 np0005538513.localdomain podman[235006]: 2025-11-28 09:33:50.324350354 +0000 UTC m=+0.094223730 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.expose-services=, release=553, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:33:50 np0005538513.localdomain sudo[235050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrxcjpsucihlphpsbeitblzochcadnql ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322429.1286147-1397-90856841241683/AnsiballZ_edpm_container_manage.py
Nov 28 09:33:50 np0005538513.localdomain sudo[235050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:50 np0005538513.localdomain podman[235006]: 2025-11-28 09:33:50.449907988 +0000 UTC m=+0.219781414 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, name=rhceph)
Nov 28 09:33:50 np0005538513.localdomain python3[235057]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:33:50 np0005538513.localdomain sudo[234902]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:50 np0005538513.localdomain sudo[235130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:33:50 np0005538513.localdomain sudo[235130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:50 np0005538513.localdomain sudo[235130]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:33:50.812 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:33:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:33:50.813 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:33:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:33:50.814 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:33:50 np0005538513.localdomain python3[235057]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f",
                                                                    "Digest": "sha256:ba8d4a4e89620dec751cb5de5631f858557101d862972a8e817b82e4e10180a1",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:ba8d4a4e89620dec751cb5de5631f858557101d862972a8e817b82e4e10180a1"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:26:47.510377458Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 505178369,
                                                                    "VirtualSize": 505178369,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:df29e1f065b3ca62a976bd39a05f70336eee2ae6be8f0f1548e8c749ab2e29f2",
                                                                              "sha256:23884b48504b714fa8c89fa23b204d39c39cc69fece546e604d8bd0566e4fb11"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:17:11.648903438Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:17:14.841832772Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:18:00.567980594Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:18:03.88569442Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:26:11.053013113Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:26:47.509622089Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:26:54.939484291Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 09:33:50 np0005538513.localdomain sudo[235162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:33:50 np0005538513.localdomain sudo[235162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:50 np0005538513.localdomain podman[235190]: 2025-11-28 09:33:50.909763006 +0000 UTC m=+0.078621421 container remove 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Nov 28 09:33:50 np0005538513.localdomain python3[235057]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Nov 28 09:33:51 np0005538513.localdomain podman[235205]: 
Nov 28 09:33:51 np0005538513.localdomain podman[235205]: 2025-11-28 09:33:51.016709674 +0000 UTC m=+0.089129438 container create 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:33:51 np0005538513.localdomain podman[235205]: 2025-11-28 09:33:50.973090466 +0000 UTC m=+0.045510270 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 09:33:51 np0005538513.localdomain python3[235057]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 28 09:33:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22868 DF PROTO=TCP SPT=36458 DPT=9105 SEQ=3545525059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E7E820000000001030307) 
Nov 28 09:33:51 np0005538513.localdomain sudo[235050]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:51 np0005538513.localdomain sudo[235162]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:51 np0005538513.localdomain sudo[235382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajcjuipqgadsxitydozwzrclmimodzlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322431.4717407-1421-214451331266146/AnsiballZ_stat.py
Nov 28 09:33:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:51.731 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:51 np0005538513.localdomain sudo[235382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:51 np0005538513.localdomain python3.9[235384]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:51 np0005538513.localdomain sudo[235382]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:52 np0005538513.localdomain sudo[235387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:33:52 np0005538513.localdomain sudo[235387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:52 np0005538513.localdomain sudo[235387]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:52 np0005538513.localdomain sudo[235512]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkslhizvcymrjrzolgqjnoejwynbegse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322432.4964077-1448-140932083566542/AnsiballZ_file.py
Nov 28 09:33:52 np0005538513.localdomain sudo[235512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:52 np0005538513.localdomain python3.9[235514]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:52 np0005538513.localdomain sudo[235512]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:53 np0005538513.localdomain sudo[235621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxsosmpdjzqgjbatihwdosrnpqrzsfpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322433.0439098-1448-231923254777113/AnsiballZ_copy.py
Nov 28 09:33:53 np0005538513.localdomain sudo[235621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:53 np0005538513.localdomain python3.9[235623]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322433.0439098-1448-231923254777113/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:53 np0005538513.localdomain sudo[235621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:53.874 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:54 np0005538513.localdomain sudo[235676]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blobjfqsjrowywdwmuhgsxiwolaxiipz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322433.0439098-1448-231923254777113/AnsiballZ_systemd.py
Nov 28 09:33:54 np0005538513.localdomain sudo[235676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:54 np0005538513.localdomain python3.9[235678]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:33:54 np0005538513.localdomain systemd-sysv-generator[235705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:54 np0005538513.localdomain systemd-rc-local-generator[235700]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538513.localdomain sudo[235676]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:55 np0005538513.localdomain sudo[235767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wukitocfaoybvksrnfqtafoiqonhrrht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322433.0439098-1448-231923254777113/AnsiballZ_systemd.py
Nov 28 09:33:55 np0005538513.localdomain sudo[235767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22869 DF PROTO=TCP SPT=36458 DPT=9105 SEQ=3545525059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E8E420000000001030307) 
Nov 28 09:33:55 np0005538513.localdomain python3.9[235769]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:33:55 np0005538513.localdomain systemd-sysv-generator[235799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:55 np0005538513.localdomain systemd-rc-local-generator[235795]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 09:33:55 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:33:55 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:55 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:33:56 np0005538513.localdomain podman[235810]: 2025-11-28 09:33:56.011109005 +0000 UTC m=+0.138443577 container init 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true)
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + sudo -E kolla_set_configs
Nov 28 09:33:56 np0005538513.localdomain sudo[235830]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:33:56 np0005538513.localdomain sudo[235830]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:56 np0005538513.localdomain sudo[235830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:33:56 np0005538513.localdomain podman[235810]: 2025-11-28 09:33:56.048242876 +0000 UTC m=+0.175577398 container start 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:33:56 np0005538513.localdomain podman[235810]: ceilometer_agent_compute
Nov 28 09:33:56 np0005538513.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 28 09:33:56 np0005538513.localdomain sudo[235767]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Validating config file
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Copying service configuration files
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: INFO:__main__:Writing out command to execute
Nov 28 09:33:56 np0005538513.localdomain sudo[235830]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: ++ cat /run_command
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + ARGS=
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + sudo kolla_copy_cacerts
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:56 np0005538513.localdomain sudo[235850]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:33:56 np0005538513.localdomain sudo[235850]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:56 np0005538513.localdomain sudo[235850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:56 np0005538513.localdomain sudo[235850]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + [[ ! -n '' ]]
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + . kolla_extend_start
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + umask 0022
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 09:33:56 np0005538513.localdomain podman[235833]: 2025-11-28 09:33:56.137156175 +0000 UTC m=+0.083806357 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:33:56 np0005538513.localdomain podman[235833]: 2025-11-28 09:33:56.166240598 +0000 UTC m=+0.112890760 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:33:56 np0005538513.localdomain podman[235833]: unhealthy
Nov 28 09:33:56 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:33:56 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'.
Nov 28 09:33:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:56.732 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.885 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.886 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 09:33:56 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.887 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.001 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.122 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.130 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.522 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}51229a5bf42cf6af691520c2ac7386a36cdfa0a076b91349c2da91076b3ef699" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.629 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Fri, 28 Nov 2025 09:33:57 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d6465910-457e-42dd-8a7d-8ec985d08cb3 x-openstack-request-id: req-d6465910-457e-42dd-8a7d-8ec985d08cb3 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.630 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.630 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-d6465910-457e-42dd-8a7d-8ec985d08cb3 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.631 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}51229a5bf42cf6af691520c2ac7386a36cdfa0a076b91349c2da91076b3ef699" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.671 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Fri, 28 Nov 2025 09:33:57 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-3120e210-08ea-40ea-962d-66699a6a63f6 x-openstack-request-id: req-3120e210-08ea-40ea-962d-66699a6a63f6 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.672 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.672 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 used request id req-3120e210-08ea-40ea-962d-66699a6a63f6 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.673 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.678 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 / tap09612b07-51 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa27bbbc-3702-4283-be90-43f12cc30d06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.674522', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d2d1f32-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '7ab1d7c56ec27be02c50ca8ddca319124a9c75cc3e6bd0ddc88d8c3e44a4cf66'}]}, 'timestamp': '2025-11-28 09:33:57.679916', '_unique_id': '9b213e18a80c4f6ba31c632a2183756b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.691 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64ac717b-6c3e-4b31-87bb-e04939ce0b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.691092', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d2eeef2-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '039f431660e40919d057bbd5ce54aea90f8f7ebddbabe2778454cbe6c4cfafd7'}]}, 'timestamp': '2025-11-28 09:33:57.691639', '_unique_id': '67be56d627dc447a8d4e59649febf04c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.694 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82e137d5-cea1-4ca4-abf6-901803e30015', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.694206', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d2f67a6-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '23e3ce63e48bba963d778646f39450fab789bca0dba3a1eb8019e25bd84461f8'}]}, 'timestamp': '2025-11-28 09:33:57.694717', '_unique_id': '31f3880f174d457fba64e8a6c8eb92cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.697 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.697 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4b76efa-8ec2-47fd-93d2-43e152ec5cd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.698626', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d301444-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'dfdd8936e406ed3b1d47ab8a3bf9fe02eaf0e00768e8e812d2f8453bc356de6b'}]}, 'timestamp': '2025-11-28 09:33:57.699186', '_unique_id': '3c48782524d940238a3b6934074dc8ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f43a430-2f07-41b6-8f15-789882feebea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.702000', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d344e06-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '348e897946e5927a0cd01498c219838b4797e143ae34db28edb169f3afacc76c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.702000', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d345f22-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'bddc70edf9a5c7db83d9df1d6bcff7b64c8c2603a08d5ca5e48e813499742486'}]}, 'timestamp': '2025-11-28 09:33:57.727255', '_unique_id': '598918ff44074ef984d841ec96ba6b6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.729 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1480a42-da39-4fe4-83e1-56cec27dca1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.729615', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d34cea8-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'f71c15a82d308abe09888198d098f3030abd5fa4ed829398ad867aa4e320fbec'}]}, 'timestamp': '2025-11-28 09:33:57.730131', '_unique_id': '8d6166cddc2244928c84494f11f4f973'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.732 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.732 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.732 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e13ee981-b175-4e47-b54b-954f09d15e04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.732216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d35332a-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '648f7c62161d64ff3cbbb4ace8e768edd853b6c807bad9ae53e8ce13738e92bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.732216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d354356-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'efd7b0d671fd6aa905d09b2251b8b346b00cbf5626cd90894f9dfe66e3efe94f'}]}, 'timestamp': '2025-11-28 09:33:57.733082', '_unique_id': 'e84a16f8206c413895a58ea0e93a64b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.735 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.735 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f567f27-0a64-435d-a548-f1a2afe91747', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.735340', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d35adf0-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'eca2aff805cd3dbbd6bce6ef12a9d885d00105146bb3f2c0ad890fcf6e9c865a'}]}, 'timestamp': '2025-11-28 09:33:57.735819', '_unique_id': '41e9f8fb7ec14bbd92f3604e3bff4c17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.737 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 45990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6cd58f4-69af-4a31-853d-48298e4d0695', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45990000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:33:57.738044', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5d3861ee-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.924564451, 'message_signature': '323eaf640fd10b3030a0f5e847384e0308e23ee0c377185108eaeae8751b836a'}]}, 'timestamp': '2025-11-28 09:33:57.753560', '_unique_id': '70349386fc9a4dfcb6d7cce628d9d6fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '011ef4f7-5dd7-4cdc-b548-ab5202f406d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.756079', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3abbce-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': 'fd3e9b591c91df6d398a05058a52ace110f8541056734e92cf49afdd72a4973f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.756079', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3ace52-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '04e4c6bda24a386ef59cb4bda387a5ed9e14ad019f6fa885d1aa51281fe5ccf3'}]}, 'timestamp': '2025-11-28 09:33:57.769386', '_unique_id': '2748a374c91742a09c7351203a735374'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5722941-546d-4851-b7cd-9dd8bf8552e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.771575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3b3496-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '002316aa748c4b76fec7a81721e87b8cbb023a6ad23a7c7f789a2f21a71b08ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.771575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3b4774-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '8294cd4f89992dad67116654a499808c5096884678ce0afca8044d0c636a5486'}]}, 'timestamp': '2025-11-28 09:33:57.772475', '_unique_id': 'dbe6dba5ad284533afb63100f3c0b902'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '259e6840-de26-47b5-ae7d-dec912fd902a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.774726', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3bb042-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'afcb2a163c4190046096c59e3fb52da485afcdff6a6a9b0f27bcf48761d83f10'}]}, 'timestamp': '2025-11-28 09:33:57.775220', '_unique_id': '644d125839bc41cca9c80248b0feebaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '990c1ac2-6276-4972-aa34-f321ec57a52f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9175, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.777270', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3c1334-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '1fda7708a4000b17ab0cbcaa632b2097dea9a10a3b5796e064230ae6855cbacc'}]}, 'timestamp': '2025-11-28 09:33:57.777754', '_unique_id': '2ac2f465d5f54b0188ee81584a7dfa29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca2ec3c7-2c10-4203-bb2f-10c0d32fb21a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.779874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3c7a0e-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '821ba8cb104b068bea0482688c4a16dda1f993649c5153ad386bce845baa08cc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.779874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3c8a12-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '2a868eabb5b81636e94faf8e6223ba93a6820eafe7c64dd9fac1910dedb32f2c'}]}, 'timestamp': '2025-11-28 09:33:57.780734', '_unique_id': '994250b4e3174e1a84cd9ae224947e20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3040a6ab-0de7-4373-80a9-294bcd9fc883', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.782861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3ceec6-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'ff4c057d1fc000f59c85798c4fbcaebfa514330877fc2c88f2dd8c68a185a7f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.782861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3cfede-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'e86e351eeb3370e35875b8f8a19773d10f02ccb8997bc8bc64311f447dd25ec7'}]}, 'timestamp': '2025-11-28 09:33:57.783722', '_unique_id': 'fbf2d635a0e3458788df2a37cdc35e69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ca3d5cd-a85a-4728-bc7c-d09595b00104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.785972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3d68ba-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '1f8ed315533b19d2a0c31996d68a9b9886ae9a4a78c0f4140ca297567433e744'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.785972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3d78a0-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '087fa5b5e41fb6b3161650b56f1528152e9e3aef1b2f489b2065b3fbb5a6e768'}]}, 'timestamp': '2025-11-28 09:33:57.786843', '_unique_id': '8631371e7c3048b7a21ed80189fa98c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7b1a3cb-6388-4ef6-9f6e-e7553f73777a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.788968', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3dddae-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '9c29d11809e40cafff42df960341425cc8f364266e4daf324c54135a987d35da'}]}, 'timestamp': '2025-11-28 09:33:57.789459', '_unique_id': '0ad8f734eff44e549a019d2f15d65e1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.791 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f50b491-056f-4adf-aa90-e87e5217659a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:33:57.791601', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5d3e42f8-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.924564451, 'message_signature': '11a18da6ee90fd4913ada2df168b17a9d6eafa9681b97c33d6cfc96be8d99c7f'}]}, 'timestamp': '2025-11-28 09:33:57.792064', '_unique_id': '876f6e43611a44b586b357b0366a953a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5205426-083f-44f0-808a-6bb0a4710cfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.794977', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3ec8d6-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '4da5aa7a3ac30478a0598e050da35fc3b24d417a795f0469c332db793d8167a9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.794977', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3ed8e4-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '0a352a081040427bee9c54f3a2fbecc51105cdfb8c8a1db00d7a27b9387f24cd'}]}, 'timestamp': '2025-11-28 09:33:57.795861', '_unique_id': 'f1015bde5ddc4393a5238b8660372ce1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6af788c-0402-4281-a7c0-005de4f17bf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.798154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3f4310-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'dc1e3a40f73573aad0dddba42014aa84412c0e05be1d820173004c62603864d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.798154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3f52ec-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'd1d20ae784cd0c917636cb14b0ee9d48380cfa3b759e77a50b4b0364d7ec753b'}]}, 'timestamp': '2025-11-28 09:33:57.798980', '_unique_id': '3fde224552b14faaa8ebe60f935427fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb1d93fe-b332-4fc2-8076-bdf31e7a014f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.801145', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3fb840-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'f6f95317da1d8bb8fdcc3418e23b947937bc5188f613894c38d538a0c4543617'}]}, 'timestamp': '2025-11-28 09:33:57.801637', '_unique_id': 'da7daececfb64abe8684178de282d352'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:33:57 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:33:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34679 DF PROTO=TCP SPT=33768 DPT=9101 SEQ=2529987109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E99490000000001030307) 
Nov 28 09:33:58 np0005538513.localdomain sudo[235968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkpycjpbzvidrmqgfxmvvlhvnjhmryoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322438.1841383-1520-112403215947785/AnsiballZ_systemd.py
Nov 28 09:33:58 np0005538513.localdomain sudo[235968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:58 np0005538513.localdomain python3.9[235970]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:33:58 np0005538513.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Nov 28 09:33:58 np0005538513.localdomain systemd[1]: tmp-crun.EGdWK5.mount: Deactivated successfully.
Nov 28 09:33:58 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:58.908 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 28 09:33:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:33:58.907 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.009 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.009 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.009 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 28 09:33:59 np0005538513.localdomain virtqemud[201490]: End of file while reading data: Input/output error
Nov 28 09:33:59 np0005538513.localdomain virtqemud[201490]: End of file while reading data: Input/output error
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.017 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: libpod-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope: Deactivated successfully.
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: libpod-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope: Consumed 1.348s CPU time.
Nov 28 09:33:59 np0005538513.localdomain podman[235974]: 2025-11-28 09:33:59.156416178 +0000 UTC m=+0.339524103 container died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.timer: Deactivated successfully.
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:33:59 np0005538513.localdomain podman[235994]: 2025-11-28 09:33:59.254497921 +0000 UTC m=+0.081931707 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:33:59 np0005538513.localdomain podman[235995]: 2025-11-28 09:33:59.312723617 +0000 UTC m=+0.137552319 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 09:33:59 np0005538513.localdomain podman[235994]: 2025-11-28 09:33:59.321732396 +0000 UTC m=+0.149166182 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:33:59 np0005538513.localdomain podman[235974]: 2025-11-28 09:33:59.344062751 +0000 UTC m=+0.527170606 container cleanup 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:33:59 np0005538513.localdomain podman[235974]: ceilometer_agent_compute
Nov 28 09:33:59 np0005538513.localdomain podman[235995]: 2025-11-28 09:33:59.394069034 +0000 UTC m=+0.218897766 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:33:59 np0005538513.localdomain podman[236045]: 2025-11-28 09:33:59.442813306 +0000 UTC m=+0.074277301 container cleanup 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 09:33:59 np0005538513.localdomain podman[236045]: ceilometer_agent_compute
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:33:59 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:59 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:33:59 np0005538513.localdomain podman[236058]: 2025-11-28 09:33:59.607829954 +0000 UTC m=+0.133168678 container init 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + sudo -E kolla_set_configs
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:59 np0005538513.localdomain sudo[236078]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:33:59 np0005538513.localdomain sudo[236078]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:59 np0005538513.localdomain sudo[236078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:33:59 np0005538513.localdomain podman[236058]: 2025-11-28 09:33:59.649583583 +0000 UTC m=+0.174922267 container start 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 09:33:59 np0005538513.localdomain podman[236058]: ceilometer_agent_compute
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 28 09:33:59 np0005538513.localdomain sudo[235968]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Validating config file
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Copying service configuration files
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: INFO:__main__:Writing out command to execute
Nov 28 09:33:59 np0005538513.localdomain sudo[236078]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: ++ cat /run_command
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + ARGS=
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + sudo kolla_copy_cacerts
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:59 np0005538513.localdomain sudo[236095]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:33:59 np0005538513.localdomain sudo[236095]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:59 np0005538513.localdomain sudo[236095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:59 np0005538513.localdomain sudo[236095]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:59 np0005538513.localdomain podman[236081]: 2025-11-28 09:33:59.739420532 +0000 UTC m=+0.084298973 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + [[ ! -n '' ]]
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + . kolla_extend_start
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + umask 0022
Nov 28 09:33:59 np0005538513.localdomain ceilometer_agent_compute[236072]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 09:33:59 np0005538513.localdomain podman[236081]: 2025-11-28 09:33:59.774453915 +0000 UTC m=+0.119332376 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute)
Nov 28 09:33:59 np0005538513.localdomain podman[236081]: unhealthy
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:33:59 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'.
Nov 28 09:34:00 np0005538513.localdomain sudo[236210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlodixlsbjpeypnrncnqaltvcxkzvhch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322439.9206502-1544-122255528229007/AnsiballZ_stat.py
Nov 28 09:34:00 np0005538513.localdomain sudo[236210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:00 np0005538513.localdomain python3.9[236212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:34:00 np0005538513.localdomain sudo[236210]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.458 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.490 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.492 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.493 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.510 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.656 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 09:34:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.664 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:34:00 np0005538513.localdomain sudo[236304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goxuthbswaatdkhtlfrjezcouwhukfha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322439.9206502-1544-122255528229007/AnsiballZ_copy.py
Nov 28 09:34:00 np0005538513.localdomain sudo[236304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:00 np0005538513.localdomain python3.9[236306]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322439.9206502-1544-122255528229007/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:34:00 np0005538513.localdomain sudo[236304]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.044 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0c98bde1f38df54628b5d8e5f8133062b9971605ca5dd209589037cdd5565180" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 09:34:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34681 DF PROTO=TCP SPT=33768 DPT=9101 SEQ=2529987109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EA5620000000001030307) 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.161 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Fri, 28 Nov 2025 09:34:01 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-b154ef59-1b58-4072-b77e-39cd99b341b7 x-openstack-request-id: req-b154ef59-1b58-4072-b77e-39cd99b341b7 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.161 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.161 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-b154ef59-1b58-4072-b77e-39cd99b341b7 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.164 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0c98bde1f38df54628b5d8e5f8133062b9971605ca5dd209589037cdd5565180" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.182 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Fri, 28 Nov 2025 09:34:01 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-e628c67f-ce5d-4010-9447-dfea4d757f7c x-openstack-request-id: req-e628c67f-ce5d-4010-9447-dfea4d757f7c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.182 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.183 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 used request id req-e628c67f-ce5d-4010-9447-dfea4d757f7c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.184 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.190 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 / tap09612b07-51 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.190 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97f97acd-3b21-49d8-8b0a-800e2eaeb6ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.185254', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f44ff74-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': 'bb4d6d381bf9ed9b230eeda83f10ac5ad4841d80a91c777c331a307ee2ef9519'}]}, 'timestamp': '2025-11-28 09:34:01.191837', '_unique_id': '5f0d11ce69e54019a5c5232fd2eb0a4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.204 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.204 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12f5d177-2e0a-4d4e-9c0f-8219b23a40cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.205870', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f473f00-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '2c9dff60989d225464ee24f47b7b88bd7f11380563fb5b07a12bfc764c2642c6'}]}, 'timestamp': '2025-11-28 09:34:01.206444', '_unique_id': '4ab641480f234902aca9fed28d512a5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.208 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62147e78-804b-48ee-9aef-65c0db56f082', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.208813', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f47b11a-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '34489fd06d48c9ad3536959440c34da771a63857e8909de17505c04e5bb4dced'}]}, 'timestamp': '2025-11-28 09:34:01.209320', '_unique_id': 'a421786b399746e2a7c2ec3695e20f89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.247 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.248 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a06b837f-2db5-45fb-ae2e-4a5cf1e273b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.211452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f4daea8-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'c5d9f870dce25b0732e5f9db278626a08ad8614a0d7ab794485d718f438d21de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.211452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f4dc032-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'ea8d7d368945e9329c0b4ef9f06468dffefdcb27348eb5a87bdafa7b204aeb10'}]}, 'timestamp': '2025-11-28 09:34:01.248990', '_unique_id': '5c8b6164a6324f39a3d798ff91702c01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.251 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '439092f3-5af4-4aec-a3d1-7d60a587dd8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.251759', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f4e3ddc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '0019c1c37c7b9a76f9a123c4dcca0531c4ee386c9fd60b1df1c4ebae69383446'}]}, 'timestamp': '2025-11-28 09:34:01.252280', '_unique_id': '3f12acab79b3433bab1087999db5075c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.254 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.254 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.268 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.268 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02ffb186-0811-4c05-b84d-b3e2d56692af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.255157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f50bddc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': 'b04401fd082a2b97bdea872e526dd5fda4f02ab375e18b4e96b433d069f256af'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.255157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f50ce94-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '0876939eaca65ac62044a7c0aa2a45cbb55ccd70059cfd4764622e3aa715a3d9'}]}, 'timestamp': '2025-11-28 09:34:01.269042', '_unique_id': '7f2cf4af764f4790b41389876361a324'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.271 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8561dabb-fc3a-4a45-8921-e62c913dfc37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.271452', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f513eec-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '746647ea4408f2312003a4a03e306a45270a6c6e4467dd6e3ec9489c9870dc32'}]}, 'timestamp': '2025-11-28 09:34:01.271921', '_unique_id': '7d9a1263e48b4cb49dcc6dba1e88cf94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.273 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.294 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3fb917e-62a1-41ef-92a8-e74236e750a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:34:01.274097', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5f54ba0e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.46573624, 'message_signature': '515acd3ca0f6eff7eed76081c03a074ef189a6b84f101bb119f38429ffc7d03c'}]}, 'timestamp': '2025-11-28 09:34:01.294724', '_unique_id': 'b77ed978a2cf486c80d1f2016ac99a29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging 
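Diagnostic aside: every traceback above bottoms out in `ConnectionRefusedError: [Errno 111]` from `self.sock.connect(sa)` in amqp's transport layer, i.e. the TCP handshake to the RabbitMQ broker is actively refused before any AMQP traffic happens — the broker process is down or not listening on the configured interface. A minimal sketch of the same check, assuming the default AMQP port 5672 (the actual `transport_url` host/port is not shown in this log excerpt):

```python
import socket


def check_amqp_reachable(host: str, port: int = 5672, timeout: float = 3.0) -> int:
    """Attempt a plain TCP connect to the broker; return 0 on success,
    else the OS errno.

    errno 111 (ECONNREFUSED) reproduces exactly what the amqp transport
    hits in the tracebacks above: the host answered, but nothing is
    accepting connections on that port.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return 0
    except OSError as exc:
        return exc.errno if exc.errno is not None else -1
```

If this returns 111 for the broker host from `transport_url`, the next step is on the RabbitMQ side (service state, bind address, firewall), not in ceilometer or oslo.messaging, which are only surfacing the refused connect.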
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.296 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '398f5d2f-c6fe-4648-b5a6-bf208ea0023e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.296939', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f552476-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '52c24c47d26b5b9074e600942219ff3694752758defe775a666d4fd5660b3d99'}]}, 'timestamp': '2025-11-28 09:34:01.297456', '_unique_id': '707a31ca3a184ce88f75df824e030cd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.299 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.299 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.299 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.300 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.300 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6cb8d55-4708-4036-a3db-74a0bfad7faa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.300164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f55a054-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'd7cf23081c947dbb01f7256641e69196359a1bae2aa45b8c64070f5cf3899ee4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.300164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f55b08a-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '472849c8e4634f863a79fb47159d03c98fec5d834d8533d486676853225a481c'}]}, 'timestamp': '2025-11-28 09:34:01.301010', '_unique_id': '6bf17e2d7b31423c868bc85f77419b1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.303 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.303 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.303 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3e2f7b2-527d-473a-9ef4-a632aae5183c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.303174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5615c0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': 'ef512859a60e284dbfa439d3e855be2397fd7be50e2526983ae629c64d8f30cf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.303174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f5625b0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '86db59c0a2864ba55f5e238bcb85075d2bf8a24a2d69a41f5466906add2aa295'}]}, 'timestamp': '2025-11-28 09:34:01.304046', '_unique_id': '8a8ffefb342d459194fa71dca44dadb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.306 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.306 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fcc0c24-6d39-4358-830d-d2f1014c1158', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.306343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5691a8-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'eec2fa02999ddd8c5d7578914011f2fc92d5c6cbea84d2e35afc2f4b5e2dc6dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.306343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f56a1ac-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '2b051335e225bacff7033a5c561371158044e350d4f2b8ec2f9ffe6992392cde'}]}, 'timestamp': '2025-11-28 09:34:01.307221', '_unique_id': 'b8ee691e7ebf40ba92fb04a6e0ea80cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.309 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.309 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6738a910-1c2c-4a7b-a7d4-4660ced1c8f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.309400', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f570912-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': 'db97d6bf93cc1a089b143ecd224fd1c995f395c54ffc73baf370541ccb848d72'}]}, 'timestamp': '2025-11-28 09:34:01.309852', '_unique_id': 'e540467858b3407cbb2067cb6dd2b9c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.311 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.312 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.312 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7499f9-0343-463a-b3bf-50bc88468427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.312111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5772e4-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '3f70c2767c6642f8f7f0950bdd795817558de8bba4e172cb5bee5c7288c1cf4f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.312111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f5782c0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '75b6dcad3e42c418c224cd67c0e08b28da725b82ac8c26627ce14714de4a6246'}]}, 'timestamp': '2025-11-28 09:34:01.312939', '_unique_id': '8742d984ca8a47fca417fda4c4ad239a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.315 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ca72991-25de-4ba7-8d41-9004c4816da0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.315295', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f57f17e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '5ab0d1e334eb5cd694b043b0e9fbab8f4e030cf2fd093e6779346b002b15e0ce'}]}, 'timestamp': '2025-11-28 09:34:01.315811', '_unique_id': 'cad02125b26942359e11776a2d2faae0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.318 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eef74aea-dae6-464b-a9ff-fe35371b36c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9175, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.317966', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f585920-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '95c451172e677a117790269a2989e54117c0d4e498df5104f2ec768b0dd1d5de'}]}, 'timestamp': '2025-11-28 09:34:01.318570', '_unique_id': '78ae35551eee49c0b85cf80e46a8f36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.321 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.321 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '618cc4f6-28a8-42a1-aa87-370dee06ae3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.321056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f58d166-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'd814680ec686665d787ffa77e873771e9c54a2774f8ea26b28e51d78c805ac82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.321056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f58e16a-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '160bdf82186450640e44aea5ce8cb3976cb9bec36068647ecb33f2e8273b8ab5'}]}, 'timestamp': '2025-11-28 09:34:01.321916', '_unique_id': 'f4a847118c1843f190b5e8eb5f4ea71c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.324 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.324 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b54a2dbf-fac6-4b44-816f-f4b0b6d5c225', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.324210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f594c0e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '682900c99478c8db0cdd2f1a2865aff300411c7ddaf2333c5c1d37dbb2aca768'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.324210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f595be0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '0029281dcb7d2da8581a33e494823239375e922ca736837f3ecf48a3d9a81e16'}]}, 'timestamp': '2025-11-28 09:34:01.325087', '_unique_id': 'e3f74a1f3efb4d9d938ef5bc75c9ef6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.327 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.327 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 46020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4545d91-e988-4604-9a7b-5ac5498ad0f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46020000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:34:01.327861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5f59dd40-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.46573624, 'message_signature': 'c4000779c40a4da1463d2fe1e8ea3c50331b1ac2118d52adcf1be5b5dc08ca44'}]}, 'timestamp': '2025-11-28 09:34:01.328381', '_unique_id': '7068a07f92ef44c69a79fd59f8aa7364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.330 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.330 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '769fba96-d840-46a3-9bb9-93f9df85b152', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.330190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5a31d2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '023b547ccf5048325fdad8de66016b21350296673518982d0a9826102de5b89a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.330190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f5a3b6e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'c924dd66cc0cdaea188fb97621112e90a364620108fa1dba226558197cd4a462'}]}, 'timestamp': '2025-11-28 09:34:01.330696', '_unique_id': '67b5a7e5d96f4a5f8b4c2db623f34a5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.332 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3639a01b-6e6f-45e7-af72-0cce6c3bbe15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.332156', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f5a7f0c-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '2630e23c4f5fe9521d0c888b5fafb46f9e66d53afb8bd382e8424944aec5cb08'}]}, 'timestamp': '2025-11-28 09:34:01.332450', '_unique_id': 'f2befa3af4c34ff4aea2f0d78606246a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:34:01 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:34:01 np0005538513.localdomain sudo[236414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctrspupyfbpftscgmgjzwnjhkcnefwhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322441.4037957-1595-69165212127467/AnsiballZ_container_config_data.py
Nov 28 09:34:01 np0005538513.localdomain sudo[236414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:01.736 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:01 np0005538513.localdomain python3.9[236416]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 28 09:34:01 np0005538513.localdomain sudo[236414]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:02 np0005538513.localdomain sudo[236524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voohzcobbiejafbptnorufucjzokzmtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322442.2024357-1622-19636158033091/AnsiballZ_container_config_hash.py
Nov 28 09:34:02 np0005538513.localdomain sudo[236524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:02 np0005538513.localdomain python3.9[236526]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:34:02 np0005538513.localdomain sudo[236524]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24986 DF PROTO=TCP SPT=33148 DPT=9102 SEQ=2463907536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EAE420000000001030307) 
Nov 28 09:34:03 np0005538513.localdomain sudo[236634]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjhxqwesyqtmmsljelpogdklonxlvxlo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322443.5258114-1652-96637568037830/AnsiballZ_edpm_container_manage.py
Nov 28 09:34:03 np0005538513.localdomain sudo[236634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:03 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:03.909 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:04 np0005538513.localdomain python3[236636]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:34:04 np0005538513.localdomain podman[236674]: 
Nov 28 09:34:04 np0005538513.localdomain podman[236674]: 2025-11-28 09:34:04.3792568 +0000 UTC m=+0.077564396 container create 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:04 np0005538513.localdomain podman[236674]: 2025-11-28 09:34:04.339271319 +0000 UTC m=+0.037578945 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 28 09:34:04 np0005538513.localdomain python3[236636]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 28 09:34:04 np0005538513.localdomain sudo[236634]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:05 np0005538513.localdomain sudo[236818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjwhukfpckjxepilfolomjlhqniezgli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322445.5603008-1676-58265721820017/AnsiballZ_stat.py
Nov 28 09:34:05 np0005538513.localdomain sudo[236818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:06 np0005538513.localdomain python3.9[236820]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:34:06 np0005538513.localdomain sudo[236818]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45685 DF PROTO=TCP SPT=34304 DPT=9102 SEQ=594967161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EB9820000000001030307) 
Nov 28 09:34:06 np0005538513.localdomain sudo[236930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncmnihpgocrlelelizultshfdhvkidmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322446.416885-1703-232209794813763/AnsiballZ_file.py
Nov 28 09:34:06 np0005538513.localdomain sudo[236930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:06.737 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:06 np0005538513.localdomain python3.9[236932]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:06 np0005538513.localdomain sudo[236930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:07 np0005538513.localdomain sudo[237039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbltbkvqrizdhjboxtapbvdoiynrmifq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322446.952259-1703-110887842884263/AnsiballZ_copy.py
Nov 28 09:34:07 np0005538513.localdomain sudo[237039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:07 np0005538513.localdomain python3.9[237041]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322446.952259-1703-110887842884263/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:07 np0005538513.localdomain sudo[237039]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:07 np0005538513.localdomain sudo[237094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvupazqtvncftieagvpajvgysgssbjeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322446.952259-1703-110887842884263/AnsiballZ_systemd.py
Nov 28 09:34:07 np0005538513.localdomain sudo[237094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:08 np0005538513.localdomain python3.9[237096]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:34:08 np0005538513.localdomain systemd-rc-local-generator[237119]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:08 np0005538513.localdomain systemd-sysv-generator[237124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538513.localdomain sudo[237094]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:08 np0005538513.localdomain sudo[237185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bofhglkqngmqnklwtzgeakbbmrkwzgsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322446.952259-1703-110887842884263/AnsiballZ_systemd.py
Nov 28 09:34:08 np0005538513.localdomain sudo[237185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:08 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:08.941 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:09 np0005538513.localdomain python3.9[237187]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:34:09 np0005538513.localdomain systemd-rc-local-generator[237212]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:09 np0005538513.localdomain systemd-sysv-generator[237218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65010 DF PROTO=TCP SPT=60840 DPT=9100 SEQ=2329009262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EC5820000000001030307) 
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: Starting node_exporter container...
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: tmp-crun.yMxVMN.mount: Deactivated successfully.
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:34:09 np0005538513.localdomain podman[237228]: 2025-11-28 09:34:09.648768718 +0000 UTC m=+0.151432456 container init 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.669Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.669Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.669Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.672Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 09:34:09 np0005538513.localdomain node_exporter[237242]: ts=2025-11-28T09:34:09.672Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:34:09 np0005538513.localdomain podman[237228]: 2025-11-28 09:34:09.689725182 +0000 UTC m=+0.192388890 container start 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:34:09 np0005538513.localdomain podman[237228]: node_exporter
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: Started node_exporter container.
Nov 28 09:34:09 np0005538513.localdomain sudo[237185]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:09 np0005538513.localdomain podman[237251]: 2025-11-28 09:34:09.794141329 +0000 UTC m=+0.097664561 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:34:09 np0005538513.localdomain podman[237251]: 2025-11-28 09:34:09.802996834 +0000 UTC m=+0.106520036 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:09 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:34:10 np0005538513.localdomain sudo[237382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ximldpytezraschqtwirtufviwbkqcpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322450.0602446-1775-176729742946368/AnsiballZ_systemd.py
Nov 28 09:34:10 np0005538513.localdomain sudo[237382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: tmp-crun.VcsOSJ.mount: Deactivated successfully.
Nov 28 09:34:10 np0005538513.localdomain python3.9[237384]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: Stopping node_exporter container...
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: libpod-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.scope: Deactivated successfully.
Nov 28 09:34:10 np0005538513.localdomain podman[237388]: 2025-11-28 09:34:10.846279482 +0000 UTC m=+0.082680083 container died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.timer: Deactivated successfully.
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553-userdata-shm.mount: Deactivated successfully.
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d489f0eb67e7eb4e89c33762ecb0c35a6142c1059d2b20cd6070670e7ee5ef23-merged.mount: Deactivated successfully.
Nov 28 09:34:10 np0005538513.localdomain podman[237388]: 2025-11-28 09:34:10.90112285 +0000 UTC m=+0.137523371 container cleanup 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:10 np0005538513.localdomain podman[237388]: node_exporter
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 09:34:10 np0005538513.localdomain podman[237412]: 2025-11-28 09:34:10.992703736 +0000 UTC m=+0.064260271 container cleanup 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:10 np0005538513.localdomain podman[237412]: node_exporter
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 28 09:34:10 np0005538513.localdomain systemd[1]: Stopped node_exporter container.
Nov 28 09:34:11 np0005538513.localdomain systemd[1]: Starting node_exporter container...
Nov 28 09:34:11 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:34:11 np0005538513.localdomain podman[237425]: 2025-11-28 09:34:11.17186037 +0000 UTC m=+0.147027305 container init 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.188Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.188Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.188Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.190Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 09:34:11 np0005538513.localdomain node_exporter[237440]: ts=2025-11-28T09:34:11.190Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Nov 28 09:34:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:34:11 np0005538513.localdomain podman[237425]: 2025-11-28 09:34:11.20212697 +0000 UTC m=+0.177293865 container start 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:11 np0005538513.localdomain podman[237425]: node_exporter
Nov 28 09:34:11 np0005538513.localdomain systemd[1]: Started node_exporter container.
Nov 28 09:34:11 np0005538513.localdomain sudo[237382]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:11 np0005538513.localdomain podman[237449]: 2025-11-28 09:34:11.292035463 +0000 UTC m=+0.082085733 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:34:11 np0005538513.localdomain podman[237449]: 2025-11-28 09:34:11.324670489 +0000 UTC m=+0.114720709 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:34:11 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:34:11 np0005538513.localdomain sudo[237580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kozypglbauvzeyenfgluietlivndhhnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322451.4197206-1799-273962096826046/AnsiballZ_stat.py
Nov 28 09:34:11 np0005538513.localdomain sudo[237580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:11.739 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:11 np0005538513.localdomain python3.9[237582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:34:11 np0005538513.localdomain sudo[237580]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:12 np0005538513.localdomain sudo[237668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbspquqzdvlfbyvnbtmdbrcwwqsecaav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322451.4197206-1799-273962096826046/AnsiballZ_copy.py
Nov 28 09:34:12 np0005538513.localdomain sudo[237668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:12 np0005538513.localdomain python3.9[237670]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322451.4197206-1799-273962096826046/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:34:12 np0005538513.localdomain sudo[237668]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46748 DF PROTO=TCP SPT=41968 DPT=9100 SEQ=2251484439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1ED1820000000001030307) 
Nov 28 09:34:13 np0005538513.localdomain sudo[237778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mapivktysvnkhydlgbdaganbitzntffm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322452.9031422-1850-234803420183473/AnsiballZ_container_config_data.py
Nov 28 09:34:13 np0005538513.localdomain sudo[237778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:13 np0005538513.localdomain python3.9[237780]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 28 09:34:13 np0005538513.localdomain sudo[237778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:13 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:13.945 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:13 np0005538513.localdomain sudo[237888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjmlzpdaobpgjphimvkrhskwklqusflz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322453.7336931-1877-50123776020328/AnsiballZ_container_config_hash.py
Nov 28 09:34:14 np0005538513.localdomain sudo[237888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:14 np0005538513.localdomain python3.9[237890]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:34:14 np0005538513.localdomain sudo[237888]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:14 np0005538513.localdomain sudo[237998]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxzshygtzdsrqerzyizbjutzzodmyvju ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322454.6544855-1907-35150080909162/AnsiballZ_edpm_container_manage.py
Nov 28 09:34:14 np0005538513.localdomain sudo[237998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:15 np0005538513.localdomain python3[238000]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:34:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:34:15 np0005538513.localdomain systemd[1]: tmp-crun.WmSqBS.mount: Deactivated successfully.
Nov 28 09:34:15 np0005538513.localdomain podman[238028]: 2025-11-28 09:34:15.853464815 +0000 UTC m=+0.084747068 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:34:15 np0005538513.localdomain podman[238028]: 2025-11-28 09:34:15.860944775 +0000 UTC m=+0.092227078 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:34:15 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:34:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:16.741 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:17 np0005538513.localdomain podman[238015]: 2025-11-28 09:34:15.278996247 +0000 UTC m=+0.031158329 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 28 09:34:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60796 DF PROTO=TCP SPT=60144 DPT=9882 SEQ=1095107216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EE4020000000001030307) 
Nov 28 09:34:17 np0005538513.localdomain podman[238109]: 
Nov 28 09:34:17 np0005538513.localdomain podman[238109]: 2025-11-28 09:34:17.219671847 +0000 UTC m=+0.074453568 container create 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Nov 28 09:34:17 np0005538513.localdomain podman[238109]: 2025-11-28 09:34:17.177729752 +0000 UTC m=+0.032511493 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 28 09:34:17 np0005538513.localdomain python3[238000]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 28 09:34:17 np0005538513.localdomain sudo[237998]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:18 np0005538513.localdomain sudo[238253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpmroifuipummxpvceoffvyzydusixqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322458.453657-1931-265253430511488/AnsiballZ_stat.py
Nov 28 09:34:18 np0005538513.localdomain sudo[238253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:18 np0005538513.localdomain python3.9[238255]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:34:18 np0005538513.localdomain sudo[238253]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:18 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:18.978 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7022 DF PROTO=TCP SPT=48914 DPT=9105 SEQ=2279364401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EEB820000000001030307) 
Nov 28 09:34:19 np0005538513.localdomain sudo[238365]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zldlycesfzvcfkcbkwvaotbpqrryztml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.2899623-1958-156528270175026/AnsiballZ_file.py
Nov 28 09:34:19 np0005538513.localdomain sudo[238365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:19 np0005538513.localdomain python3.9[238367]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:19 np0005538513.localdomain sudo[238365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:20 np0005538513.localdomain sudo[238474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziulcgzqzaedbmhwlrebacmsqugftour ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.825107-1958-33736602754692/AnsiballZ_copy.py
Nov 28 09:34:20 np0005538513.localdomain sudo[238474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:20 np0005538513.localdomain python3.9[238476]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322459.825107-1958-33736602754692/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:20 np0005538513.localdomain sudo[238474]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:20 np0005538513.localdomain sudo[238529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdlkghuvbhswblsbyiognkupcumkwjhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.825107-1958-33736602754692/AnsiballZ_systemd.py
Nov 28 09:34:20 np0005538513.localdomain sudo[238529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:21 np0005538513.localdomain python3.9[238531]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:34:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7023 DF PROTO=TCP SPT=48914 DPT=9105 SEQ=2279364401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EF3830000000001030307) 
Nov 28 09:34:21 np0005538513.localdomain systemd-sysv-generator[238556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:21 np0005538513.localdomain systemd-rc-local-generator[238551]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538513.localdomain sudo[238529]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:21 np0005538513.localdomain rsyslogd[759]: imjournal from <localhost:systemd>: begin to drop messages due to rate-limiting
Nov 28 09:34:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:21.743 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:21 np0005538513.localdomain sudo[238621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwzpcwecmmlekjzliegbmksvioafslnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.825107-1958-33736602754692/AnsiballZ_systemd.py
Nov 28 09:34:21 np0005538513.localdomain sudo[238621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:22 np0005538513.localdomain python3.9[238623]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:34:22 np0005538513.localdomain systemd-sysv-generator[238651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:22 np0005538513.localdomain systemd-rc-local-generator[238647]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Starting podman_exporter container...
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:34:22 np0005538513.localdomain podman[238664]: 2025-11-28 09:34:22.562297435 +0000 UTC m=+0.150879108 container init 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:34:22 np0005538513.localdomain podman_exporter[238676]: ts=2025-11-28T09:34:22.582Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 09:34:22 np0005538513.localdomain podman_exporter[238676]: ts=2025-11-28T09:34:22.582Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 09:34:22 np0005538513.localdomain podman_exporter[238676]: ts=2025-11-28T09:34:22.582Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 09:34:22 np0005538513.localdomain podman_exporter[238676]: ts=2025-11-28T09:34:22.582Z caller=handler.go:105 level=info collector=container
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:34:22 np0005538513.localdomain podman[238664]: 2025-11-28 09:34:22.602220625 +0000 UTC m=+0.190802248 container start 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:34:22 np0005538513.localdomain podman[238664]: podman_exporter
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Starting Podman API Service...
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Started podman_exporter container.
Nov 28 09:34:22 np0005538513.localdomain systemd[1]: Started Podman API Service.
Nov 28 09:34:22 np0005538513.localdomain sudo[238621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:22 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:22Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 28 09:34:22 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:22Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 28 09:34:22 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:22Z" level=info msg="Setting parallel job count to 25"
Nov 28 09:34:22 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:22Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 28 09:34:22 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:22Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Nov 28 09:34:22 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:34:22 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 09:34:22 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:22Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:34:22 np0005538513.localdomain podman[238686]: 2025-11-28 09:34:22.703893895 +0000 UTC m=+0.096546497 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:22 np0005538513.localdomain podman[238686]: 2025-11-28 09:34:22.789482289 +0000 UTC m=+0.182134931 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:22 np0005538513.localdomain podman[238686]: unhealthy
Nov 28 09:34:23 np0005538513.localdomain sudo[238833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrzrlabkvceieyhwbjsxxmtvssccuysf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322462.9005017-2030-111568397530594/AnsiballZ_systemd.py
Nov 28 09:34:23 np0005538513.localdomain sudo[238833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:23 np0005538513.localdomain python3.9[238835]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:34:23 np0005538513.localdomain systemd[1]: Stopping podman_exporter container...
Nov 28 09:34:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:34:23 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:34:23 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:34:23 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:34:22 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 2790 "" "Go-http-client/1.1"
Nov 28 09:34:23 np0005538513.localdomain systemd[1]: libpod-3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.scope: Deactivated successfully.
Nov 28 09:34:23 np0005538513.localdomain podman[238839]: 2025-11-28 09:34:23.770182011 +0000 UTC m=+0.233496697 container died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:23 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.timer: Deactivated successfully.
Nov 28 09:34:23 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:34:23 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:23.981 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:24 np0005538513.localdomain podman[238839]: 2025-11-28 09:34:24.444353065 +0000 UTC m=+0.907667741 container cleanup 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:24 np0005538513.localdomain podman[238839]: podman_exporter
Nov 28 09:34:24 np0005538513.localdomain podman[238854]: 2025-11-28 09:34:24.451639019 +0000 UTC m=+0.678520155 container cleanup 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3f18e4f520c6517db6541bc21164a50bc8b8c45d1bffa8dac3494612977ee748-merged.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2-userdata-shm.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538513.localdomain systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 09:34:24 np0005538513.localdomain podman[238867]: 2025-11-28 09:34:24.910085727 +0000 UTC m=+0.069272241 container cleanup 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:34:24 np0005538513.localdomain podman[238867]: podman_exporter
Nov 28 09:34:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7024 DF PROTO=TCP SPT=48914 DPT=9105 SEQ=2279364401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F03420000000001030307) 
Nov 28 09:34:25 np0005538513.localdomain systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 28 09:34:25 np0005538513.localdomain systemd[1]: Stopped podman_exporter container.
Nov 28 09:34:25 np0005538513.localdomain systemd[1]: Starting podman_exporter container...
Nov 28 09:34:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:34:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cc098275d7f03b1c1fd504750e41e859fc80e308322902c4e50d7df8300f206a-merged.mount: Deactivated successfully.
Nov 28 09:34:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:34:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:34:26 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:34:26 np0005538513.localdomain podman[238880]: 2025-11-28 09:34:26.622132596 +0000 UTC m=+1.181216531 container init 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:34:26 np0005538513.localdomain podman_exporter[238894]: ts=2025-11-28T09:34:26.635Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 09:34:26 np0005538513.localdomain podman_exporter[238894]: ts=2025-11-28T09:34:26.635Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 09:34:26 np0005538513.localdomain podman_exporter[238894]: ts=2025-11-28T09:34:26.635Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 09:34:26 np0005538513.localdomain podman_exporter[238894]: ts=2025-11-28T09:34:26.635Z caller=handler.go:105 level=info collector=container
Nov 28 09:34:26 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:34:26 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 09:34:26 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:26Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:34:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:34:26 np0005538513.localdomain podman[238880]: 2025-11-28 09:34:26.707316918 +0000 UTC m=+1.266400823 container start 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:26 np0005538513.localdomain podman[238880]: podman_exporter
Nov 28 09:34:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:26.745 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538513.localdomain systemd[1]: Started podman_exporter container.
Nov 28 09:34:27 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:27Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged: invalid argument"
Nov 28 09:34:27 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:27 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:27 np0005538513.localdomain podman[238687]: time="2025-11-28T09:34:27Z" level=error msg="Getting root fs size for \"08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": creating overlay mount to /var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/BO3U54FAUNG3PEE6EV7WXO7QWP,upperdir=/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/diff,workdir=/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/work,nodev,metacopy=on\": no such file or directory"
Nov 28 09:34:27 np0005538513.localdomain podman[238904]: 2025-11-28 09:34:27.740351458 +0000 UTC m=+1.079068017 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:27 np0005538513.localdomain podman[238904]: 2025-11-28 09:34:27.74760434 +0000 UTC m=+1.086320909 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:34:27 np0005538513.localdomain sudo[238833]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:27 np0005538513.localdomain podman[238904]: unhealthy
Nov 28 09:34:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31675 DF PROTO=TCP SPT=34634 DPT=9101 SEQ=344210973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F0E790000000001030307) 
Nov 28 09:34:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:34:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed-merged.mount: Deactivated successfully.
Nov 28 09:34:28 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:34:28 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:34:29 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:29.014 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:34:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:34:29 np0005538513.localdomain podman[238943]: 2025-11-28 09:34:29.612689806 +0000 UTC m=+0.088721496 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 09:34:29 np0005538513.localdomain podman[238944]: 2025-11-28 09:34:29.586247468 +0000 UTC m=+0.065606854 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 09:34:29 np0005538513.localdomain podman[238943]: 2025-11-28 09:34:29.649642551 +0000 UTC m=+0.125674281 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:34:29 np0005538513.localdomain podman[238944]: 2025-11-28 09:34:29.668382632 +0000 UTC m=+0.147741918 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:34:30 np0005538513.localdomain systemd[1]: tmp-crun.ZI7DDB.mount: Deactivated successfully.
Nov 28 09:34:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:34:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cc098275d7f03b1c1fd504750e41e859fc80e308322902c4e50d7df8300f206a-merged.mount: Deactivated successfully.
Nov 28 09:34:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:34:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:31 np0005538513.localdomain sudo[239085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohccuukyonzypnekesuiinxncpqgqnjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322470.7457156-2054-215208349060927/AnsiballZ_stat.py
Nov 28 09:34:31 np0005538513.localdomain sudo[239085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31677 DF PROTO=TCP SPT=34634 DPT=9101 SEQ=344210973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F1A830000000001030307) 
Nov 28 09:34:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:31 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:34:31 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:34:31 np0005538513.localdomain podman[238986]: 2025-11-28 09:34:31.219306825 +0000 UTC m=+0.702892795 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:34:31 np0005538513.localdomain python3.9[239087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:34:31 np0005538513.localdomain sudo[239085]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:31 np0005538513.localdomain podman[238986]: 2025-11-28 09:34:31.248682677 +0000 UTC m=+0.732268647 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:34:31 np0005538513.localdomain podman[238986]: unhealthy
Nov 28 09:34:31 np0005538513.localdomain sudo[239183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-terzhjhontkngmadpyvzzkptcsyslcaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322470.7457156-2054-215208349060927/AnsiballZ_copy.py
Nov 28 09:34:31 np0005538513.localdomain sudo[239183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:31 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:31.747 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:31 np0005538513.localdomain python3.9[239185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322470.7457156-2054-215208349060927/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:34:31 np0005538513.localdomain sudo[239183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538513.localdomain sudo[239293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-powxbicohwgkisdpmpppjfcuqkexhrcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322472.4695826-2105-219620756749704/AnsiballZ_container_config_data.py
Nov 28 09:34:32 np0005538513.localdomain sudo[239293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:34:32 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'.
Nov 28 09:34:32 np0005538513.localdomain python3.9[239295]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 28 09:34:32 np0005538513.localdomain sudo[239293]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7025 DF PROTO=TCP SPT=48914 DPT=9105 SEQ=2279364401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F23820000000001030307) 
Nov 28 09:34:33 np0005538513.localdomain sudo[239403]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybhshjxecwiayxzvuarnfnlcqlkikfhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322473.2352734-2132-256468775783501/AnsiballZ_container_config_hash.py
Nov 28 09:34:33 np0005538513.localdomain sudo[239403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:33 np0005538513.localdomain python3.9[239405]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:34:33 np0005538513.localdomain sudo[239403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:34 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:34.018 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:34 np0005538513.localdomain sudo[239513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-armlkiixlgquvumxpogncohsdmlchnlo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322474.2324595-2162-96388634063147/AnsiballZ_edpm_container_manage.py
Nov 28 09:34:34 np0005538513.localdomain sudo[239513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538513.localdomain python3[239515]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:34:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24375 DF PROTO=TCP SPT=59784 DPT=9100 SEQ=3852122444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F2F020000000001030307) 
Nov 28 09:34:36 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:36.748 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:34:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed-merged.mount: Deactivated successfully.
Nov 28 09:34:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed-merged.mount: Deactivated successfully.
Nov 28 09:34:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:34:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:34:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:34:39 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:39.044 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:34:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17307 DF PROTO=TCP SPT=52832 DPT=9102 SEQ=2085921338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F3B420000000001030307) 
Nov 28 09:34:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:39 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:40 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:34:40 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:34:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:41 np0005538513.localdomain podman[239543]: 2025-11-28 09:34:41.623038166 +0000 UTC m=+0.107477916 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:41 np0005538513.localdomain podman[239543]: 2025-11-28 09:34:41.631118196 +0000 UTC m=+0.115557986 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:34:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:41 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:41.750 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:34:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24377 DF PROTO=TCP SPT=59784 DPT=9100 SEQ=3852122444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F46C20000000001030307) 
Nov 28 09:34:42 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:34:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:34:44 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:44.044 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:34:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:34:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:34:46 np0005538513.localdomain podman[239576]: 2025-11-28 09:34:46.070415443 +0000 UTC m=+0.102734405 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 09:34:46 np0005538513.localdomain podman[239576]: 2025-11-28 09:34:46.109476125 +0000 UTC m=+0.141795087 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 09:34:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:34:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266-merged.mount: Deactivated successfully.
Nov 28 09:34:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:46.751 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57643 DF PROTO=TCP SPT=49156 DPT=9882 SEQ=731182719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F59430000000001030307) 
Nov 28 09:34:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:47 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:34:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:34:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:34:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.080 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.113 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.113 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16630 DF PROTO=TCP SPT=42048 DPT=9105 SEQ=813186906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F60C20000000001030307) 
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.134 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.135 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.135 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.317 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.317 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.318 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.318 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.692 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.707 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.707 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.708 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.708 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.709 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.709 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.709 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.710 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.710 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.711 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.730 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.730 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.730 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.731 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:34:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:49.731 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.133 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:34:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.214 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.214 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:34:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:34:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.446 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.447 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12335MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.448 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.448 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.526 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.527 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.528 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:34:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:50.565 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:34:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:34:50.813 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:34:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:34:50.813 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:34:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:34:50.814 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:34:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:51.039 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:34:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:51.046 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:34:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:51.060 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:34:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:51.063 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:34:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:51.063 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:34:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16631 DF PROTO=TCP SPT=42048 DPT=9105 SEQ=813186906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F68C30000000001030307) 
Nov 28 09:34:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:51.753 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:34:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:52 np0005538513.localdomain sudo[239648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:34:52 np0005538513.localdomain sudo[239648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:34:52 np0005538513.localdomain sudo[239648]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:52 np0005538513.localdomain sudo[239666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:34:52 np0005538513.localdomain sudo[239666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:34:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538513.localdomain sudo[239666]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:54.083 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:34:54 np0005538513.localdomain sudo[239727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:34:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:54 np0005538513.localdomain sudo[239727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:34:54 np0005538513.localdomain sudo[239727]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16632 DF PROTO=TCP SPT=42048 DPT=9105 SEQ=813186906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F78820000000001030307) 
Nov 28 09:34:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:34:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266-merged.mount: Deactivated successfully.
Nov 28 09:34:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:56.755 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538513.localdomain podman[239529]: 2025-11-28 09:34:36.504122231 +0000 UTC m=+0.045449218 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 28 09:34:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33947 DF PROTO=TCP SPT=41576 DPT=9101 SEQ=1129885523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F83AA0000000001030307) 
Nov 28 09:34:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:34:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:34:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:34:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:34:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:34:59.123 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:34:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:34:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:34:59 np0005538513.localdomain podman[239745]: 2025-11-28 09:34:59.996549155 +0000 UTC m=+1.019115005 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:35:00 np0005538513.localdomain podman[239745]: 2025-11-28 09:35:00.006287726 +0000 UTC m=+1.028853506 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:35:00 np0005538513.localdomain podman[239745]: unhealthy
Nov 28 09:35:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33949 DF PROTO=TCP SPT=41576 DPT=9101 SEQ=1129885523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F8FC20000000001030307) 
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:01.757 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:01 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:35:01 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:01 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:01 np0005538513.localdomain podman[239780]: 2025-11-28 09:35:01.927948066 +0000 UTC m=+0.645854297 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 09:35:01 np0005538513.localdomain podman[239781]: 2025-11-28 09:35:01.984555062 +0000 UTC m=+0.699937342 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 09:35:01 np0005538513.localdomain podman[239780]: 2025-11-28 09:35:01.990482852 +0000 UTC m=+0.708389023 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 09:35:02 np0005538513.localdomain podman[239781]: 2025-11-28 09:35:02.018427088 +0000 UTC m=+0.733809398 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 09:35:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:35:02 np0005538513.localdomain podman[238687]: time="2025-11-28T09:35:02Z" level=error msg="Getting root fs size for \"33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Nov 28 09:35:03 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:03 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:03 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:35:03 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:35:03 np0005538513.localdomain podman[239840]: 2025-11-28 09:35:03.09084954 +0000 UTC m=+0.153756180 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:35:03 np0005538513.localdomain podman[239821]: 2025-11-28 09:35:03.016124304 +0000 UTC m=+1.056794762 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 28 09:35:03 np0005538513.localdomain podman[239840]: 2025-11-28 09:35:03.120957975 +0000 UTC m=+0.183864635 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 09:35:03 np0005538513.localdomain podman[239840]: unhealthy
Nov 28 09:35:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64262 DF PROTO=TCP SPT=51412 DPT=9102 SEQ=4205650874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1F98820000000001030307) 
Nov 28 09:35:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:04 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:04.125 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:35:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-78b48dbff0a2fe76e018f9048f1970d44652db7588437c79ac71691eb45c0ad0-merged.mount: Deactivated successfully.
Nov 28 09:35:04 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:04 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'.
Nov 28 09:35:04 np0005538513.localdomain podman[239821]: 
Nov 28 09:35:04 np0005538513.localdomain podman[239821]: 2025-11-28 09:35:04.575083385 +0000 UTC m=+2.615753823 container create a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, architecture=x86_64)
Nov 28 09:35:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:05 np0005538513.localdomain python3[239515]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 28 09:35:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1e604deea57dbda554a168861cff1238f93b8c6c69c863c43aed37d9d99c5fed-merged.mount: Deactivated successfully.
Nov 28 09:35:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 28 09:35:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24991 DF PROTO=TCP SPT=33148 DPT=9102 SEQ=2463907536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FA3820000000001030307) 
Nov 28 09:35:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:06.758 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4-merged.mount: Deactivated successfully.
Nov 28 09:35:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1e604deea57dbda554a168861cff1238f93b8c6c69c863c43aed37d9d99c5fed-merged.mount: Deactivated successfully.
Nov 28 09:35:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:35:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:35:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:35:09 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:09.159 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46751 DF PROTO=TCP SPT=41968 DPT=9100 SEQ=2251484439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FAF820000000001030307) 
Nov 28 09:35:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:35:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:35:10 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:10 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:10 np0005538513.localdomain sudo[239513]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:10 np0005538513.localdomain sudo[239992]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckbbrslcekxiltsyyhrshgnlinxcppuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322510.657177-2186-49879698160525/AnsiballZ_stat.py
Nov 28 09:35:10 np0005538513.localdomain sudo[239992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:11 np0005538513.localdomain python3.9[239994]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:35:11 np0005538513.localdomain sudo[239992]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:11.759 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:11 np0005538513.localdomain sudo[240104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdtiiizvrngebqdbwhrtviggjjdnkrfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322511.5098891-2213-258271708889324/AnsiballZ_file.py
Nov 28 09:35:11 np0005538513.localdomain sudo[240104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:12 np0005538513.localdomain python3.9[240106]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:35:12 np0005538513.localdomain sudo[240104]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47709 DF PROTO=TCP SPT=34374 DPT=9100 SEQ=332810096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FBC020000000001030307) 
Nov 28 09:35:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:12 np0005538513.localdomain sudo[240213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkfreoffzvkmpgdsyablxvibfhdxxjct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322512.0669112-2213-1590486213092/AnsiballZ_copy.py
Nov 28 09:35:12 np0005538513.localdomain sudo[240213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:12 np0005538513.localdomain python3.9[240215]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322512.0669112-2213-1590486213092/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:35:12 np0005538513.localdomain sudo[240213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:13 np0005538513.localdomain sudo[240268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yaqvxdnwhjrizixdbbmkhcceoxxbqsfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322512.0669112-2213-1590486213092/AnsiballZ_systemd.py
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:35:13 np0005538513.localdomain sudo[240268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:13 np0005538513.localdomain podman[240270]: 2025-11-28 09:35:13.189743165 +0000 UTC m=+0.087568086 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:35:13 np0005538513.localdomain podman[240270]: 2025-11-28 09:35:13.203424148 +0000 UTC m=+0.101249079 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:35:13 np0005538513.localdomain python3.9[240271]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:35:13 np0005538513.localdomain systemd-rc-local-generator[240314]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:35:13 np0005538513.localdomain systemd-sysv-generator[240319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538513.localdomain sudo[240268]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:35:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-78b48dbff0a2fe76e018f9048f1970d44652db7588437c79ac71691eb45c0ad0-merged.mount: Deactivated successfully.
Nov 28 09:35:14 np0005538513.localdomain sudo[240382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avngrteiivsadoskwiiyezeguuqbpsky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322512.0669112-2213-1590486213092/AnsiballZ_systemd.py
Nov 28 09:35:14 np0005538513.localdomain sudo[240382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:14 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:14.163 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:14 np0005538513.localdomain python3.9[240384]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:35:14 np0005538513.localdomain systemd-sysv-generator[240411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:35:14 np0005538513.localdomain systemd-rc-local-generator[240404]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: Starting openstack_network_exporter container...
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:35:14 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:35:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 28 09:35:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Nov 28 09:35:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:35:16 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33ff60da07618a40b2fc62ada30664d987983f58a7edb15149056316c06bf8ae/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:16 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33ff60da07618a40b2fc62ada30664d987983f58a7edb15149056316c06bf8ae/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:35:16 np0005538513.localdomain podman[240425]: 2025-11-28 09:35:16.101640828 +0000 UTC m=+1.319102440 container init a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *bridge.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *coverage.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *datapath.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *iface.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *memory.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *ovnnorthd.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *ovn.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *ovsdbserver.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *pmd_perf.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *pmd_rxq.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: INFO    09:35:16 main.go:48: registering *vswitch.Collector
Nov 28 09:35:16 np0005538513.localdomain openstack_network_exporter[240439]: NOTICE  09:35:16 main.go:82: listening on http://:9105/metrics
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:35:16 np0005538513.localdomain podman[240425]: 2025-11-28 09:35:16.136322089 +0000 UTC m=+1.353783651 container start a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Nov 28 09:35:16 np0005538513.localdomain podman[240425]: openstack_network_exporter
Nov 28 09:35:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:16.761 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:16 np0005538513.localdomain systemd[1]: Started openstack_network_exporter container.
Nov 28 09:35:16 np0005538513.localdomain podman[240449]: 2025-11-28 09:35:16.91587476 +0000 UTC m=+0.774873067 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Nov 28 09:35:16 np0005538513.localdomain sudo[240382]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:16 np0005538513.localdomain podman[240449]: 2025-11-28 09:35:16.944431683 +0000 UTC m=+0.803430030 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:35:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24578 DF PROTO=TCP SPT=36456 DPT=9882 SEQ=3864392090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FCE830000000001030307) 
Nov 28 09:35:17 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:35:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:35:17 np0005538513.localdomain systemd[1]: tmp-crun.A6SbWR.mount: Deactivated successfully.
Nov 28 09:35:17 np0005538513.localdomain podman[240487]: 2025-11-28 09:35:17.862807161 +0000 UTC m=+0.094316553 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:35:17 np0005538513.localdomain podman[240487]: 2025-11-28 09:35:17.873394589 +0000 UTC m=+0.104903991 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 09:35:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63956 DF PROTO=TCP SPT=52332 DPT=9105 SEQ=1190426356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FD6020000000001030307) 
Nov 28 09:35:19 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:19.219 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:35:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-95fcfabaf1695644f149e65ad3965d41d9d311fa2ace2d9f1efd734cfe54eb68-merged.mount: Deactivated successfully.
Nov 28 09:35:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-95fcfabaf1695644f149e65ad3965d41d9d311fa2ace2d9f1efd734cfe54eb68-merged.mount: Deactivated successfully.
Nov 28 09:35:19 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:35:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63957 DF PROTO=TCP SPT=52332 DPT=9105 SEQ=1190426356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FDE020000000001030307) 
Nov 28 09:35:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:21 np0005538513.localdomain sudo[240596]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwigboivbcxugmphyfoosottfqvyxcfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322521.147131-2285-195813060402274/AnsiballZ_systemd.py
Nov 28 09:35:21 np0005538513.localdomain sudo[240596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:21 np0005538513.localdomain python3.9[240598]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:35:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:21.762 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:21 np0005538513.localdomain systemd[1]: Stopping openstack_network_exporter container...
Nov 28 09:35:21 np0005538513.localdomain systemd[1]: libpod-a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.scope: Deactivated successfully.
Nov 28 09:35:21 np0005538513.localdomain podman[240602]: 2025-11-28 09:35:21.875914391 +0000 UTC m=+0.064193374 container died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:35:21 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.timer: Deactivated successfully.
Nov 28 09:35:21 np0005538513.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:35:22 np0005538513.localdomain podman[240602]: 2025-11-28 09:35:22.122928871 +0000 UTC m=+0.311207874 container cleanup a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 28 09:35:22 np0005538513.localdomain podman[240602]: openstack_network_exporter
Nov 28 09:35:22 np0005538513.localdomain systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 09:35:22 np0005538513.localdomain podman[240629]: 2025-11-28 09:35:22.222084285 +0000 UTC m=+0.073967587 container cleanup a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git)
Nov 28 09:35:22 np0005538513.localdomain podman[240629]: openstack_network_exporter
Nov 28 09:35:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-aa2c8f8913bff67439dab4aeb6db6e2056909e99c44d0de88f7837d07d6c8847-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-33ff60da07618a40b2fc62ada30664d987983f58a7edb15149056316c06bf8ae-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659-userdata-shm.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538513.localdomain systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 28 09:35:23 np0005538513.localdomain systemd[1]: Stopped openstack_network_exporter container.
Nov 28 09:35:23 np0005538513.localdomain systemd[1]: Starting openstack_network_exporter container...
Nov 28 09:35:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:24.224 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Nov 28 09:35:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:25 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:35:25 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33ff60da07618a40b2fc62ada30664d987983f58a7edb15149056316c06bf8ae/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:25 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33ff60da07618a40b2fc62ada30664d987983f58a7edb15149056316c06bf8ae/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:35:25 np0005538513.localdomain podman[240642]: 2025-11-28 09:35:25.141644332 +0000 UTC m=+1.137632753 container init a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *bridge.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *coverage.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *datapath.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *iface.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *memory.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *ovnnorthd.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *ovn.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *ovsdbserver.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *pmd_perf.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *pmd_rxq.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: INFO    09:35:25 main.go:48: registering *vswitch.Collector
Nov 28 09:35:25 np0005538513.localdomain openstack_network_exporter[240658]: NOTICE  09:35:25 main.go:82: listening on http://:9105/metrics
Nov 28 09:35:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:35:25 np0005538513.localdomain podman[240642]: 2025-11-28 09:35:25.179491031 +0000 UTC m=+1.175479442 container start a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 09:35:25 np0005538513.localdomain podman[240642]: openstack_network_exporter
Nov 28 09:35:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63958 DF PROTO=TCP SPT=52332 DPT=9105 SEQ=1190426356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FEDC20000000001030307) 
Nov 28 09:35:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:25 np0005538513.localdomain systemd[1]: Started openstack_network_exporter container.
Nov 28 09:35:25 np0005538513.localdomain sudo[240596]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:25 np0005538513.localdomain podman[240669]: 2025-11-28 09:35:25.770132357 +0000 UTC m=+0.587401207 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 28 09:35:25 np0005538513.localdomain podman[240669]: 2025-11-28 09:35:25.813431075 +0000 UTC m=+0.630699935 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Nov 28 09:35:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:26.764 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32106 DF PROTO=TCP SPT=35094 DPT=9101 SEQ=1959211545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1FF8D90000000001030307) 
Nov 28 09:35:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:28 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:35:29 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:29.237 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-95fcfabaf1695644f149e65ad3965d41d9d311fa2ace2d9f1efd734cfe54eb68-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32108 DF PROTO=TCP SPT=35094 DPT=9101 SEQ=1959211545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2004C20000000001030307) 
Nov 28 09:35:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:31 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:31.765 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:35:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:32 np0005538513.localdomain podman[240704]: 2025-11-28 09:35:32.374678998 +0000 UTC m=+0.106176961 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:35:32 np0005538513.localdomain podman[240704]: 2025-11-28 09:35:32.385310247 +0000 UTC m=+0.116808200 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:35:32 np0005538513.localdomain podman[240704]: unhealthy
Nov 28 09:35:32 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:32 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:35:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:35:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:35:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-aa2c8f8913bff67439dab4aeb6db6e2056909e99c44d0de88f7837d07d6c8847-merged.mount: Deactivated successfully.
Nov 28 09:35:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Nov 28 09:35:33 np0005538513.localdomain podman[240778]: 2025-11-28 09:35:33.341636478 +0000 UTC m=+0.082402877 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:35:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63959 DF PROTO=TCP SPT=52332 DPT=9105 SEQ=1190426356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB200D820000000001030307) 
Nov 28 09:35:33 np0005538513.localdomain podman[240779]: 2025-11-28 09:35:33.409461413 +0000 UTC m=+0.148469777 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:35:33 np0005538513.localdomain podman[240778]: 2025-11-28 09:35:33.430407031 +0000 UTC m=+0.171173420 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:35:33 np0005538513.localdomain podman[240779]: 2025-11-28 09:35:33.442859925 +0000 UTC m=+0.181868259 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:35:33 np0005538513.localdomain sudo[240859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxlmjllbanfkracgjtysnvrjglashlav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322533.1559575-2309-270107626280646/AnsiballZ_find.py
Nov 28 09:35:33 np0005538513.localdomain sudo[240859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:33 np0005538513.localdomain python3.9[240861]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:35:33 np0005538513.localdomain sudo[240859]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:34 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:34.245 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:35:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2-merged.mount: Deactivated successfully.
Nov 28 09:35:35 np0005538513.localdomain sudo[240981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxgyytpouscxexcuzlrrbqnetqvjzoai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322534.197774-2337-162478819346533/AnsiballZ_podman_container_info.py
Nov 28 09:35:35 np0005538513.localdomain sudo[240981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:35 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:35:35 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:35:35 np0005538513.localdomain podman[240924]: 2025-11-28 09:35:35.784108359 +0000 UTC m=+0.865040753 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Nov 28 09:35:35 np0005538513.localdomain python3.9[240983]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 28 09:35:35 np0005538513.localdomain podman[240924]: 2025-11-28 09:35:35.819485761 +0000 UTC m=+0.900418115 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 28 09:35:35 np0005538513.localdomain podman[240924]: unhealthy
Nov 28 09:35:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39360 DF PROTO=TCP SPT=50730 DPT=9100 SEQ=1224037347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2019420000000001030307) 
Nov 28 09:35:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:36 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:36.768 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:38 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:38 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'.
Nov 28 09:35:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:39 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:39.292 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21644 DF PROTO=TCP SPT=57170 DPT=9102 SEQ=392364504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2025820000000001030307) 
Nov 28 09:35:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:41 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:41 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:41 np0005538513.localdomain sudo[240981]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:41 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:41.771 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:42 np0005538513.localdomain sudo[241108]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueclweblymodllsvqnssecstzwaotlmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322541.7958815-2345-249155720329088/AnsiballZ_podman_container_exec.py
Nov 28 09:35:42 np0005538513.localdomain sudo[241108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39362 DF PROTO=TCP SPT=50730 DPT=9100 SEQ=1224037347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2031020000000001030307) 
Nov 28 09:35:42 np0005538513.localdomain python3.9[241110]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:35:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538513.localdomain systemd[1]: Started libpod-conmon-9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.scope.
Nov 28 09:35:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538513.localdomain podman[241111]: 2025-11-28 09:35:42.608499551 +0000 UTC m=+0.128215082 container exec 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 09:35:42 np0005538513.localdomain podman[241111]: 2025-11-28 09:35:42.642544862 +0000 UTC m=+0.162260403 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:35:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:44 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:44 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:44 np0005538513.localdomain sudo[241108]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:44 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:44.294 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:44 np0005538513.localdomain sudo[241247]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exxlqogqfmnnecmifwerltsuehqfrajq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322544.3086119-2353-37652215082849/AnsiballZ_podman_container_exec.py
Nov 28 09:35:44 np0005538513.localdomain sudo[241247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:44 np0005538513.localdomain python3.9[241249]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:35:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:35:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:46 np0005538513.localdomain systemd[1]: libpod-conmon-9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.scope: Deactivated successfully.
Nov 28 09:35:46 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:46 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:46 np0005538513.localdomain podman[241261]: 2025-11-28 09:35:46.283287429 +0000 UTC m=+0.520184640 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:35:46 np0005538513.localdomain podman[241261]: 2025-11-28 09:35:46.321773577 +0000 UTC m=+0.558670828 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:35:46 np0005538513.localdomain systemd[1]: Started libpod-conmon-9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.scope.
Nov 28 09:35:46 np0005538513.localdomain podman[241250]: 2025-11-28 09:35:46.343277172 +0000 UTC m=+1.574209640 container exec 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 09:35:46 np0005538513.localdomain podman[241250]: 2025-11-28 09:35:46.373484275 +0000 UTC m=+1.604416743 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 09:35:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:46.774 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17944 DF PROTO=TCP SPT=41772 DPT=9882 SEQ=156577219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2043820000000001030307) 
Nov 28 09:35:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:47 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:35:47 np0005538513.localdomain sudo[241247]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538513.localdomain sudo[241412]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnugfzahfdslytgcnkklmvvuhlflqpmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322547.6752384-2361-176297945151296/AnsiballZ_file.py
Nov 28 09:35:47 np0005538513.localdomain sudo[241412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:48 np0005538513.localdomain python3.9[241414]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:35:48 np0005538513.localdomain sudo[241412]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:48 np0005538513.localdomain sudo[241522]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apnindjistudmvisoiizjpkagjcgjbbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322548.4151444-2370-249362242013914/AnsiballZ_podman_container_info.py
Nov 28 09:35:48 np0005538513.localdomain sudo[241522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:48 np0005538513.localdomain python3.9[241524]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 28 09:35:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42695 DF PROTO=TCP SPT=58530 DPT=9105 SEQ=4281876847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB204B420000000001030307) 
Nov 28 09:35:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:49.336 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:35:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2-merged.mount: Deactivated successfully.
Nov 28 09:35:49 np0005538513.localdomain systemd[1]: libpod-conmon-9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.scope: Deactivated successfully.
Nov 28 09:35:49 np0005538513.localdomain podman[241537]: 2025-11-28 09:35:49.865664252 +0000 UTC m=+0.270155846 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:35:49 np0005538513.localdomain podman[241537]: 2025-11-28 09:35:49.877988563 +0000 UTC m=+0.282480147 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:35:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:50 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:50 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:50 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:35:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:35:50.814 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:35:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:35:50.814 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:35:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:35:50.816 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:35:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.066 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.067 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.067 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.068 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:35:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42696 DF PROTO=TCP SPT=58530 DPT=9105 SEQ=4281876847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2053430000000001030307) 
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.357 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.358 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.358 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.359 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.776 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.794 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.812 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.812 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.813 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.814 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.814 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.815 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.815 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.816 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.816 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.817 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.832 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.832 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.833 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.833 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:35:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:51.834 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.311 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.385 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.386 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.592 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.594 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12425MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.594 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.594 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.689 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.690 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.690 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:35:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:52.739 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:35:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7-merged.mount: Deactivated successfully.
Nov 28 09:35:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:53.180 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:35:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:53.186 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:35:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:53.205 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:35:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:53.208 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:35:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:53.208 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:35:53 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:53 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7-merged.mount: Deactivated successfully.
Nov 28 09:35:53 np0005538513.localdomain sudo[241522]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:53 np0005538513.localdomain sudo[241707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltjmxspbwmspyvabtbxqyueebxzkkynw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322553.4603975-2378-59394296547495/AnsiballZ_podman_container_exec.py
Nov 28 09:35:53 np0005538513.localdomain sudo[241707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:53 np0005538513.localdomain python3.9[241709]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:35:54 np0005538513.localdomain systemd[1]: Started libpod-conmon-ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.scope.
Nov 28 09:35:54 np0005538513.localdomain podman[241710]: 2025-11-28 09:35:54.060182695 +0000 UTC m=+0.128244233 container exec ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:35:54 np0005538513.localdomain podman[241710]: 2025-11-28 09:35:54.095395892 +0000 UTC m=+0.163457370 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent)
Nov 28 09:35:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:54.338 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:35:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:54 np0005538513.localdomain sudo[241707]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:54 np0005538513.localdomain sudo[241772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:35:54 np0005538513.localdomain sudo[241772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:35:54 np0005538513.localdomain sudo[241772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:54 np0005538513.localdomain sudo[241812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:35:54 np0005538513.localdomain sudo[241812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:35:54 np0005538513.localdomain sudo[241882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gljdchglavalptxrjtavexikvbvhcmnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322554.6340785-2386-269525520713321/AnsiballZ_podman_container_exec.py
Nov 28 09:35:54 np0005538513.localdomain sudo[241882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:55 np0005538513.localdomain python3.9[241884]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:35:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42697 DF PROTO=TCP SPT=58530 DPT=9105 SEQ=4281876847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2063020000000001030307) 
Nov 28 09:35:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:56.780 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:56 np0005538513.localdomain systemd[1]: libpod-conmon-ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.scope: Deactivated successfully.
Nov 28 09:35:56 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:56 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:56 np0005538513.localdomain systemd[1]: Started libpod-conmon-ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.scope.
Nov 28 09:35:56 np0005538513.localdomain podman[241898]: 2025-11-28 09:35:56.966137463 +0000 UTC m=+1.819300341 container exec ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:35:56 np0005538513.localdomain podman[241898]: 2025-11-28 09:35:56.994845439 +0000 UTC m=+1.848008297 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:35:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49238 DF PROTO=TCP SPT=55004 DPT=9101 SEQ=1791525045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB206E0B0000000001030307) 
Nov 28 09:35:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:35:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:58 np0005538513.localdomain sudo[241882]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:58 np0005538513.localdomain podman[241928]: 2025-11-28 09:35:58.900377113 +0000 UTC m=+0.414152534 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 28 09:35:58 np0005538513.localdomain podman[241928]: 2025-11-28 09:35:58.915894823 +0000 UTC m=+0.429670294 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 28 09:35:59 np0005538513.localdomain sudo[241812]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:59 np0005538513.localdomain sudo[242073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikmmyizzmuzafktrwdehjuxlbqrwoxoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322559.0489218-2394-199582460138295/AnsiballZ_file.py
Nov 28 09:35:59 np0005538513.localdomain sudo[242073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:35:59.372 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:35:59 np0005538513.localdomain python3.9[242075]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:35:59 np0005538513.localdomain sudo[242073]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:59 np0005538513.localdomain sudo[242076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:35:59 np0005538513.localdomain sudo[242076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:35:59 np0005538513.localdomain sudo[242076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:59 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:59 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:59 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:35:59 np0005538513.localdomain systemd[1]: libpod-conmon-ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.scope: Deactivated successfully.
Nov 28 09:36:00 np0005538513.localdomain sudo[242202]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tloakkhbtwdlxiskygswjztlpbxmsabn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322559.7306857-2403-132210428868231/AnsiballZ_podman_container_info.py
Nov 28 09:36:00 np0005538513.localdomain sudo[242202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:00 np0005538513.localdomain python3.9[242204]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.668 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.673 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80989d41-d4c6-40d1-9078-4b3420b335ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.669338', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a67c8c40-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': 'f53061f66588a32fb6319f3d7d0a1e17d06ffd781739a369fe1eda8039993be9'}]}, 'timestamp': '2025-11-28 09:36:00.673989', '_unique_id': 'e9622042d9824c65959b49ce9e7d0051'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.675 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.716 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.716 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaffc968-cf22-400e-89bf-ce595d7b17e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.676787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a6830c00-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '57caa742ea3dcd58b0a2dbb5067e223bdc65ea5695a62b224b7b7b773f7dd835'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.676787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a6831cf4-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': 'fa8e098be0e510142fc874ad45eb5e741795ac7d45a2ddca937b00b59469af2f'}]}, 'timestamp': '2025-11-28 09:36:00.716928', '_unique_id': 'c226e0e4ac6d48b3b9e6bf54be6c56b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.717 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.719 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.719 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.719 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60b75969-1cb6-497c-bef3-d4e403bc17d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.719258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a68388c4-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '4eee4386785e1f55ae7824eeb7b4f96a992ceb2eede935769f8bdb1d605138fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.719258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a6839896-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '8d13b9b9a3f8df59636f27f2642940277f0f253524763b0da1bdd98e3404fd63'}]}, 'timestamp': '2025-11-28 09:36:00.720117', '_unique_id': '5ba5a9e09d6b4b148827b79a0ad0cbd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.722 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55f184de-7906-4376-9149-f41acc0f1eea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.722267', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a683fe76-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': '740b2bd22fed5b999a0cdb240a239bf51ca7300363944774c347c14299334602'}]}, 'timestamp': '2025-11-28 09:36:00.722727', '_unique_id': '699e1d5cccf14edf92aff6ed2a2acf06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.724 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.743 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 46960000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ac7785e-56b6-4d50-9d4e-13a1a3dd9d8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46960000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:36:00.724837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a68746bc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.915557075, 'message_signature': '62ca1d4406473bdd2becd335e145f43b9c6257d35bcde3af3d6b98ab919bbdde'}]}, 'timestamp': '2025-11-28 09:36:00.744257', '_unique_id': 'ad33fdba7ed649dc8da6e8edf88a77ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.745 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae9a7010-3616-4574-ae71-e292e84970bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.746353', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a687ab16-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': '2e71a5f352cefe537de06f5f58fac80720b4cb5089e1b72de959c024e3dc4287'}]}, 'timestamp': '2025-11-28 09:36:00.746834', '_unique_id': '22a4757bd669479aa8d061af71a37aa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.748 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.748 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30a8f5c1-0d09-4dfc-9ca1-d93743379daf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9175, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.748906', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a6880ffc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': 'c71df1919e4d4b21ad7de1dd6bf91d4211281f0d0e0cad1693a92cf13765d496'}]}, 'timestamp': '2025-11-28 09:36:00.749391', '_unique_id': 'f40bce998bee4e5697a9a6a24660d19f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.751 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '968dfb60-80d9-43fe-afce-9dd3a034274c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.751518', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a68a8214-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.923409287, 'message_signature': '65300ea8d6338a06132f2901eebd352f83dd0826034b41ddf688edf7a7c34feb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.751518', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a68a9268-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.923409287, 'message_signature': '7d42faace65349e00227845b38d55030e80318c6e4c93d52ed0f7cf6c002db1f'}]}, 'timestamp': '2025-11-28 09:36:00.765806', '_unique_id': '87296448f4074723a1cc511e2e926c01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d3b436f-a557-4342-8ea9-3d1d95a7928f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.772114', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a68b9974-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.923409287, 'message_signature': '810153aafd51aba448ca5bd3ac2c33939e494b1fd4f33b66de922c51dfe731bf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.772114', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a68ba96e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.923409287, 'message_signature': 'a56c7109a1c41b88cc4910b4fea69ae5cae366af066adc1be8b77f1cf6d966a1'}]}, 'timestamp': '2025-11-28 09:36:00.772950', '_unique_id': 'a04890813b4f45529c1a4c462fbdff25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09ebb322-9512-4d4a-a875-6d5694b1f5ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.775136', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a68c0f4e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '314e1312e4e7321b75fff16d282a6ef7595f2839a5eb594a45c29df7a1afd351'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.775136', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a68c1f52-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '8104cf4fbc2bdabe9f44cdd86c0c3b6f1938be4bb1f3406ed9f996499eaee90d'}]}, 'timestamp': '2025-11-28 09:36:00.775968', '_unique_id': 'c7ba7523783c423a9ab61dc562e3f47e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd15c1125-c2cd-462f-840a-5b57cbd2d373', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.778134', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a68c84b0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': '668833978665ed86d751e1fe999bf93b77580cd0b2efec69a14a5c39161b0b1d'}]}, 'timestamp': '2025-11-28 09:36:00.778594', '_unique_id': '470bbbc478d4442c9acb2296b9e769c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd07d55a5-cdf6-4553-b285-a7d086dd01c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.780767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a68ceb12-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '3ddedcfd257998f148d75723d816c71929edac4efdac128a30b77816a647b6ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.780767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a68cfca6-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '737dd5d744e170e0b7e7f9fd724baa398d5b849148be9ef292935f27245a906f'}]}, 'timestamp': '2025-11-28 09:36:00.781633', '_unique_id': '0b4510815fea4279918bb7715f608de1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.783 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f33c8556-ba86-47fe-b60d-f5faf7a73d70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.783899', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a68d675e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': '44431154db3c5831e1dd7c122420d4b9cb467fefe5943a3aeb9ad5d8b54623b4'}]}, 'timestamp': '2025-11-28 09:36:00.784397', '_unique_id': 'bc9e54af80aa4989a618ed909c64b841'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '688b20fc-b535-4091-b45e-7c5db17a2493', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.786480', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a68dca6e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': '98628b815edb689fa2ed2462a599fabdc117e246e712b5baa10acd4163dbca0d'}]}, 'timestamp': '2025-11-28 09:36:00.786957', '_unique_id': '86d93c5b0bb34a45a0585fed4e497525'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.788 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.789 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa4ef915-1cbc-4b63-b71e-103c228bd3fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.789199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a68e34ae-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.923409287, 'message_signature': 'b58a0ad8cebea49f468667a3349fcbedecf3752086a82b9187bb8bb58f111762'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.789199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a68e446c-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.923409287, 'message_signature': 'ed492a8998ee9af361f5fd6938acc8a7272b5eec30511d84313f614dba357003'}]}, 'timestamp': '2025-11-28 09:36:00.790057', '_unique_id': '196012eee9634d34bdcc8ed1f22cf4ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dcad88f-b230-4962-900f-f5e21dc1000f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.792192', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a68ea9a2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': 'f8be0a7f6b2adb118c2e51d948b4c6b7d29d7873853c1d9554c8ffc1d10ea9ea'}]}, 'timestamp': '2025-11-28 09:36:00.792644', '_unique_id': '1c1568dd92424acb9bd3fda74f9220dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '901ca721-fae4-41a7-a86e-19fbb51d3836', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.794703', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a68f0ce4-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': 'd5f68edad6c257da330613ae9844a081c7a8329fd4bf9f7f083f81bb5c95d707'}]}, 'timestamp': '2025-11-28 09:36:00.795214', '_unique_id': '769d0d43a7fe4ed2ad30b4b79bbe9ce2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a49b0268-0e78-4bf8-a63b-294a1167788c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:36:00.797284', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a68f704e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.915557075, 'message_signature': '2caddb114615a19fb59f79a2ecf2d3a5813c0c21fd5ef354d600dfa484520033'}]}, 'timestamp': '2025-11-28 09:36:00.797714', '_unique_id': '0bba6d74857543ad85a508e3ee03b744'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.800 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.800 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b87d6d92-bd80-4426-ab68-4cda6d73a462', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.800307', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a68fe6a0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '3ed88950a4c40989b8d77fc50a6e681465c7c9e7e2d6ebc41397f2a30b47e15b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.800307', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a68ff672-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '6ba3094f979a8974ad8c13474a0e7623e96b9792bb84a6fd8f21a1a4c6db4110'}]}, 'timestamp': '2025-11-28 09:36:00.801171', '_unique_id': '797cbd9f54e84912aad8d550fdf62a2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa8ab1d4-6ca0-4d69-86f7-b57936bb021f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:36:00.803188', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'a69053e2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84124414, 'message_signature': '80f9c5fb361b348cc4735eacaf82c30253152a32a7d160aa00528a245da0ac32'}]}, 'timestamp': '2025-11-28 09:36:00.803469', '_unique_id': '28fae328e87a461a918183ea38a5027f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f2e18ed-8781-49dd-ac9c-6d5ee0f511c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:36:00.804709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a6908f2e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': 'b18c3323358b78b540704aec7a626ccc9378274e7426c81b88f556f87a31d422'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:36:00.804709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a69099b0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10394.84868108, 'message_signature': '027233cd9f53da723e8e4d05a46a3e5e1fd923927cb3871e29a756a43d63a2c6'}]}, 'timestamp': '2025-11-28 09:36:00.805237', '_unique_id': 'fc8162e83d3d4af49e254fe698c153c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:36:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:36:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:36:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb-merged.mount: Deactivated successfully.
Nov 28 09:36:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb-merged.mount: Deactivated successfully.
Nov 28 09:36:01 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:01 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49240 DF PROTO=TCP SPT=55004 DPT=9101 SEQ=1791525045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB207A020000000001030307) 
Nov 28 09:36:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:01.783 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21507 DF PROTO=TCP SPT=34680 DPT=9102 SEQ=4131501185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2083020000000001030307) 
Nov 28 09:36:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:36:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:03 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:03 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:03 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:03 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:03 np0005538513.localdomain sudo[242202]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:03 np0005538513.localdomain podman[242219]: 2025-11-28 09:36:03.795694114 +0000 UTC m=+0.347751923 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:36:03 np0005538513.localdomain podman[242219]: 2025-11-28 09:36:03.80237749 +0000 UTC m=+0.354435279 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:36:03 np0005538513.localdomain podman[242219]: unhealthy
Nov 28 09:36:04 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:04.372 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:05 np0005538513.localdomain sudo[242349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqvuquktlgoyxpubkmtprbrkylxnycex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322565.0529304-2411-589333421942/AnsiballZ_podman_container_exec.py
Nov 28 09:36:05 np0005538513.localdomain sudo[242349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:05 np0005538513.localdomain python3.9[242351]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:36:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:36:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7-merged.mount: Deactivated successfully.
Nov 28 09:36:06 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:06 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:06 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:36:06 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:36:06 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:06 np0005538513.localdomain podman[242364]: 2025-11-28 09:36:06.364477547 +0000 UTC m=+0.265491653 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:36:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65313 DF PROTO=TCP SPT=36762 DPT=9100 SEQ=3552939440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB208E820000000001030307) 
Nov 28 09:36:06 np0005538513.localdomain podman[242364]: 2025-11-28 09:36:06.394847165 +0000 UTC m=+0.295861251 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:36:06 np0005538513.localdomain systemd[1]: Started libpod-conmon-7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.scope.
Nov 28 09:36:06 np0005538513.localdomain podman[242352]: 2025-11-28 09:36:06.431734914 +0000 UTC m=+0.831964821 container exec 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:36:06 np0005538513.localdomain podman[242352]: 2025-11-28 09:36:06.467949003 +0000 UTC m=+0.868178870 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 09:36:06 np0005538513.localdomain podman[242363]: 2025-11-28 09:36:06.519854096 +0000 UTC m=+0.419008574 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:36:06 np0005538513.localdomain podman[242363]: 2025-11-28 09:36:06.571993407 +0000 UTC m=+0.471147895 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 28 09:36:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:06.785 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:36:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:36:07 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:36:07 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:07 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:07 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:36:07 np0005538513.localdomain sudo[242349]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:08 np0005538513.localdomain sudo[242530]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idspevczhpazmbicakzneobvuanoflmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322567.8296993-2419-180616883967357/AnsiballZ_podman_container_exec.py
Nov 28 09:36:08 np0005538513.localdomain sudo[242530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:08 np0005538513.localdomain podman[238687]: time="2025-11-28T09:36:08Z" level=error msg="Getting root fs size for \"635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 28 09:36:08 np0005538513.localdomain python3.9[242532]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:36:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:08 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:08 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:08 np0005538513.localdomain systemd[1]: libpod-conmon-7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.scope: Deactivated successfully.
Nov 28 09:36:08 np0005538513.localdomain systemd[1]: Started libpod-conmon-7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.scope.
Nov 28 09:36:08 np0005538513.localdomain podman[242533]: 2025-11-28 09:36:08.692857692 +0000 UTC m=+0.345245225 container exec 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:36:08 np0005538513.localdomain podman[242533]: 2025-11-28 09:36:08.726398859 +0000 UTC m=+0.378786372 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:36:08 np0005538513.localdomain podman[242545]: 2025-11-28 09:36:08.736838681 +0000 UTC m=+0.227418506 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 09:36:08 np0005538513.localdomain podman[242545]: 2025-11-28 09:36:08.770424979 +0000 UTC m=+0.261004784 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 28 09:36:08 np0005538513.localdomain podman[242545]: unhealthy
Nov 28 09:36:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47712 DF PROTO=TCP SPT=34374 DPT=9100 SEQ=332810096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2099820000000001030307) 
Nov 28 09:36:09 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:09.405 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:11 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:11 np0005538513.localdomain sudo[242530]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:11 np0005538513.localdomain sudo[242686]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cccjjimkrspzcntengcqiqvjykedecxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322571.3314404-2427-73211796451096/AnsiballZ_file.py
Nov 28 09:36:11 np0005538513.localdomain sudo[242686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:11.786 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:11 np0005538513.localdomain python3.9[242688]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:36:11 np0005538513.localdomain sudo[242686]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'.
Nov 28 09:36:11 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:11 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:11 np0005538513.localdomain systemd[1]: libpod-conmon-7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.scope: Deactivated successfully.
Nov 28 09:36:12 np0005538513.localdomain sudo[242796]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zinxwdafdnvildjtrkratvcxikisrcdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322572.0178928-2436-196268168231175/AnsiballZ_podman_container_info.py
Nov 28 09:36:12 np0005538513.localdomain sudo[242796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65315 DF PROTO=TCP SPT=36762 DPT=9100 SEQ=3552939440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20A6420000000001030307) 
Nov 28 09:36:12 np0005538513.localdomain python3.9[242798]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 28 09:36:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:36:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb-merged.mount: Deactivated successfully.
Nov 28 09:36:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb-merged.mount: Deactivated successfully.
Nov 28 09:36:13 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:13 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:13 np0005538513.localdomain sudo[242796]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:13 np0005538513.localdomain sudo[242917]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvgzgixmknqssgkumbhdjmmgqpugahiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322573.475562-2444-12914832332162/AnsiballZ_podman_container_exec.py
Nov 28 09:36:13 np0005538513.localdomain sudo[242917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:13 np0005538513.localdomain python3.9[242919]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:14 np0005538513.localdomain systemd[1]: Started libpod-conmon-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope.
Nov 28 09:36:14 np0005538513.localdomain podman[242920]: 2025-11-28 09:36:14.113241413 +0000 UTC m=+0.114786676 container exec 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 09:36:14 np0005538513.localdomain podman[242920]: 2025-11-28 09:36:14.14745289 +0000 UTC m=+0.148998153 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:36:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:36:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9-merged.mount: Deactivated successfully.
Nov 28 09:36:14 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:14.408 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:15 np0005538513.localdomain sudo[242917]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:16.787 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21075 DF PROTO=TCP SPT=49388 DPT=9882 SEQ=4180159823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20B8C20000000001030307) 
Nov 28 09:36:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:36:17 np0005538513.localdomain sudo[243068]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxxtwzizvccipjhyuaygwsqcabjqahcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322576.1242723-2452-280032524023138/AnsiballZ_podman_container_exec.py
Nov 28 09:36:17 np0005538513.localdomain sudo[243068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:17 np0005538513.localdomain python3.9[243070]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:18 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:18 np0005538513.localdomain systemd[1]: libpod-conmon-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope: Deactivated successfully.
Nov 28 09:36:18 np0005538513.localdomain systemd[1]: Started libpod-conmon-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope.
Nov 28 09:36:18 np0005538513.localdomain podman[243038]: 2025-11-28 09:36:18.555292298 +0000 UTC m=+1.097474708 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:36:18 np0005538513.localdomain podman[243071]: 2025-11-28 09:36:18.623045959 +0000 UTC m=+0.820454120 container exec 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm)
Nov 28 09:36:18 np0005538513.localdomain podman[243038]: 2025-11-28 09:36:18.637305195 +0000 UTC m=+1.179487605 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:36:18 np0005538513.localdomain podman[243071]: 2025-11-28 09:36:18.652053986 +0000 UTC m=+0.849462117 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 09:36:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39856 DF PROTO=TCP SPT=45440 DPT=9105 SEQ=2565626300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20C0430000000001030307) 
Nov 28 09:36:19 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:19.452 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:19 np0005538513.localdomain systemd[1]: tmp-crun.j0sgSw.mount: Deactivated successfully.
Nov 28 09:36:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:36:20 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:20 np0005538513.localdomain sudo[243068]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:20 np0005538513.localdomain sudo[243223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtgqpezxqzbkqtqgzshefgugethkuknl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322580.6166065-2460-263491218901929/AnsiballZ_file.py
Nov 28 09:36:20 np0005538513.localdomain sudo[243223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:36:21 np0005538513.localdomain python3.9[243226]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:36:21 np0005538513.localdomain sudo[243223]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39857 DF PROTO=TCP SPT=45440 DPT=9105 SEQ=2565626300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20C8420000000001030307) 
Nov 28 09:36:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:21 np0005538513.localdomain sudo[243344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpiqvwkawrvesazetujsrwldbjslfatm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322581.2747009-2469-151926053193199/AnsiballZ_podman_container_info.py
Nov 28 09:36:21 np0005538513.localdomain sudo[243344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:21 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:21 np0005538513.localdomain systemd[1]: libpod-conmon-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope: Deactivated successfully.
Nov 28 09:36:21 np0005538513.localdomain podman[243225]: 2025-11-28 09:36:21.723447102 +0000 UTC m=+0.828680072 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 28 09:36:21 np0005538513.localdomain podman[243225]: 2025-11-28 09:36:21.744278719 +0000 UTC m=+0.849511619 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:36:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:21.788 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:21 np0005538513.localdomain python3.9[243346]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 28 09:36:22 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:36:22 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:22 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:22 np0005538513.localdomain systemd[1]: tmp-crun.8PlkWZ.mount: Deactivated successfully.
Nov 28 09:36:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:24.454 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4-merged.mount: Deactivated successfully.
Nov 28 09:36:24 np0005538513.localdomain sudo[243344]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39858 DF PROTO=TCP SPT=45440 DPT=9105 SEQ=2565626300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20D8030000000001030307) 
Nov 28 09:36:25 np0005538513.localdomain sudo[243474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-leisjkxmqvlvqdioahhkjxvlieyrjgbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322585.0951962-2477-162059028196765/AnsiballZ_podman_container_exec.py
Nov 28 09:36:25 np0005538513.localdomain sudo[243474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:36:25 np0005538513.localdomain python3.9[243476]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:25 np0005538513.localdomain systemd[1]: Started libpod-conmon-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.scope.
Nov 28 09:36:25 np0005538513.localdomain podman[243477]: 2025-11-28 09:36:25.740146104 +0000 UTC m=+0.119177833 container exec 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:36:25 np0005538513.localdomain podman[243477]: 2025-11-28 09:36:25.773449572 +0000 UTC m=+0.152481271 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:36:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:26.789 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274-merged.mount: Deactivated successfully.
Nov 28 09:36:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274-merged.mount: Deactivated successfully.
Nov 28 09:36:27 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:27 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:27 np0005538513.localdomain sudo[243474]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:27 np0005538513.localdomain sudo[243615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vaxwrnkzlsgcwgoeachydcrxcjbdvzgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322587.663729-2485-46435611088827/AnsiballZ_podman_container_exec.py
Nov 28 09:36:27 np0005538513.localdomain sudo[243615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48259 DF PROTO=TCP SPT=34236 DPT=9101 SEQ=3913446707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20E33A0000000001030307) 
Nov 28 09:36:28 np0005538513.localdomain python3.9[243617]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:29 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:29.491 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:29 np0005538513.localdomain systemd[1]: libpod-conmon-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.scope: Deactivated successfully.
Nov 28 09:36:29 np0005538513.localdomain systemd[1]: Started libpod-conmon-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.scope.
Nov 28 09:36:29 np0005538513.localdomain podman[243618]: 2025-11-28 09:36:29.889638786 +0000 UTC m=+1.721129383 container exec 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:36:29 np0005538513.localdomain podman[243618]: 2025-11-28 09:36:29.976059457 +0000 UTC m=+1.807550054 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:36:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:36:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:36:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48261 DF PROTO=TCP SPT=34236 DPT=9101 SEQ=3913446707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20EF430000000001030307) 
Nov 28 09:36:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:31.791 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538513.localdomain systemd[1]: libpod-conmon-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.scope: Deactivated successfully.
Nov 28 09:36:31 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:31 np0005538513.localdomain sudo[243615]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:32 np0005538513.localdomain podman[243648]: 2025-11-28 09:36:32.038688277 +0000 UTC m=+1.778003671 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350)
Nov 28 09:36:32 np0005538513.localdomain podman[243648]: 2025-11-28 09:36:32.052310204 +0000 UTC m=+1.791625608 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64)
Nov 28 09:36:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39859 DF PROTO=TCP SPT=45440 DPT=9105 SEQ=2565626300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB20F7820000000001030307) 
Nov 28 09:36:33 np0005538513.localdomain sudo[243775]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ismukgxenowvxhygriwzwghowvubtepf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322592.9512057-2493-242687711217445/AnsiballZ_file.py
Nov 28 09:36:33 np0005538513.localdomain sudo[243775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:33 np0005538513.localdomain python3.9[243777]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:36:33 np0005538513.localdomain sudo[243775]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:34 np0005538513.localdomain sudo[243885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsdaaecvfxcvcjoxwrmkltokewgzywza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322593.754685-2502-272408845921770/AnsiballZ_podman_container_info.py
Nov 28 09:36:34 np0005538513.localdomain sudo[243885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:34 np0005538513.localdomain python3.9[243887]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 28 09:36:34 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:34.493 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:34 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:34 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:34 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:36:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21647 DF PROTO=TCP SPT=57170 DPT=9102 SEQ=392364504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2103830000000001030307) 
Nov 28 09:36:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:36:36 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:36 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:36.792 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:37 np0005538513.localdomain sudo[243885]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:37 np0005538513.localdomain podman[243900]: 2025-11-28 09:36:37.528685194 +0000 UTC m=+1.112835358 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:36:37 np0005538513.localdomain podman[243900]: 2025-11-28 09:36:37.566013406 +0000 UTC m=+1.150163580 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:36:37 np0005538513.localdomain podman[243900]: unhealthy
Nov 28 09:36:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:36:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:36:37 np0005538513.localdomain sudo[244053]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxufjljctqnzgaysjojehetdlmkfwlbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322597.6851838-2510-28275890390424/AnsiballZ_podman_container_exec.py
Nov 28 09:36:37 np0005538513.localdomain sudo[244053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:38 np0005538513.localdomain python3.9[244055]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39365 DF PROTO=TCP SPT=50730 DPT=9100 SEQ=1224037347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB210F820000000001030307) 
Nov 28 09:36:39 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:39.539 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe-merged.mount: Deactivated successfully.
Nov 28 09:36:39 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:36:39 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:36:40 np0005538513.localdomain podman[243979]: 2025-11-28 09:36:39.99945313 +0000 UTC m=+2.233282677 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:36:40 np0005538513.localdomain podman[243980]: 2025-11-28 09:36:40.04918137 +0000 UTC m=+2.274809317 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:36:40 np0005538513.localdomain podman[243980]: 2025-11-28 09:36:40.057333209 +0000 UTC m=+2.282961136 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:36:40 np0005538513.localdomain podman[243979]: 2025-11-28 09:36:40.070550114 +0000 UTC m=+2.304379621 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:36:40 np0005538513.localdomain systemd[1]: Started libpod-conmon-3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.scope.
Nov 28 09:36:40 np0005538513.localdomain podman[244056]: 2025-11-28 09:36:40.17315456 +0000 UTC m=+2.024682841 container exec 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:36:40 np0005538513.localdomain podman[244056]: 2025-11-28 09:36:40.207410477 +0000 UTC m=+2.058938738 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:36:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:36:40 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:36:41 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:36:41 np0005538513.localdomain sudo[244053]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:41 np0005538513.localdomain sudo[244213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgzvnwamlbnfiulxevmlhvlygiuyggnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322601.2231297-2518-236594829800130/AnsiballZ_podman_container_exec.py
Nov 28 09:36:41 np0005538513.localdomain sudo[244213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:41 np0005538513.localdomain python3.9[244215]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:36:41 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:41.793 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7165 DF PROTO=TCP SPT=49462 DPT=9100 SEQ=787236909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB211B820000000001030307) 
Nov 28 09:36:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:36:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274-merged.mount: Deactivated successfully.
Nov 28 09:36:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274-merged.mount: Deactivated successfully.
Nov 28 09:36:43 np0005538513.localdomain systemd[1]: libpod-conmon-3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.scope: Deactivated successfully.
Nov 28 09:36:43 np0005538513.localdomain systemd[1]: Started libpod-conmon-3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.scope.
Nov 28 09:36:43 np0005538513.localdomain podman[244216]: 2025-11-28 09:36:43.658909512 +0000 UTC m=+1.932498693 container exec 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:36:43 np0005538513.localdomain podman[244216]: 2025-11-28 09:36:43.691487677 +0000 UTC m=+1.965076858 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:36:43 np0005538513.localdomain podman[244228]: 2025-11-28 09:36:43.747139629 +0000 UTC m=+0.979623787 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 09:36:43 np0005538513.localdomain podman[244228]: 2025-11-28 09:36:43.759483357 +0000 UTC m=+0.991967535 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Nov 28 09:36:44 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:44.539 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:36:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:46 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:36:46 np0005538513.localdomain sudo[244213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:46.795 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20752 DF PROTO=TCP SPT=39420 DPT=9882 SEQ=2260185983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB212E020000000001030307) 
Nov 28 09:36:47 np0005538513.localdomain sudo[244370]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtfexuqzldiewasfwllyfssvfmqekqwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322607.0620198-2526-53242190344890/AnsiballZ_file.py
Nov 28 09:36:47 np0005538513.localdomain sudo[244370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:47 np0005538513.localdomain python3.9[244372]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:36:47 np0005538513.localdomain sudo[244370]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:48 np0005538513.localdomain sudo[244480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyinitazssvkcebldpfzejjtefqavobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322607.7792065-2535-1143529841186/AnsiballZ_podman_container_info.py
Nov 28 09:36:48 np0005538513.localdomain sudo[244480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:48 np0005538513.localdomain python3.9[244482]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 28 09:36:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:48 np0005538513.localdomain systemd[1]: libpod-conmon-3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.scope: Deactivated successfully.
Nov 28 09:36:48 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38952 DF PROTO=TCP SPT=43242 DPT=9105 SEQ=2441211023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2135820000000001030307) 
Nov 28 09:36:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:49.574 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:36:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:50 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:50 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:50 np0005538513.localdomain podman[244494]: 2025-11-28 09:36:50.761536304 +0000 UTC m=+0.273185602 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:36:50 np0005538513.localdomain podman[244494]: 2025-11-28 09:36:50.769397754 +0000 UTC m=+0.281047042 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:36:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:36:50.815 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:36:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:36:50.815 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:36:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:36:50.816 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:36:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:50.818 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:50.818 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:50.844 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:50.844 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:36:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:50.844 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:36:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38953 DF PROTO=TCP SPT=43242 DPT=9105 SEQ=2441211023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB213D820000000001030307) 
Nov 28 09:36:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:51 np0005538513.localdomain podman[238687]: time="2025-11-28T09:36:51Z" level=error msg="Getting root fs size for \"6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.418 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.419 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.419 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.419 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:36:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.796 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.886 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:36:51 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.937 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.940 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.941 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.942 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.943 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.944 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.944 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.945 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.946 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:36:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.950 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538513.localdomain sudo[244480]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.999 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.999 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:51.999 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.000 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.000 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:36:52 np0005538513.localdomain sudo[244646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzizspshnwccmdjyxquzboqhmjrvrpcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322612.1784003-2543-57935985758823/AnsiballZ_podman_container_exec.py
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.439 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:36:52 np0005538513.localdomain sudo[244646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.520 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.520 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:36:52 np0005538513.localdomain podman[244650]: 2025-11-28 09:36:52.529179567 +0000 UTC m=+0.072393095 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:36:52 np0005538513.localdomain podman[244650]: 2025-11-28 09:36:52.537567473 +0000 UTC m=+0.080781001 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 28 09:36:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:52 np0005538513.localdomain python3.9[244651]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.694 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.695 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12456MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.695 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.696 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.769 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.769 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.769 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:36:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:52.816 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:36:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:53.256 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:36:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:53.262 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:36:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:53.294 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:36:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:53.296 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:36:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:53.297 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:36:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe-merged.mount: Deactivated successfully.
Nov 28 09:36:54 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:36:54 np0005538513.localdomain systemd[1]: Started libpod-conmon-a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.scope.
Nov 28 09:36:54 np0005538513.localdomain podman[244670]: 2025-11-28 09:36:54.403217122 +0000 UTC m=+1.728061844 container exec a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 28 09:36:54 np0005538513.localdomain podman[244670]: 2025-11-28 09:36:54.435317822 +0000 UTC m=+1.760162504 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter)
Nov 28 09:36:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:54.578 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38954 DF PROTO=TCP SPT=43242 DPT=9105 SEQ=2441211023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB214D430000000001030307) 
Nov 28 09:36:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:36:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:36:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:55 np0005538513.localdomain sudo[244646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:36:55 np0005538513.localdomain sudo[244827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frjatekttbiqgbeyzorekpceyiymdqws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322615.5497837-2551-259487201520574/AnsiballZ_podman_container_exec.py
Nov 28 09:36:55 np0005538513.localdomain sudo[244827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:36:56 np0005538513.localdomain python3.9[244829]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:36:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:36:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:56.798 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b8fe5cc4372bb33f59044e24a5715f4f503612636c3ff0d7cebe06eec67fcb4f-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b8fe5cc4372bb33f59044e24a5715f4f503612636c3ff0d7cebe06eec67fcb4f-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538513.localdomain systemd[1]: libpod-conmon-a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.scope: Deactivated successfully.
Nov 28 09:36:57 np0005538513.localdomain systemd[1]: Started libpod-conmon-a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.scope.
Nov 28 09:36:57 np0005538513.localdomain podman[244830]: 2025-11-28 09:36:57.98666474 +0000 UTC m=+1.961183490 container exec a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, 
io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:36:58 np0005538513.localdomain podman[244830]: 2025-11-28 09:36:58.01546959 +0000 UTC m=+1.989988400 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public)
Nov 28 09:36:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52621 DF PROTO=TCP SPT=33480 DPT=9101 SEQ=391394662 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2158690000000001030307) 
Nov 28 09:36:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:36:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:36:59.614 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:36:59 np0005538513.localdomain sudo[244858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:36:59 np0005538513.localdomain sudo[244858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:36:59 np0005538513.localdomain sudo[244858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:59 np0005538513.localdomain sudo[244876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:36:59 np0005538513.localdomain sudo[244876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:37:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:00 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:00 np0005538513.localdomain sudo[244827]: pam_unix(sudo:session): session closed for user root
Nov 28 09:37:01 np0005538513.localdomain sudo[245013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bntrgjojeamzjzkzysptfmjxnbaxwgzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322620.7322974-2559-32732361972666/AnsiballZ_file.py
Nov 28 09:37:01 np0005538513.localdomain sudo[245013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:37:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52623 DF PROTO=TCP SPT=33480 DPT=9101 SEQ=391394662 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2164820000000001030307) 
Nov 28 09:37:01 np0005538513.localdomain python3.9[245015]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:37:01 np0005538513.localdomain sudo[245013]: pam_unix(sudo:session): session closed for user root
Nov 28 09:37:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:01.800 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:03 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:03 np0005538513.localdomain systemd[1]: libpod-conmon-a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.scope: Deactivated successfully.
Nov 28 09:37:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62679 DF PROTO=TCP SPT=59382 DPT=9102 SEQ=3789529679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB216D430000000001030307) 
Nov 28 09:37:03 np0005538513.localdomain sudo[244876]: pam_unix(sudo:session): session closed for user root
Nov 28 09:37:03 np0005538513.localdomain sudo[245052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:37:03 np0005538513.localdomain sudo[245052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:37:03 np0005538513.localdomain sudo[245052]: pam_unix(sudo:session): session closed for user root
Nov 28 09:37:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:04.619 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:37:04 np0005538513.localdomain systemd[1]: tmp-crun.8LGxzx.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538513.localdomain podman[245070]: 2025-11-28 09:37:04.879605962 +0000 UTC m=+0.110067976 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:37:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538513.localdomain podman[245070]: 2025-11-28 09:37:04.892362321 +0000 UTC m=+0.122824335 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:37:05 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:37:05 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:05 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:05 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:05 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50894 DF PROTO=TCP SPT=44122 DPT=9100 SEQ=622779395 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2179020000000001030307) 
Nov 28 09:37:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:06.802 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938-merged.mount: Deactivated successfully.
Nov 28 09:37:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:37:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62681 DF PROTO=TCP SPT=59382 DPT=9102 SEQ=3789529679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2185020000000001030307) 
Nov 28 09:37:09 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:09.666 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b8fe5cc4372bb33f59044e24a5715f4f503612636c3ff0d7cebe06eec67fcb4f-merged.mount: Deactivated successfully.
Nov 28 09:37:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938-merged.mount: Deactivated successfully.
Nov 28 09:37:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:37:10 np0005538513.localdomain systemd[1]: tmp-crun.s4iN6e.mount: Deactivated successfully.
Nov 28 09:37:10 np0005538513.localdomain podman[245089]: 2025-11-28 09:37:10.125222589 +0000 UTC m=+0.094856111 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:37:10 np0005538513.localdomain podman[245089]: 2025-11-28 09:37:10.132542293 +0000 UTC m=+0.102175815 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:37:10 np0005538513.localdomain podman[245089]: unhealthy
Nov 28 09:37:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:37:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:37:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:11.803 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50896 DF PROTO=TCP SPT=44122 DPT=9100 SEQ=622779395 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2190C20000000001030307) 
Nov 28 09:37:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:12 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:37:12 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:37:12 np0005538513.localdomain podman[245113]: 2025-11-28 09:37:12.802991872 +0000 UTC m=+1.040255869 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:37:12 np0005538513.localdomain podman[245113]: 2025-11-28 09:37:12.837236489 +0000 UTC m=+1.074500446 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 28 09:37:12 np0005538513.localdomain podman[245112]: 2025-11-28 09:37:12.856112106 +0000 UTC m=+1.095813277 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 09:37:12 np0005538513.localdomain podman[245112]: 2025-11-28 09:37:12.958171675 +0000 UTC m=+1.197872876 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 09:37:14 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:14.668 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:15 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:37:15 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:37:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:37:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:16.805 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:16 np0005538513.localdomain podman[245155]: 2025-11-28 09:37:16.861181703 +0000 UTC m=+0.094455499 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 09:37:16 np0005538513.localdomain podman[245155]: 2025-11-28 09:37:16.878394209 +0000 UTC m=+0.111668005 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:37:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:17 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:17 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:37:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14862 DF PROTO=TCP SPT=48696 DPT=9882 SEQ=2551031526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21A3430000000001030307) 
Nov 28 09:37:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8124 DF PROTO=TCP SPT=54614 DPT=9105 SEQ=132900306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21AAC20000000001030307) 
Nov 28 09:37:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976-merged.mount: Deactivated successfully.
Nov 28 09:37:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183-merged.mount: Deactivated successfully.
Nov 28 09:37:19 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:19.698 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938-merged.mount: Deactivated successfully.
Nov 28 09:37:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8125 DF PROTO=TCP SPT=54614 DPT=9105 SEQ=132900306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21B2C20000000001030307) 
Nov 28 09:37:21 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:21 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:37:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:21.808 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:37:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17-merged.mount: Deactivated successfully.
Nov 28 09:37:22 np0005538513.localdomain podman[245175]: 2025-11-28 09:37:22.60980495 +0000 UTC m=+0.101571095 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:37:22 np0005538513.localdomain podman[245175]: 2025-11-28 09:37:22.646530965 +0000 UTC m=+0.138297130 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:37:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:37:24 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:37:24 np0005538513.localdomain podman[245198]: 2025-11-28 09:37:24.704582283 +0000 UTC m=+0.184897590 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 09:37:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:24.714 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:24 np0005538513.localdomain podman[245198]: 2025-11-28 09:37:24.728343446 +0000 UTC m=+0.208658783 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:37:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8126 DF PROTO=TCP SPT=54614 DPT=9105 SEQ=132900306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21C2830000000001030307) 
Nov 28 09:37:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:37:26 np0005538513.localdomain podman[238687]: time="2025-11-28T09:37:26Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged: invalid argument"
Nov 28 09:37:26 np0005538513.localdomain podman[238687]: time="2025-11-28T09:37:26Z" level=error msg="Getting root fs size for \"9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": creating overlay mount to /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/HMUKP4FWJ2DKWHHDEEBC5QLPOY:/var/lib/containers/storage/overlay/l/BO3U54FAUNG3PEE6EV7WXO7QWP,upperdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/diff,workdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/work,nodev,metacopy=on\": no such file or directory"
Nov 28 09:37:26 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:26 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:26.810 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5258 DF PROTO=TCP SPT=38530 DPT=9101 SEQ=3718644584 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21CD990000000001030307) 
Nov 28 09:37:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:29.724 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:37:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:37:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:37:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5260 DF PROTO=TCP SPT=38530 DPT=9101 SEQ=3718644584 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21D9820000000001030307) 
Nov 28 09:37:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:37:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:37:31 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:31.812 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16433 DF PROTO=TCP SPT=47760 DPT=9102 SEQ=2844046942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21E2820000000001030307) 
Nov 28 09:37:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:35 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:35.562 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:37:35 np0005538513.localdomain systemd[1]: tmp-crun.MqZfDQ.mount: Deactivated successfully.
Nov 28 09:37:35 np0005538513.localdomain podman[245217]: 2025-11-28 09:37:35.685382463 +0000 UTC m=+0.094727178 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:37:35 np0005538513.localdomain podman[245217]: 2025-11-28 09:37:35.698440668 +0000 UTC m=+0.107785343 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:37:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20809 DF PROTO=TCP SPT=44252 DPT=9102 SEQ=4124906310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21ED820000000001030307) 
Nov 28 09:37:36 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:37:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:37:36 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:36.814 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7168 DF PROTO=TCP SPT=49462 DPT=9100 SEQ=787236909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB21F9820000000001030307) 
Nov 28 09:37:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:39 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:39 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:39 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:39.779 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:41 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:41 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:41 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:41.816 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45457 DF PROTO=TCP SPT=60464 DPT=9100 SEQ=1028383145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2205C20000000001030307) 
Nov 28 09:37:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:37:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:43 np0005538513.localdomain podman[245236]: 2025-11-28 09:37:43.157004622 +0000 UTC m=+0.091077191 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:37:43 np0005538513.localdomain podman[245236]: 2025-11-28 09:37:43.191319732 +0000 UTC m=+0.125392311 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:37:43 np0005538513.localdomain podman[245236]: unhealthy
Nov 28 09:37:43 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:37:43 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:37:43 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:43 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4f1ef42e70cf0e1a0a8a92baa1944c6e077a6987018321471d4c334fae1280ec-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538513.localdomain podman[238687]: time="2025-11-28T09:37:44Z" level=error msg="Getting root fs size for \"9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 28 09:37:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:44.781 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:37:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:37:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:37:45 np0005538513.localdomain podman[245261]: 2025-11-28 09:37:45.175869416 +0000 UTC m=+0.100014776 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:37:45 np0005538513.localdomain podman[245261]: 2025-11-28 09:37:45.183467987 +0000 UTC m=+0.107613397 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 09:37:45 np0005538513.localdomain podman[245260]: 2025-11-28 09:37:45.140304837 +0000 UTC m=+0.071587313 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:37:45 np0005538513.localdomain podman[245260]: 2025-11-28 09:37:45.225438609 +0000 UTC m=+0.156721135 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:37:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:45.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:45.683 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:45.683 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:37:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:45.704 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:37:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:45.707 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:45.707 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:37:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:45.719 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:46.819 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:46 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:37:46 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:37:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:37:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52030 DF PROTO=TCP SPT=58478 DPT=9882 SEQ=3943258280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2218420000000001030307) 
Nov 28 09:37:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:47 np0005538513.localdomain podman[245300]: 2025-11-28 09:37:47.25399608 +0000 UTC m=+0.102483204 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm)
Nov 28 09:37:47 np0005538513.localdomain podman[245300]: 2025-11-28 09:37:47.266410944 +0000 UTC m=+0.114898068 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 09:37:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:47.726 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:47.727 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:47.747 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:37:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:47.748 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:37:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:47.748 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:37:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:47.748 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:37:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:47.749 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:37:48 np0005538513.localdomain systemd[1]: tmp-crun.rVul13.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f-merged.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.164 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:37:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f-merged.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.415 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.416 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.590 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.591 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12385MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.591 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.591 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.721 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.721 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.721 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.785 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.869 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.869 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.888 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.916 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_F16C,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:37:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:48.945 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:37:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:37:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8176 DF PROTO=TCP SPT=54766 DPT=9105 SEQ=3023380174 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2220020000000001030307) 
Nov 28 09:37:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:37:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:49.424 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:37:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:49.430 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:37:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:49.448 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:37:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:49.450 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:37:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:49.450 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:37:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:49.800 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:50.401 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:50.402 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:50.402 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:37:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:50.402 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:37:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:37:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:37:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:37:50.820 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:37:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:37:50.824 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:37:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8177 DF PROTO=TCP SPT=54766 DPT=9105 SEQ=3023380174 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2228020000000001030307) 
Nov 28 09:37:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:51.428 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:37:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:51.428 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:37:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:51.429 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:37:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:51.429 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:37:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:51.859 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:37:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:52 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:37:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:37:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:54.801 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:54 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:54 np0005538513.localdomain podman[245363]: 2025-11-28 09:37:54.856143493 +0000 UTC m=+0.094323055 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:37:54 np0005538513.localdomain podman[245363]: 2025-11-28 09:37:54.892364542 +0000 UTC m=+0.130544104 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:37:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8178 DF PROTO=TCP SPT=54766 DPT=9105 SEQ=3023380174 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2237C30000000001030307) 
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.503 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.524 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.524 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.524 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.525 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.525 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.525 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:55.525 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:37:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:37:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:56.862 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:37:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:37:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3-merged.mount: Deactivated successfully.
Nov 28 09:37:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3-merged.mount: Deactivated successfully.
Nov 28 09:37:57 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:57 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:57 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:37:57 np0005538513.localdomain podman[245386]: 2025-11-28 09:37:57.512533929 +0000 UTC m=+0.747388211 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:37:57 np0005538513.localdomain podman[245386]: 2025-11-28 09:37:57.553371785 +0000 UTC m=+0.788226027 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:37:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63301 DF PROTO=TCP SPT=45150 DPT=9101 SEQ=532369119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2242C90000000001030307) 
Nov 28 09:37:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:37:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:37:58 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:37:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:37:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:37:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:37:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:37:59.833 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.666 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.667 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.708 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.709 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9167363-926a-4b34-9d37-2ce0e7fc5698', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.667327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee087880-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '4ecb4336cd10356dc7960ef97336d27c9e7380629d933faf4256028abacc4a5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.667327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee088366-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': 'd2d1ab95ba9ade2904bf453373b1b6eb2d92e54e5110febff56aaa8547ef015f'}]}, 'timestamp': '2025-11-28 09:38:00.709313', '_unique_id': 'f4dbd54f69454253a8bdfeb1b20c01d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.710 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7a63663-5665-4181-a5ce-b455c5ff3e2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.710833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee08c70e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '91c3170fbb837bd0f9e36865b6b89671cea0ed7d26a1cbbabbf70b06079a191a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.710833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee08cf92-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '3262ca5fbdc65a1aae141e4d832ea6aaf7266a4b46c4ba64ee923f3518030ced'}]}, 'timestamp': '2025-11-28 09:38:00.711251', '_unique_id': '33ae770c1e124e37b6d6e8dc80956226'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.712 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.715 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92a421e1-0b82-4b3f-a550-883a49e6955b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.712319', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee0991fc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': '3b55137bbcd0f06b0f7af73c909bb477c847402b642e5abd594fab5f93a1c8d7'}]}, 'timestamp': '2025-11-28 09:38:00.716246', '_unique_id': '6e3adb94133a4acfa1b2898a2d93c102'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.717 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.717 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.717 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '782aa00e-3fc2-4c9a-a031-a453fd0f7678', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.717238', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee09c10e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '48dcfc1a61d3f1559a0372557e13d2f3368495c15fef22fd7cd9c9651ea99307'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.717238', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee09c820-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '0793811a9afa6712c22e17234438d92dfe8079673dc062bcf161c1e5184db1f9'}]}, 'timestamp': '2025-11-28 09:38:00.717610', '_unique_id': 'ca523a1ffb4147208b5ff136ba67314f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.718 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fca0a0c9-d2ae-4f40-9ae5-5f53e2efa556', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.718594', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee09f610-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': '15bf6dbd590e45fbc191ec38f1d56d9de68d123f82ffb272809b8e75105f7512'}]}, 'timestamp': '2025-11-28 09:38:00.718800', '_unique_id': 'ef53b224d9f84bf8b15bf171d2fdaf33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.719 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0cfb8b6-60bc-4e24-b5eb-ff0893020467', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.719791', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee0a24d2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': '18dc070dde3cdf842f3b5c45feb72b624e734fc2b77cf52e22fdcd818dca5b11'}]}, 'timestamp': '2025-11-28 09:38:00.719999', '_unique_id': '78518f6e79354aab81b4cf1ca3822994'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.720 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4469e106-5925-4ec9-9bd1-daec825a5ff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.720942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee0a5268-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '14994699e20fb5472b27d044571cef01a2b14a2c104f2c917bf51a38023e5a15'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.720942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee0a5984-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '4fc992bade7cf30e396d331377cb8123c0a7892cad32a6768fbf59935a8fefc6'}]}, 'timestamp': '2025-11-28 09:38:00.721334', '_unique_id': '703cf5c06a7d41d4bb57e12e62e1b8f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.722 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42633767-5e2d-4d04-9a37-51561d30dd9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.722305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee0a86fc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': 'cc54712459a57c2ab0037e720f560c219852ce54626f3672876ff6aa796139d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.722305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee0a8df0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': 'a1171d41a566bdf75acb7550aa44571adf1018aaf88e896c191acf466d05f1a1'}]}, 'timestamp': '2025-11-28 09:38:00.722675', '_unique_id': '8b4645239f27407b8b8a4231e2ea63cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.723 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '301dd817-35de-44d1-92c0-b7f6291166e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:38:00.723653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ee0e77b2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.919249816, 'message_signature': 'c67828df8adf313649de4b96936c263854b0b01c607dae5fe9bd2ae58e792580'}]}, 'timestamp': '2025-11-28 09:38:00.748591', '_unique_id': '8350024d2aa54f949ea42239368e3e4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.752 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e6fca2a-5ff0-473b-ad9e-e758feffd2a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.751963', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee0f14ce-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': 'a252540059689487026d5a09aa74e1bcfb738f4336eac88b445c340042894798'}]}, 'timestamp': '2025-11-28 09:38:00.752493', '_unique_id': '6a1baed2d7764b0aad5bd46d9838f350'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.753 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.754 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.754 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f39ed629-8732-4c1d-bd8e-b7bbe291082c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.754688', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee0f7cf2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': '4feff7681190a77136fbbae7e078fcc1d947f1c863383cc3472badcbe76601ef'}]}, 'timestamp': '2025-11-28 09:38:00.755187', '_unique_id': 'bb84d84919b84893bd54824163cdd09f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.758 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c78b5cec-3171-4031-a618-b60c950b025f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.758557', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee101400-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': 'ee647ec044beb129bdaad3632ca5cb58be482c18f25f4d6feb95a9373b2d332e'}]}, 'timestamp': '2025-11-28 09:38:00.759223', '_unique_id': '64bd0ec79e514581919c7a46d0aef51b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ab83b51-33b9-4a6d-bb54-5a3ba578b613', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.761361', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee12d032-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.933304072, 'message_signature': '2c257194527fa3586e6f86b84cfd71d74cfb7c70a71de328d803ca77b8983f83'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.761361', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee12e2d4-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.933304072, 'message_signature': 'f9b1744671d460e013c817263cdad8316a11ec078437ae9a11a696ab759726c7'}]}, 'timestamp': '2025-11-28 09:38:00.777419', '_unique_id': '0ab675f826ea4d859aa292cd42baa455'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.779 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe8698b6-3b32-40be-ac9a-2354d570034a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.780144', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee135fa2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': '44e5e18c9880181dd85de8e3b40c49c9780971ab25e729309769da5a3ae062d5'}]}, 'timestamp': '2025-11-28 09:38:00.780619', '_unique_id': '696fa2cf04fd494a8863553433571468'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 47970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24c28b2f-db1e-4703-aeb5-ac966674e54f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47970000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:38:00.782733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ee13c438-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.919249816, 'message_signature': '9fc93a7c2b6445ec0d363437d4a661c41a6b4a8ea5df0db3b478afa39f66409b'}]}, 'timestamp': '2025-11-28 09:38:00.783209', '_unique_id': '15889231ab184e35ad67dce944b1a6b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52b1a854-7b2b-4f6a-9ca5-106ef567bc3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.785560', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee143594-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': 'adf409482abad4661da49ca843e6e6521e643dbebddec6cb316b28d81bcc0e1a'}]}, 'timestamp': '2025-11-28 09:38:00.786159', '_unique_id': '80e190902fcb434eaac0771f9e6df2d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97524046-9bfa-471c-b62e-23884f0dd476', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.788598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee14a9b6-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.933304072, 'message_signature': '3286287556d4ac0868852158e8de661a457d9cb1a926acfb230bc87dd5c41e33'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.788598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee14bbae-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.933304072, 'message_signature': '90f5a46a265195fa0d62644c3994aa7ec583c24f2aad16bf490427ebf673acf1'}]}, 'timestamp': '2025-11-28 09:38:00.789501', '_unique_id': '3393426d7ef14726b09f118ef474885a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db5358b2-6af2-4db5-abc6-4f1c9ce52710', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9175, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.791931', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee152efe-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': 'c8e479f23715666858594ffc898e2ac6aba867c48b18aa1ac301d30febe57513'}]}, 'timestamp': '2025-11-28 09:38:00.792482', '_unique_id': '34a1fc6336d84c8eb3fbef8349d29d64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc981f84-3ff9-48cf-8920-32984cf07f4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.794963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee15a4ba-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': '34732cce9c605cd6a8f8532868590df9ec4532f39b370261e20c1dbd115c55b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.794963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee15b6d0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.839188845, 'message_signature': 'c9a2c1e000e6fa5fed55783d595b6f139ec454743350d44f1adc340906c0a0c3'}]}, 'timestamp': '2025-11-28 09:38:00.795929', '_unique_id': '5dbae4e594b04a6189dcc952e95b3262'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6a52463-ae31-49de-aea6-a14462281202', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:38:00.797656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ee1607fc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.933304072, 'message_signature': '8ce21669fb45074cf2dccd8724aeeb4603f84a53832504365d74f633ab748f79'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:38:00.797656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ee1612a6-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.933304072, 'message_signature': '5cafd50bae9048ff0829f91054116e7df74ee1cf327a709483877606f3649c70'}]}, 'timestamp': '2025-11-28 09:38:00.798250', '_unique_id': '279623b676d94a48b46d8b2e399ea838'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.799 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.799 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88d25aba-4331-40f9-bfe6-59a1ddb6bd80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:38:00.799719', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ee1658b0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10514.884196933, 'message_signature': '6cbfa11a6aa053886c78ddb9a5f07a88e6fe02b03a229e37fd5185c6f2700304'}]}, 'timestamp': '2025-11-28 09:38:00.800011', '_unique_id': '5be57c7913794c49a80310f4e7e1a9c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:38:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:38:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:38:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63303 DF PROTO=TCP SPT=45150 DPT=9101 SEQ=532369119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB224EC20000000001030307) 
Nov 28 09:38:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4f1ef42e70cf0e1a0a8a92baa1944c6e077a6987018321471d4c334fae1280ec-merged.mount: Deactivated successfully.
Nov 28 09:38:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-6bc7f6cf2f2d6fa477b99c9c15d2f85320ca24368fa72b515260201c2b251c67-merged.mount: Deactivated successfully.
Nov 28 09:38:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:01.896 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4f1ef42e70cf0e1a0a8a92baa1944c6e077a6987018321471d4c334fae1280ec-merged.mount: Deactivated successfully.
Nov 28 09:38:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8179 DF PROTO=TCP SPT=54766 DPT=9105 SEQ=3023380174 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2257820000000001030307) 
Nov 28 09:38:03 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:04 np0005538513.localdomain sudo[245406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:38:04 np0005538513.localdomain sudo[245406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:38:04 np0005538513.localdomain sudo[245406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:04 np0005538513.localdomain sudo[245424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:38:04 np0005538513.localdomain sudo[245424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:38:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f-merged.mount: Deactivated successfully.
Nov 28 09:38:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f-merged.mount: Deactivated successfully.
Nov 28 09:38:04 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:04.834 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62964 DF PROTO=TCP SPT=47164 DPT=9100 SEQ=3389370259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2263420000000001030307) 
Nov 28 09:38:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:38:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:06.899 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:07 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:07 np0005538513.localdomain podman[245456]: 2025-11-28 09:38:07.046981187 +0000 UTC m=+0.354793340 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:38:07 np0005538513.localdomain podman[245456]: 2025-11-28 09:38:07.088574878 +0000 UTC m=+0.396386991 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Nov 28 09:38:07 np0005538513.localdomain sudo[245424]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:07 np0005538513.localdomain sudo[245495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:38:07 np0005538513.localdomain sudo[245495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:38:07 np0005538513.localdomain sudo[245495]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:09 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:38:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11296 DF PROTO=TCP SPT=34894 DPT=9102 SEQ=3325970008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB226F820000000001030307) 
Nov 28 09:38:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:09 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:09.880 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:09 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:10 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:10 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:10 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:10 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:10 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:11.936 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62966 DF PROTO=TCP SPT=47164 DPT=9100 SEQ=3389370259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB227B020000000001030307) 
Nov 28 09:38:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-be016c2b2baa2373c75faceeaeaba806753f973ba55e627044f9b5780da52a4c-merged.mount: Deactivated successfully.
Nov 28 09:38:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-be016c2b2baa2373c75faceeaeaba806753f973ba55e627044f9b5780da52a4c-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:13 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:13 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:38:14 np0005538513.localdomain podman[245513]: 2025-11-28 09:38:14.109819923 +0000 UTC m=+0.100140319 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:38:14 np0005538513.localdomain podman[245513]: 2025-11-28 09:38:14.143566555 +0000 UTC m=+0.133886921 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:38:14 np0005538513.localdomain podman[245513]: unhealthy
Nov 28 09:38:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:38:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:14 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:14 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:38:14 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:38:14 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:14.882 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1-merged.mount: Deactivated successfully.
Nov 28 09:38:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1-merged.mount: Deactivated successfully.
Nov 28 09:38:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:38:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:16.938 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:38:17 np0005538513.localdomain podman[245536]: 2025-11-28 09:38:17.110838048 +0000 UTC m=+0.087705705 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:38:17 np0005538513.localdomain podman[245536]: 2025-11-28 09:38:17.153468361 +0000 UTC m=+0.130336048 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:17 np0005538513.localdomain podman[245537]: 2025-11-28 09:38:17.174120526 +0000 UTC m=+0.154506775 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:38:17 np0005538513.localdomain podman[245537]: 2025-11-28 09:38:17.182320146 +0000 UTC m=+0.162706425 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:38:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8151 DF PROTO=TCP SPT=35832 DPT=9882 SEQ=926459536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB228D820000000001030307) 
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-6bc7f6cf2f2d6fa477b99c9c15d2f85320ca24368fa72b515260201c2b251c67-merged.mount: Deactivated successfully.
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:38:17 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:38:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:38:18 np0005538513.localdomain systemd[1]: tmp-crun.o6Agzn.mount: Deactivated successfully.
Nov 28 09:38:18 np0005538513.localdomain podman[245576]: 2025-11-28 09:38:18.868004535 +0000 UTC m=+0.109135964 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:38:18 np0005538513.localdomain podman[245576]: 2025-11-28 09:38:18.882311859 +0000 UTC m=+0.123443328 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:38:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49482 DF PROTO=TCP SPT=45204 DPT=9105 SEQ=661925949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2295020000000001030307) 
Nov 28 09:38:19 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:19.921 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:38:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:20 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:38:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49483 DF PROTO=TCP SPT=45204 DPT=9105 SEQ=661925949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB229D020000000001030307) 
Nov 28 09:38:21 np0005538513.localdomain sshd[228740]: Received disconnect from 192.168.122.30 port 37702:11: disconnected by user
Nov 28 09:38:21 np0005538513.localdomain sshd[228740]: Disconnected from user zuul 192.168.122.30 port 37702
Nov 28 09:38:21 np0005538513.localdomain sshd[228737]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:38:21 np0005538513.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Nov 28 09:38:21 np0005538513.localdomain systemd[1]: session-55.scope: Consumed 1min 13.224s CPU time.
Nov 28 09:38:21 np0005538513.localdomain systemd-logind[764]: Session 55 logged out. Waiting for processes to exit.
Nov 28 09:38:21 np0005538513.localdomain systemd-logind[764]: Removed session 55.
Nov 28 09:38:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:21.967 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:24.927 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49484 DF PROTO=TCP SPT=45204 DPT=9105 SEQ=661925949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB22ACC20000000001030307) 
Nov 28 09:38:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:38:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:26.972 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-be016c2b2baa2373c75faceeaeaba806753f973ba55e627044f9b5780da52a4c-merged.mount: Deactivated successfully.
Nov 28 09:38:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-be016c2b2baa2373c75faceeaeaba806753f973ba55e627044f9b5780da52a4c-merged.mount: Deactivated successfully.
Nov 28 09:38:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:38:27 np0005538513.localdomain podman[245593]: 2025-11-28 09:38:27.828754292 +0000 UTC m=+0.068713043 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:38:27 np0005538513.localdomain podman[245593]: 2025-11-28 09:38:27.864854519 +0000 UTC m=+0.104813270 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:38:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55704 DF PROTO=TCP SPT=58132 DPT=9101 SEQ=1154286442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB22B7F90000000001030307) 
Nov 28 09:38:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:38:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:28 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:28 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:28 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:38:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:38:28 np0005538513.localdomain podman[245615]: 2025-11-28 09:38:28.524058627 +0000 UTC m=+0.084495098 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:28 np0005538513.localdomain podman[245615]: 2025-11-28 09:38:28.555560016 +0000 UTC m=+0.115996497 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:29 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:29.951 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:30 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:30 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:30 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:38:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55706 DF PROTO=TCP SPT=58132 DPT=9101 SEQ=1154286442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB22C4020000000001030307) 
Nov 28 09:38:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:38:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1-merged.mount: Deactivated successfully.
Nov 28 09:38:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:31.999 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2607 DF PROTO=TCP SPT=50444 DPT=9102 SEQ=4135660608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB22CD030000000001030307) 
Nov 28 09:38:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:38:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:38:34 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:34.953 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50949 DF PROTO=TCP SPT=49938 DPT=9100 SEQ=2773529894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB22D8820000000001030307) 
Nov 28 09:38:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:38:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:37 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:37.003 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0ffc9017578f246ce8b4d7e2db3f29d809acca5f23dc5c91669a411567041b2c-merged.mount: Deactivated successfully.
Nov 28 09:38:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:38:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:38:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45460 DF PROTO=TCP SPT=60464 DPT=9100 SEQ=1028383145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB22E3820000000001030307) 
Nov 28 09:38:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:38:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:38:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:38:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.3 total, 600.0 interval
                                                          Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:38:39 np0005538513.localdomain systemd[1]: tmp-crun.G3JWri.mount: Deactivated successfully.
Nov 28 09:38:39 np0005538513.localdomain podman[245635]: 2025-11-28 09:38:39.623055014 +0000 UTC m=+0.108347512 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal 
Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 28 09:38:39 np0005538513.localdomain podman[245635]: 2025-11-28 09:38:39.658957515 +0000 UTC m=+0.144250043 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7)
Nov 28 09:38:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:38:39 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:39.996 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-29b4c40b440520fe09ba18ff760c1e8bcaccec35e4300eedc14527806169da1c-merged.mount: Deactivated successfully.
Nov 28 09:38:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:40 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:38:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:42 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:42.042 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50951 DF PROTO=TCP SPT=49938 DPT=9100 SEQ=2773529894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB22F0420000000001030307) 
Nov 28 09:38:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:38:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b-merged.mount: Deactivated successfully.
Nov 28 09:38:44 np0005538513.localdomain podman[245656]: 2025-11-28 09:38:44.617788936 +0000 UTC m=+0.099334334 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:38:44 np0005538513.localdomain podman[245656]: 2025-11-28 09:38:44.624447079 +0000 UTC m=+0.105992507 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:38:44 np0005538513.localdomain podman[245656]: unhealthy
Nov 28 09:38:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:44.998 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:45.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:46 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:38:46 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:38:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:47.045 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35591 DF PROTO=TCP SPT=34814 DPT=9882 SEQ=4038281451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2302C20000000001030307) 
Nov 28 09:38:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:47.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:47.708 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:38:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:47.708 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:38:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:47.708 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:38:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:47.709 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:38:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:47.709 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.172 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.231 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.232 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.423 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.425 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12231MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.425 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.425 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:38:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:38:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:38:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.517 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.517 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.518 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.574 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:38:48 np0005538513.localdomain podman[245701]: 2025-11-28 09:38:48.554834853 +0000 UTC m=+0.085404226 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:48 np0005538513.localdomain podman[245702]: 2025-11-28 09:38:48.61967268 +0000 UTC m=+0.134821930 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:38:48 np0005538513.localdomain podman[245702]: 2025-11-28 09:38:48.647654397 +0000 UTC m=+0.162803627 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:38:48 np0005538513.localdomain podman[245701]: 2025-11-28 09:38:48.700627484 +0000 UTC m=+0.231196877 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:38:48 np0005538513.localdomain podman[238687]: time="2025-11-28T09:38:48Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged: invalid argument"
Nov 28 09:38:48 np0005538513.localdomain podman[238687]: time="2025-11-28T09:38:48Z" level=error msg="Getting root fs size for \"d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": creating overlay mount to /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/HMUKP4FWJ2DKWHHDEEBC5QLPOY:/var/lib/containers/storage/overlay/l/BO3U54FAUNG3PEE6EV7WXO7QWP,upperdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/diff,workdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/work,nodev,metacopy=on\": no such file or directory"
Nov 28 09:38:48 np0005538513.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:48 np0005538513.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:48 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:38:48 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:38:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:48.998 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:38:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:49.003 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:38:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:49.016 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:38:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:49.018 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:38:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:49.018 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:38:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10972 DF PROTO=TCP SPT=56290 DPT=9105 SEQ=1687668287 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB230A420000000001030307) 
Nov 28 09:38:49 np0005538513.localdomain systemd[1]: tmp-crun.PLYoO9.mount: Deactivated successfully.
Nov 28 09:38:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:50.031 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:38:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:50 np0005538513.localdomain podman[245763]: 2025-11-28 09:38:50.553444731 +0000 UTC m=+0.091392769 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:38:50 np0005538513.localdomain podman[245763]: 2025-11-28 09:38:50.591769589 +0000 UTC m=+0.129717637 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:38:50.817 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:38:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:38:50.817 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:38:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:38:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.013 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.014 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.037 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.037 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.038 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.038 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:38:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10973 DF PROTO=TCP SPT=56290 DPT=9105 SEQ=1687668287 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2312420000000001030307) 
Nov 28 09:38:51 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:51 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:51 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:38:51 np0005538513.localdomain systemd[1]: tmp-crun.sNVADd.mount: Deactivated successfully.
Nov 28 09:38:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2-merged.mount: Deactivated successfully.
Nov 28 09:38:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.683 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:38:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:51.683 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:38:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.083 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.359 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.360 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.360 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.360 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.757 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.773 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.773 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:38:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:52.773 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:38:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:38:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0ffc9017578f246ce8b4d7e2db3f29d809acca5f23dc5c91669a411567041b2c-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0ffc9017578f246ce8b4d7e2db3f29d809acca5f23dc5c91669a411567041b2c-merged.mount: Deactivated successfully.
Nov 28 09:38:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:38:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:38:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:38:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:54.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:55.036 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:55 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10974 DF PROTO=TCP SPT=56290 DPT=9105 SEQ=1687668287 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2322020000000001030307) 
Nov 28 09:38:55 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:55 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:55 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:55 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:56 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:56 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:56 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:38:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-29b4c40b440520fe09ba18ff760c1e8bcaccec35e4300eedc14527806169da1c-merged.mount: Deactivated successfully.
Nov 28 09:38:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:38:57.087 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:38:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:38:58 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14992 DF PROTO=TCP SPT=54186 DPT=9101 SEQ=3775725919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB232D290000000001030307) 
Nov 28 09:38:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0de61d8517bf2f9ac2b1f4b59bb4938a9bcb5e653aa4685b7d899b570eb2d566-merged.mount: Deactivated successfully.
Nov 28 09:38:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:38:58 np0005538513.localdomain podman[245781]: 2025-11-28 09:38:58.555743133 +0000 UTC m=+0.087176353 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:38:58 np0005538513.localdomain podman[245781]: 2025-11-28 09:38:58.567335385 +0000 UTC m=+0.098768615 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:38:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:38:58 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:38:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:58 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:59 np0005538513.localdomain podman[238687]: time="2025-11-28T09:38:59Z" level=error msg="Getting root fs size for \"d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy"
Nov 28 09:38:59 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:59 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:00 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:00.058 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:39:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b-merged.mount: Deactivated successfully.
Nov 28 09:39:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:39:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:39:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bb58edccbf984a5f7da43e28613aa15e67186fff8a0f55da5237efa2fd62c40a-merged.mount: Deactivated successfully.
Nov 28 09:39:00 np0005538513.localdomain podman[245803]: 2025-11-28 09:39:00.855968573 +0000 UTC m=+0.084728485 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:39:00 np0005538513.localdomain podman[245803]: 2025-11-28 09:39:00.890178709 +0000 UTC m=+0.118938551 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:39:01 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14994 DF PROTO=TCP SPT=54186 DPT=9101 SEQ=3775725919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2339420000000001030307) 
Nov 28 09:39:02 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:02.108 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:02 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:39:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10975 DF PROTO=TCP SPT=56290 DPT=9105 SEQ=1687668287 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2341820000000001030307) 
Nov 28 09:39:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:05 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:05.061 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11299 DF PROTO=TCP SPT=34894 DPT=9102 SEQ=3325970008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB234D820000000001030307) 
Nov 28 09:39:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:07 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:07.112 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:07 np0005538513.localdomain sudo[245821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:39:07 np0005538513.localdomain sudo[245821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:39:07 np0005538513.localdomain sudo[245821]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:08 np0005538513.localdomain sudo[245839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:39:08 np0005538513.localdomain sudo[245839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:39:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cc38caeb419d32ecfc51d7c0645586de2c8ada37c45a7d9cd6676aef2fc0ca1f-merged.mount: Deactivated successfully.
Nov 28 09:39:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cc38caeb419d32ecfc51d7c0645586de2c8ada37c45a7d9cd6676aef2fc0ca1f-merged.mount: Deactivated successfully.
Nov 28 09:39:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2-merged.mount: Deactivated successfully.
Nov 28 09:39:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62969 DF PROTO=TCP SPT=47164 DPT=9100 SEQ=3389370259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2359830000000001030307) 
Nov 28 09:39:09 np0005538513.localdomain sudo[245839]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:10 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:10.096 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:39:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:39:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:39:11 np0005538513.localdomain podman[245890]: 2025-11-28 09:39:11.01909627 +0000 UTC m=+0.127525976 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, release=1755695350, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:39:11 np0005538513.localdomain podman[245890]: 2025-11-28 09:39:11.033007356 +0000 UTC m=+0.141437051 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 28 09:39:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:12 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:12.153 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:12 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:39:12 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:12 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32273 DF PROTO=TCP SPT=55886 DPT=9100 SEQ=2820844050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2365820000000001030307) 
Nov 28 09:39:12 np0005538513.localdomain sudo[245910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:39:12 np0005538513.localdomain sudo[245910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:39:12 np0005538513.localdomain sudo[245910]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:13 np0005538513.localdomain podman[238687]: time="2025-11-28T09:39:13Z" level=error msg="Getting root fs size for \"e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Nov 28 09:39:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:14 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:15 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:15.096 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:39:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0de61d8517bf2f9ac2b1f4b59bb4938a9bcb5e653aa4685b7d899b570eb2d566-merged.mount: Deactivated successfully.
Nov 28 09:39:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:39:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:39:17 np0005538513.localdomain podman[245928]: 2025-11-28 09:39:17.104993019 +0000 UTC m=+0.087885347 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:39:17 np0005538513.localdomain podman[245928]: 2025-11-28 09:39:17.113340566 +0000 UTC m=+0.096232934 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:39:17 np0005538513.localdomain podman[245928]: unhealthy
Nov 28 09:39:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:17.157 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38320 DF PROTO=TCP SPT=32968 DPT=9882 SEQ=1144176661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2378020000000001030307) 
Nov 28 09:39:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bb58edccbf984a5f7da43e28613aa15e67186fff8a0f55da5237efa2fd62c40a-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bb58edccbf984a5f7da43e28613aa15e67186fff8a0f55da5237efa2fd62c40a-merged.mount: Deactivated successfully.
Nov 28 09:39:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1223dc0c4e426b2bfe915961a21ef517e6d9229ac1981a406c76a3ee9d07ac7d-merged.mount: Deactivated successfully.
Nov 28 09:39:18 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:39:18 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:39:19 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21532 DF PROTO=TCP SPT=38850 DPT=9105 SEQ=3386210266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB237F820000000001030307) 
Nov 28 09:39:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:39:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:39:19 np0005538513.localdomain podman[245951]: 2025-11-28 09:39:19.616194138 +0000 UTC m=+0.087946298 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:39:19 np0005538513.localdomain podman[245951]: 2025-11-28 09:39:19.699423245 +0000 UTC m=+0.171175455 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:39:19 np0005538513.localdomain systemd[1]: tmp-crun.gQcWzE.mount: Deactivated successfully.
Nov 28 09:39:19 np0005538513.localdomain podman[245952]: 2025-11-28 09:39:19.715932804 +0000 UTC m=+0.184062628 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:39:19 np0005538513.localdomain podman[245952]: 2025-11-28 09:39:19.752436193 +0000 UTC m=+0.220566077 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:39:20 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:20.099 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:39:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:39:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:39:21 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21533 DF PROTO=TCP SPT=38850 DPT=9105 SEQ=3386210266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2387820000000001030307) 
Nov 28 09:39:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:39:21 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:39:21 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:39:21 np0005538513.localdomain podman[245992]: 2025-11-28 09:39:21.611134448 +0000 UTC m=+0.242955565 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:39:21 np0005538513.localdomain podman[245992]: 2025-11-28 09:39:21.64770624 +0000 UTC m=+0.279527387 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 09:39:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:22 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:22.198 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:39:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:39:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:23 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:39:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:39:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:39:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:39:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:39:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:39:25 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:25.101 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:25 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21534 DF PROTO=TCP SPT=38850 DPT=9105 SEQ=3386210266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2397420000000001030307) 
Nov 28 09:39:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:39:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:27 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:27.201 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:39:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a-merged.mount: Deactivated successfully.
Nov 28 09:39:28 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5119 DF PROTO=TCP SPT=54336 DPT=9101 SEQ=587810520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23A25A0000000001030307) 
Nov 28 09:39:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cc38caeb419d32ecfc51d7c0645586de2c8ada37c45a7d9cd6676aef2fc0ca1f-merged.mount: Deactivated successfully.
Nov 28 09:39:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:39:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:39:29 np0005538513.localdomain systemd[1]: tmp-crun.3rP6lD.mount: Deactivated successfully.
Nov 28 09:39:29 np0005538513.localdomain podman[246010]: 2025-11-28 09:39:29.369873318 +0000 UTC m=+0.108873179 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:39:29 np0005538513.localdomain podman[246010]: 2025-11-28 09:39:29.378343339 +0000 UTC m=+0.117343200 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:39:30 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:30.123 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:39:30 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:31 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5121 DF PROTO=TCP SPT=54336 DPT=9101 SEQ=587810520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23AE820000000001030307) 
Nov 28 09:39:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:31 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:39:32 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:32.244 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:39:33 np0005538513.localdomain sshd[246043]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:39:33 np0005538513.localdomain podman[246032]: 2025-11-28 09:39:33.103518249 +0000 UTC m=+0.097002468 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 28 09:39:33 np0005538513.localdomain podman[246032]: 2025-11-28 09:39:33.114667236 +0000 UTC m=+0.108151475 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:39:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:33 np0005538513.localdomain sshd[246043]: Accepted publickey for zuul from 192.168.122.30 port 33390 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:39:33 np0005538513.localdomain systemd-logind[764]: New session 56 of user zuul.
Nov 28 09:39:33 np0005538513.localdomain systemd[1]: Started Session 56 of User zuul.
Nov 28 09:39:33 np0005538513.localdomain sshd[246043]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:39:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25735 DF PROTO=TCP SPT=34254 DPT=9102 SEQ=480635340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23B7420000000001030307) 
Nov 28 09:39:33 np0005538513.localdomain sudo[246143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwmfmtvqujoqkdewdgrpwouqoehnolvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322773.2906866-3038-5351143288332/AnsiballZ_file.py
Nov 28 09:39:33 np0005538513.localdomain sudo[246143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:33 np0005538513.localdomain python3.9[246145]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:33 np0005538513.localdomain sudo[246143]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:33 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:39:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:34 np0005538513.localdomain sudo[246253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prfhlatgdgyterophqnckajzwbgmdbpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322774.0433857-3068-152285626364520/AnsiballZ_stat.py
Nov 28 09:39:35 np0005538513.localdomain sudo[246253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:35 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:35.148 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:35 np0005538513.localdomain python3.9[246255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:35 np0005538513.localdomain sudo[246253]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:35 np0005538513.localdomain sudo[246341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtwbwbsybfxxvqxadueqokfagojswtlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322774.0433857-3068-152285626364520/AnsiballZ_copy.py
Nov 28 09:39:35 np0005538513.localdomain sudo[246341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:35 np0005538513.localdomain podman[238687]: time="2025-11-28T09:39:35Z" level=error msg="Getting root fs size for \"f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 28 09:39:35 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:35 np0005538513.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:35 np0005538513.localdomain python3.9[246343]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322774.0433857-3068-152285626364520/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:35 np0005538513.localdomain sudo[246341]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18114 DF PROTO=TCP SPT=49022 DPT=9100 SEQ=2983312152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23C2C20000000001030307) 
Nov 28 09:39:36 np0005538513.localdomain sudo[246451]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frglbgscoycgmvblehyjrfvwbwekqhlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322776.2350264-3116-250606852278803/AnsiballZ_file.py
Nov 28 09:39:36 np0005538513.localdomain sudo[246451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:36 np0005538513.localdomain python3.9[246453]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:36 np0005538513.localdomain sudo[246451]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:37 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:37.247 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:39:37 np0005538513.localdomain sudo[246561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhhrlhygkhlrtimgunnttkttqfluhirm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322776.9927073-3140-36411593181160/AnsiballZ_stat.py
Nov 28 09:39:37 np0005538513.localdomain sudo[246561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:37 np0005538513.localdomain podman[238687]: time="2025-11-28T09:39:37Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Nov 28 09:39:37 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:34:22 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Nov 28 09:39:37 np0005538513.localdomain python3.9[246563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b9124600c137d24812fa12ae9a3723386ce2301b24a72f8d26d11f6206571e1d-merged.mount: Deactivated successfully.
Nov 28 09:39:37 np0005538513.localdomain sudo[246561]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:37 np0005538513.localdomain sudo[246618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nexgbkadexhlfsrudhjfgelyhfjfctsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322776.9927073-3140-36411593181160/AnsiballZ_file.py
Nov 28 09:39:37 np0005538513.localdomain sudo[246618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:38 np0005538513.localdomain python3.9[246620]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:38 np0005538513.localdomain sudo[246618]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:38 np0005538513.localdomain sudo[246728]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkdpqrwbnnpcueupygkspdzvdyyazfgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322778.330988-3176-265730504275219/AnsiballZ_stat.py
Nov 28 09:39:38 np0005538513.localdomain sudo[246728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:38 np0005538513.localdomain python3.9[246730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:38 np0005538513.localdomain sudo[246728]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:39 np0005538513.localdomain sudo[246785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keqslnssclnioijpjohqltdulyyyepjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322778.330988-3176-265730504275219/AnsiballZ_file.py
Nov 28 09:39:39 np0005538513.localdomain sudo[246785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:39 np0005538513.localdomain python3.9[246787]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.8i3cv5_5 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1223dc0c4e426b2bfe915961a21ef517e6d9229ac1981a406c76a3ee9d07ac7d-merged.mount: Deactivated successfully.
Nov 28 09:39:39 np0005538513.localdomain sudo[246785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25737 DF PROTO=TCP SPT=34254 DPT=9102 SEQ=480635340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23CF020000000001030307) 
Nov 28 09:39:39 np0005538513.localdomain sudo[246895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkjbsypflifvuwcfreajhwgctifrikxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322779.5940423-3212-70677345091107/AnsiballZ_stat.py
Nov 28 09:39:39 np0005538513.localdomain sudo[246895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:40 np0005538513.localdomain python3.9[246897]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:40 np0005538513.localdomain sudo[246895]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:40 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:40.160 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:40 np0005538513.localdomain sudo[246952]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnkvzlepuyhxfpgtjgdjvsmrslwungqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322779.5940423-3212-70677345091107/AnsiballZ_file.py
Nov 28 09:39:40 np0005538513.localdomain sudo[246952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:40 np0005538513.localdomain python3.9[246954]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:40 np0005538513.localdomain sudo[246952]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:41 np0005538513.localdomain sudo[247062]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqgobrogbhifsrweujnwzuanxfhsywip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322780.8822613-3251-193627591978510/AnsiballZ_command.py
Nov 28 09:39:41 np0005538513.localdomain sudo[247062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:39:41 np0005538513.localdomain python3.9[247064]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:41 np0005538513.localdomain sudo[247062]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:39:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:39:42 np0005538513.localdomain sudo[247173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjibdqzancxltxxogwbebbpwmzyerweu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322781.617311-3275-267690113431320/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:39:42 np0005538513.localdomain sudo[247173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:42 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:42.272 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:42 np0005538513.localdomain python3[247175]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:39:42 np0005538513.localdomain sudo[247173]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:42 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18116 DF PROTO=TCP SPT=49022 DPT=9100 SEQ=2983312152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23DA820000000001030307) 
Nov 28 09:39:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:39:42 np0005538513.localdomain podman[247214]: 2025-11-28 09:39:42.862423705 +0000 UTC m=+0.097101932 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Nov 28 09:39:42 np0005538513.localdomain podman[247214]: 2025-11-28 09:39:42.87847818 +0000 UTC m=+0.113156387 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:39:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:39:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:39:43 np0005538513.localdomain sudo[247301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jehzpeziaynerpiyimxyjcwyctxareoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322782.718701-3299-280385177430081/AnsiballZ_stat.py
Nov 28 09:39:43 np0005538513.localdomain sudo[247301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:43 np0005538513.localdomain python3.9[247303]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:43 np0005538513.localdomain sudo[247301]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:43 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:39:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:39:43 np0005538513.localdomain sudo[247358]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgrixvtzoyxwlkkznjsbarktimpvyawb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322782.718701-3299-280385177430081/AnsiballZ_file.py
Nov 28 09:39:43 np0005538513.localdomain sudo[247358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:43 np0005538513.localdomain python3.9[247360]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:43 np0005538513.localdomain sudo[247358]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:39:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:39:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:39:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:39:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:39:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:39:44 np0005538513.localdomain sudo[247468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcvoqjikxgaiamuqvabalpltrjrbypur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322784.57259-3335-81646368350618/AnsiballZ_stat.py
Nov 28 09:39:44 np0005538513.localdomain sudo[247468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:45 np0005538513.localdomain python3.9[247470]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:45.163 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:45 np0005538513.localdomain sudo[247468]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:45 np0005538513.localdomain sudo[247525]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnnhknvsbuklvqaymvqkbtbawokrognp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322784.57259-3335-81646368350618/AnsiballZ_file.py
Nov 28 09:39:45 np0005538513.localdomain sudo[247525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:45 np0005538513.localdomain python3.9[247527]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:45 np0005538513.localdomain sudo[247525]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:45.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:46 np0005538513.localdomain sudo[247635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yitikbgpjfwtmqfglvvlyhyuousmpwlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322785.8367643-3371-109341485737663/AnsiballZ_stat.py
Nov 28 09:39:46 np0005538513.localdomain sudo[247635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:39:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a-merged.mount: Deactivated successfully.
Nov 28 09:39:46 np0005538513.localdomain python3.9[247637]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:47 np0005538513.localdomain sudo[247635]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29367 DF PROTO=TCP SPT=47314 DPT=9882 SEQ=2724722329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23ED020000000001030307) 
Nov 28 09:39:47 np0005538513.localdomain sudo[247692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vepjyyzzdomihkkamuygdcaexnmsdbdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322785.8367643-3371-109341485737663/AnsiballZ_file.py
Nov 28 09:39:47 np0005538513.localdomain sudo[247692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:47.276 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:47 np0005538513.localdomain python3.9[247694]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:47 np0005538513.localdomain sudo[247692]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:48 np0005538513.localdomain sudo[247802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftyoxqgvdmcasxdtwyczoibftgxxkrcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322787.597564-3407-72298604569398/AnsiballZ_stat.py
Nov 28 09:39:48 np0005538513.localdomain sudo[247802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:48 np0005538513.localdomain python3.9[247804]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:48 np0005538513.localdomain sudo[247802]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:48 np0005538513.localdomain sudo[247859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idlpzzufmurssvvqosfjwcudavdumtkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322787.597564-3407-72298604569398/AnsiballZ_file.py
Nov 28 09:39:48 np0005538513.localdomain sudo[247859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:39:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:48.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:48.720 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:39:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:48.720 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:39:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:48.720 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:39:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:48.721 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:39:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:48.721 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:39:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:39:48 np0005538513.localdomain python3.9[247861]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:39:48 np0005538513.localdomain sudo[247859]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:48 np0005538513.localdomain podman[247863]: 2025-11-28 09:39:48.910477541 +0000 UTC m=+0.084717614 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:39:48 np0005538513.localdomain podman[247863]: 2025-11-28 09:39:48.918078195 +0000 UTC m=+0.092318268 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:39:48 np0005538513.localdomain podman[247863]: unhealthy
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.150 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:39:49 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22305 DF PROTO=TCP SPT=44632 DPT=9105 SEQ=33193345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23F4C20000000001030307) 
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.234 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.235 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.486 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.487 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12282MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.488 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.488 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:39:49 np0005538513.localdomain sudo[248011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btaetffoikliakuzbharnhojrfsnberf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322789.0711846-3443-242435804638298/AnsiballZ_stat.py
Nov 28 09:39:49 np0005538513.localdomain sudo[248011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.594 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.595 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.595 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:39:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:49.659 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:39:49 np0005538513.localdomain python3.9[248013]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:49 np0005538513.localdomain sudo[248011]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:50.109 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:39:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:50.116 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:39:50 np0005538513.localdomain sudo[248122]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qivveivloslfshokyfhtkhxqctzmnplq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322789.0711846-3443-242435804638298/AnsiballZ_copy.py
Nov 28 09:39:50 np0005538513.localdomain sudo[248122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:50.144 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:39:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:50.147 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:39:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:50.147 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:39:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:50.201 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:50 np0005538513.localdomain python3.9[248125]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322789.0711846-3443-242435804638298/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:50 np0005538513.localdomain sudo[248122]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:39:50 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:39:50 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Failed with result 'exit-code'.
Nov 28 09:39:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:39:50.817 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:39:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:39:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:39:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:39:50.820 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:39:50 np0005538513.localdomain sudo[248235]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdhhmhceouexvndhuhddayedsrqwgfcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322790.5839927-3488-103736559806621/AnsiballZ_file.py
Nov 28 09:39:50 np0005538513.localdomain sudo[248235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:51 np0005538513.localdomain python3.9[248237]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:51 np0005538513.localdomain sudo[248235]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:51 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22306 DF PROTO=TCP SPT=44632 DPT=9105 SEQ=33193345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB23FCC20000000001030307) 
Nov 28 09:39:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:51 np0005538513.localdomain sudo[248345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxuyjcfgvrburasjxfajqyhczjcqsesy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322791.2606163-3512-272023269004548/AnsiballZ_command.py
Nov 28 09:39:51 np0005538513.localdomain sudo[248345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:39:51 np0005538513.localdomain python3.9[248347]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:39:51 np0005538513.localdomain sudo[248345]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:51 np0005538513.localdomain systemd[1]: tmp-crun.59AN9H.mount: Deactivated successfully.
Nov 28 09:39:51 np0005538513.localdomain podman[248350]: 2025-11-28 09:39:51.828717221 +0000 UTC m=+0.064033523 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:39:51 np0005538513.localdomain podman[248350]: 2025-11-28 09:39:51.833199754 +0000 UTC m=+0.068516056 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.149 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.150 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.150 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.150 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:39:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.308 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:52 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.342 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.343 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.343 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.343 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:39:52 np0005538513.localdomain podman[248348]: 2025-11-28 09:39:52.365372002 +0000 UTC m=+0.600592211 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:39:52 np0005538513.localdomain podman[248348]: 2025-11-28 09:39:52.419448155 +0000 UTC m=+0.654668395 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:39:52 np0005538513.localdomain sudo[248497]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwvjoneahygudrmdkdegtogbyqirftqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322791.9888294-3536-93633470108422/AnsiballZ_blockinfile.py
Nov 28 09:39:52 np0005538513.localdomain sudo[248497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:52 np0005538513.localdomain python3.9[248499]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:52 np0005538513.localdomain sudo[248497]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:52 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.738 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.752 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.752 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.753 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.753 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.753 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.754 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:52.754 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:39:53 np0005538513.localdomain sudo[248607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqiofjlfbzgvjorcqmuzxyzfpzklkuub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322793.0014064-3563-18600395661643/AnsiballZ_command.py
Nov 28 09:39:53 np0005538513.localdomain sudo[248607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:53 np0005538513.localdomain python3.9[248609]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:53 np0005538513.localdomain sudo[248607]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:39:53 np0005538513.localdomain podman[248611]: 2025-11-28 09:39:53.84424323 +0000 UTC m=+0.076457720 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:39:53 np0005538513.localdomain podman[248611]: 2025-11-28 09:39:53.883751255 +0000 UTC m=+0.115965745 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:39:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:39:54 np0005538513.localdomain sudo[248738]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwbtnvlspcgnbebhocbqugcpphopfypa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322793.9282768-3587-199583561327506/AnsiballZ_stat.py
Nov 28 09:39:54 np0005538513.localdomain sudo[248738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b9124600c137d24812fa12ae9a3723386ce2301b24a72f8d26d11f6206571e1d-merged.mount: Deactivated successfully.
Nov 28 09:39:54 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:39:54 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:34:26 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 142736 "" "Go-http-client/1.1"
Nov 28 09:39:54 np0005538513.localdomain podman_exporter[238894]: ts=2025-11-28T09:39:54.388Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 09:39:54 np0005538513.localdomain podman_exporter[238894]: ts=2025-11-28T09:39:54.389Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 09:39:54 np0005538513.localdomain podman_exporter[238894]: ts=2025-11-28T09:39:54.389Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Nov 28 09:39:54 np0005538513.localdomain python3.9[248740]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:39:54 np0005538513.localdomain sudo[248738]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:54 np0005538513.localdomain sudo[248851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wphkyvqavwmflktsgbsiwuvuzrifklpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322794.6256702-3611-152965706056224/AnsiballZ_command.py
Nov 28 09:39:54 np0005538513.localdomain sudo[248851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:55 np0005538513.localdomain python3.9[248853]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:55 np0005538513.localdomain sudo[248851]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:55.203 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:39:55 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:39:55 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:39:55 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:39:55 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:39:55 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:39:55 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:39:55 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:39:55 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:39:55 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:39:55 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:39:55 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:39:55 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:39:55 np0005538513.localdomain sudo[248969]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtqqxkqbiananuxletrhqsizuyrlnvvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322795.3783214-3635-18729195107902/AnsiballZ_file.py
Nov 28 09:39:55 np0005538513.localdomain sudo[248969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:55 np0005538513.localdomain python3.9[248971]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:55 np0005538513.localdomain sudo[248969]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:56 np0005538513.localdomain sshd[246043]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:39:56 np0005538513.localdomain systemd-logind[764]: Session 56 logged out. Waiting for processes to exit.
Nov 28 09:39:56 np0005538513.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Nov 28 09:39:56 np0005538513.localdomain systemd[1]: session-56.scope: Consumed 13.686s CPU time.
Nov 28 09:39:56 np0005538513.localdomain systemd-logind[764]: Removed session 56.
Nov 28 09:39:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:56.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:39:57.311 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:00 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:00.244 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.670 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.675 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cc3613a-84c9-4c70-a12b-69e736eaa1b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9175, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.671446', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3589f04e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '05feebe9b5840ec01ff2c7d1088bd817547530187e1e23362fa7eb534f838ae5'}]}, 'timestamp': '2025-11-28 09:40:00.675962', '_unique_id': '30dce1888bde483f9a4418201b69c5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.677 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.678 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.701 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f254a761-a476-49ae-ab0c-2d4988b1fa73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:40:00.679137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '358defaa-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.87338685, 'message_signature': 'd099e3aed9f6318d729d23dd11c9bad6b2d1d70391a78e35475af99b69902feb'}]}, 'timestamp': '2025-11-28 09:40:00.702103', '_unique_id': 'c84961c6a6f141238dc6c283f4626612'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.703 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.704 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.704 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9546889-22c9-4998-8851-7741a806000e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.704390', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '358e5c38-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '7d5a80027f65dc8fffc09f017d4eb4191321ae6c67d721d2e63ba5d6e6323f6b'}]}, 'timestamp': '2025-11-28 09:40:00.704853', '_unique_id': '52a8402396e64e07a429122703a2c50a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.705 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.706 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.707 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e488c4b-cf4a-4d4d-9bed-61ce05c89b04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.707012', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '358ec42a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '00c8733515ff48b90cf6c38adbb27aa25434609d64161791c26184c645d163ab'}]}, 'timestamp': '2025-11-28 09:40:00.707511', '_unique_id': '1d6d1c3eec51414b83cc84f06f8bff41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.709 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.744 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.744 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ef89e48-c062-4e16-95a9-ea08e85b171e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.709655', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '359478e8-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '2255f27137a9918595b4634fee50a621b073f7e91665c58f2f1039d3e91ed153'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.709655', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '35948bbc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '51c412e44f1ce11eea572681433354fcabf56ed402972642ed3e87cc5251b59a'}]}, 'timestamp': '2025-11-28 09:40:00.745394', '_unique_id': '638719d2f9be4fc6b5b6740293140f03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94226482-5711-48ab-b499-8976e3febe74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.747602', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3594f408-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '6af998c0b675931178101350e157ab5e7fc9573401a9f0c13e945ddc7063e90e'}]}, 'timestamp': '2025-11-28 09:40:00.748154', '_unique_id': '89d187cfaf8144d190afcb9c6adc7779'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3ea0dfe-c31d-429a-b14f-71fe1ea85e18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.750272', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '35974ece-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.922167542, 'message_signature': '7749a3da99203c37d1120e60e84ab7aeaa48977a0ce9515290121af8aab54a94'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.750272', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '35975f2c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.922167542, 'message_signature': '48bc2f15cda7c9035e7e80aba7d3403f491a8ecdd190c818f35aee8100404294'}]}, 'timestamp': '2025-11-28 09:40:00.763877', '_unique_id': '4ef1854b85cd4b90aeeeb514010b7d03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4508b6f3-0d35-4a7c-95b3-fdd22f91c92b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.766116', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3597c7dc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '4b3ce338f601bbf29db72472ae94e15e98ec7372f6b50a4c9b8f10a5438ff790'}]}, 'timestamp': '2025-11-28 09:40:00.766591', '_unique_id': '146d653a1e854d749deec8cbb91ff86f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd74fff9a-f607-4ed9-9f5e-e858e17d590e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.768663', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '35982b32-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '15560e91702367b69c1b1f2b74d9c502ae8987ee994a0681533393b7b6626e42'}]}, 'timestamp': '2025-11-28 09:40:00.769166', '_unique_id': '847853873baa4451b4d2879de7010e10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53ad1305-f74a-440f-863c-edad1fc76098', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.771246', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '35988ffa-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': 'ae27685c0ebaaaf7cc6be9092798d40675b5cb2e79a243c12450860d0383c687'}]}, 'timestamp': '2025-11-28 09:40:00.771713', '_unique_id': '8dad3fa814734adcb4e77352ddab6ea9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ce844ea-9df5-4c03-84c9-68921a4a4410', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.773799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3598f418-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.922167542, 'message_signature': '6528d1a98f8b5f02a1d314a97b079eb6099747829d61719c4fbd096243350acf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.773799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '35990494-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.922167542, 'message_signature': '96cb92f1c153ceb5cb85e98a32ebad3785fd66266859fa4f01a927e7c7d0e335'}]}, 'timestamp': '2025-11-28 09:40:00.774672', '_unique_id': '9768d081e0314f84b89d9766915fdac5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '028cb6f3-a769-4d11-b859-09fa6d1306c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.776783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '35996790-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': 'ff585328ad1d659ae19eff2914135fbc3821bb6a2f85d97468274d27a64fa4e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.776783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '359979ec-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '6039dd2037e0c1d89e0aa8ec1a4ae165f37b88dc7f4a81c6e4d220548388a540'}]}, 'timestamp': '2025-11-28 09:40:00.777672', '_unique_id': 'e9d8f85d7bf8468982683f79808f8e49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.779 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1f732ab-69e8-4382-ac3c-d53bdc901bfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.779907', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3599e350-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': 'de5baaeaf7aa7664ef899bcba597b2b5a5b2fe25848103d97a34bd1fd2e0d6f0'}]}, 'timestamp': '2025-11-28 09:40:00.780398', '_unique_id': '9225bfdd1c244ad6bc8d5a397a203ab0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.782 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7854603a-f4e3-4c97-8392-7c8f45af0fc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.783173', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '359a61cc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '5503f771e4aaff661bde75f65cfc6073804215836db8f381c440f6189f73b5f9'}]}, 'timestamp': '2025-11-28 09:40:00.783873', '_unique_id': '3c120cefe2ee43cc89ba9217eb42733c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.785 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4961d90f-4fcc-4082-bd84-8d23a776f9f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.786165', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '359ad670-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.922167542, 'message_signature': '4c77fe7ef9b911037b2578c989aa751c485ad5497a361a2ffcac8259e5e6138e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.786165', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '359ae642-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.922167542, 'message_signature': '4753905dfa95e8622550be68ec319d82881bb917a3cfc7b077ec6f486e9dd6aa'}]}, 'timestamp': '2025-11-28 09:40:00.787000', '_unique_id': '41a85da6b08940f4beb7101916beb5d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ca0b219-01f2-44ab-b7f1-81f3f94a1148', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.789142', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '359b4b32-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '67fe9f58206e316098094f8e5f50244bd98ee0352bda1aef647614665a56286a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.789142', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '359b5afa-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '6fc95a7269dc15816a7532afae2bbe808f5de0cf31ad430f6c4dac7a3d8ded44'}]}, 'timestamp': '2025-11-28 09:40:00.789985', '_unique_id': 'c84dad30b6f347f7aff3fbcee8ed3caa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b23434bd-afcf-4815-9daa-e4b164ed41f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.792103', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '359bbe78-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '56b9a0644cb951d2951ee0aa7fec35800f3d5c8d9bfde9b762bad18273cc04c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.792103', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '359bce36-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': 'f8c3375e7021b30441573e1300a07e4cb820685882088e6df59c6fa10ab450d9'}]}, 'timestamp': '2025-11-28 09:40:00.792938', '_unique_id': '2288924b9a914a4c8862aa92eceaccd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.794 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.795 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26525102-2129-4a19-9442-6ef565a7e051', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.795253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '359c3952-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '8c72b4b08086e98a7d5ccff1b32eb393da8e20f98829f1cc47e7fd13c878b90a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.795253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '359c48fc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '2ba7bfcfbb71a87616f71b9d744b1a97fd3a5854492bf860abc241d235a6ad87'}]}, 'timestamp': '2025-11-28 09:40:00.796110', '_unique_id': '94ca0fb8defb430090f4f0626082ebb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4f88b5e-1caf-4246-a051-e6446dced979', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:40:00.798471', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '359cb742-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.843345277, 'message_signature': '1789a7dd378d9871da720de6b9385eee762b2f43ba1f0f70135b00861862ee2d'}]}, 'timestamp': '2025-11-28 09:40:00.798930', '_unique_id': '32a43900aef8447aaacf6eb2ddd203fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.800 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 48960000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc20fcef-f59a-4dc5-95c4-1f7fbf7f0c22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48960000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:40:00.800977', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '359d1aa2-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.87338685, 'message_signature': '006cc34b6205b8b41d25e354f34ec62a28442a551f052fa5e81249a2b52497a1'}]}, 'timestamp': '2025-11-28 09:40:00.801479', '_unique_id': '7c71ccc840174967bf49d05ee5946f06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90539fac-e1e2-476b-bae0-a36922fbbe1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:40:00.803516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '359d7bb4-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': '6fd1ac2f1345087a99f791b680c5de2502283cfd87a11af81ea21af4befb2824'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:40:00.803516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '359d8cf8-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10634.881550581, 'message_signature': 'ac9bc2d89f319973292ee140e70d2efc195c4e62d93de5d5f7ee170e9e0d628a'}]}, 'timestamp': '2025-11-28 09:40:00.804375', '_unique_id': '7f4f9d401c6e400f904b60171fc483ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:40:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:40:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:40:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:40:01 np0005538513.localdomain podman[248989]: 2025-11-28 09:40:01.861814841 +0000 UTC m=+0.090075256 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:40:01 np0005538513.localdomain podman[248989]: 2025-11-28 09:40:01.875381646 +0000 UTC m=+0.103642101 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:40:01 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:40:02 np0005538513.localdomain sshd[249013]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:40:02 np0005538513.localdomain sshd[249013]: Accepted publickey for zuul from 192.168.122.30 port 38510 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:40:02 np0005538513.localdomain systemd-logind[764]: New session 57 of user zuul.
Nov 28 09:40:02 np0005538513.localdomain systemd[1]: Started Session 57 of User zuul.
Nov 28 09:40:02 np0005538513.localdomain sshd[249013]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:40:02 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:02.346 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32713 DF PROTO=TCP SPT=36404 DPT=9102 SEQ=1545759116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2428750000000001030307) 
Nov 28 09:40:03 np0005538513.localdomain sudo[249124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vismtkeazwdievsfwvxbbeojbrtyqsag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322802.2998397-26-3299558220653/AnsiballZ_file.py
Nov 28 09:40:03 np0005538513.localdomain sudo[249124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:03 np0005538513.localdomain python3.9[249126]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:03 np0005538513.localdomain sudo[249124]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32714 DF PROTO=TCP SPT=36404 DPT=9102 SEQ=1545759116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB242C830000000001030307) 
Nov 28 09:40:03 np0005538513.localdomain sudo[249234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llsrnahvwadjjkboxyyihvsldttufjjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322803.451343-26-222092847437806/AnsiballZ_file.py
Nov 28 09:40:03 np0005538513.localdomain sudo[249234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:03 np0005538513.localdomain python3.9[249236]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:03 np0005538513.localdomain sudo[249234]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:04 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25739 DF PROTO=TCP SPT=34254 DPT=9102 SEQ=480635340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB242F820000000001030307) 
Nov 28 09:40:04 np0005538513.localdomain sudo[249344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxctfcuymqhcjaqlvgqqvdgrtjqnkxkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322804.0536323-26-64641477631959/AnsiballZ_file.py
Nov 28 09:40:04 np0005538513.localdomain sudo[249344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:40:04 np0005538513.localdomain systemd[1]: tmp-crun.ad2Ugg.mount: Deactivated successfully.
Nov 28 09:40:04 np0005538513.localdomain podman[249347]: 2025-11-28 09:40:04.391600546 +0000 UTC m=+0.085644145 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:40:04 np0005538513.localdomain podman[249347]: 2025-11-28 09:40:04.403437625 +0000 UTC m=+0.097481224 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 09:40:04 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:40:04 np0005538513.localdomain python3.9[249346]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:04 np0005538513.localdomain sudo[249344]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:05 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:05.247 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:05 np0005538513.localdomain python3.9[249473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32715 DF PROTO=TCP SPT=36404 DPT=9102 SEQ=1545759116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2434820000000001030307) 
Nov 28 09:40:06 np0005538513.localdomain python3.9[249559]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322804.7000277-104-76802868233118/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24214 DF PROTO=TCP SPT=41794 DPT=9102 SEQ=1762287142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2437820000000001030307) 
Nov 28 09:40:06 np0005538513.localdomain python3.9[249667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:07 np0005538513.localdomain python3.9[249753]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322806.2667172-149-155126410404315/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:07 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:07.348 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:07 np0005538513.localdomain python3.9[249861]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:08 np0005538513.localdomain python3.9[249947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322807.3855577-149-89824399989012/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:08 np0005538513.localdomain python3.9[250055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:09 np0005538513.localdomain python3.9[250141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322808.5060346-149-275524779201054/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=9db3e6bc777692b5f6ddca8df1bc6c670a364e19 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32716 DF PROTO=TCP SPT=36404 DPT=9102 SEQ=1545759116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2444420000000001030307) 
Nov 28 09:40:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:40:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:40:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:40:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144351 "" "Go-http-client/1.1"
Nov 28 09:40:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:40:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16350 "" "Go-http-client/1.1"
Nov 28 09:40:10 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:10.250 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:10 np0005538513.localdomain python3.9[250253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:11 np0005538513.localdomain python3.9[250339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322810.4261882-323-143548155481683/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=ecea6b6701c9be6f1d83be82edd3c16fe40b7bb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:12 np0005538513.localdomain python3.9[250447]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:40:12 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:12.385 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:13 np0005538513.localdomain sudo[250467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:40:13 np0005538513.localdomain sudo[250467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:40:13 np0005538513.localdomain sudo[250467]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:13 np0005538513.localdomain auditd[725]: Audit daemon rotating log files
Nov 28 09:40:13 np0005538513.localdomain sudo[250501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:40:13 np0005538513.localdomain sudo[250501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:40:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:40:13 np0005538513.localdomain podman[250542]: 2025-11-28 09:40:13.461441329 +0000 UTC m=+0.096141142 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Nov 28 09:40:13 np0005538513.localdomain podman[250542]: 2025-11-28 09:40:13.477397139 +0000 UTC m=+0.112097002 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container)
Nov 28 09:40:13 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:40:13 np0005538513.localdomain sudo[250611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxhpnhsvsvampgrptgwmckwslpnedtfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322813.251021-395-188738674116316/AnsiballZ_file.py
Nov 28 09:40:13 np0005538513.localdomain sudo[250611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:13 np0005538513.localdomain python3.9[250613]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:13 np0005538513.localdomain sudo[250611]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:14 np0005538513.localdomain sudo[250501]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:14 np0005538513.localdomain sudo[250752]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzodetxkfbpwxhzfakknjpmhlwxrsgkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322813.918402-419-216684196945618/AnsiballZ_stat.py
Nov 28 09:40:14 np0005538513.localdomain sudo[250752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:14 np0005538513.localdomain python3.9[250754]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:14 np0005538513.localdomain sudo[250752]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:14 np0005538513.localdomain sudo[250757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:40:14 np0005538513.localdomain sudo[250757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:40:14 np0005538513.localdomain sudo[250757]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:14 np0005538513.localdomain sudo[250827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzsoskvuvgwzbrqhbtwobyouapcevlkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322813.918402-419-216684196945618/AnsiballZ_file.py
Nov 28 09:40:14 np0005538513.localdomain sudo[250827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:14 np0005538513.localdomain python3.9[250829]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:14 np0005538513.localdomain sudo[250827]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:15 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:15.252 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:15 np0005538513.localdomain sudo[250937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuahlxdgtqheshsdllyflzukepsbaymu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322815.5201952-419-130087674447153/AnsiballZ_stat.py
Nov 28 09:40:15 np0005538513.localdomain sudo[250937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:15 np0005538513.localdomain python3.9[250939]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:15 np0005538513.localdomain sudo[250937]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:16 np0005538513.localdomain sudo[250994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jeuaahcybosouupsdnzbmpdjticpcitz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322815.5201952-419-130087674447153/AnsiballZ_file.py
Nov 28 09:40:16 np0005538513.localdomain sudo[250994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:16 np0005538513.localdomain python3.9[250996]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:16 np0005538513.localdomain sudo[250994]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:17 np0005538513.localdomain sudo[251104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uotlguryyntwtcdhxcblllrwpspwhqjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322816.788908-488-170499656705841/AnsiballZ_file.py
Nov 28 09:40:17 np0005538513.localdomain sudo[251104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:17 np0005538513.localdomain python3.9[251106]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:17 np0005538513.localdomain sudo[251104]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:17 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:17.388 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:17 np0005538513.localdomain sudo[251214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npicteluvhgmwfdmjaftchnfizofsutw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322817.445957-512-101072681454298/AnsiballZ_stat.py
Nov 28 09:40:17 np0005538513.localdomain sudo[251214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:17 np0005538513.localdomain python3.9[251216]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:17 np0005538513.localdomain sudo[251214]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:18 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32717 DF PROTO=TCP SPT=36404 DPT=9102 SEQ=1545759116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2465820000000001030307) 
Nov 28 09:40:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:40:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:40:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:40:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:40:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:40:18 np0005538513.localdomain sudo[251271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxxsermjtwukubyookkstrbwcvxrskew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322817.445957-512-101072681454298/AnsiballZ_file.py
Nov 28 09:40:18 np0005538513.localdomain sudo[251271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:18 np0005538513.localdomain python3.9[251273]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:18 np0005538513.localdomain sudo[251271]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:18 np0005538513.localdomain sudo[251381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exfongdfdsohtytwxbrchhyjwcdythaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322818.6937044-548-26671760198153/AnsiballZ_stat.py
Nov 28 09:40:18 np0005538513.localdomain sudo[251381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:19 np0005538513.localdomain python3.9[251383]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:19 np0005538513.localdomain sudo[251381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:19 np0005538513.localdomain sudo[251438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gflhkbklqyswjlmrjxakjncgdyqrdeix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322818.6937044-548-26671760198153/AnsiballZ_file.py
Nov 28 09:40:19 np0005538513.localdomain sudo[251438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:19 np0005538513.localdomain python3.9[251440]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:19 np0005538513.localdomain sudo[251438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:20 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:20.287 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:20 np0005538513.localdomain sudo[251548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kojpbfnrpzfxuadwsvujrjiiwxxgmhit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322819.849157-584-14360088025036/AnsiballZ_systemd.py
Nov 28 09:40:20 np0005538513.localdomain sudo[251548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: tmp-crun.C8PmUz.mount: Deactivated successfully.
Nov 28 09:40:20 np0005538513.localdomain podman[251551]: 2025-11-28 09:40:20.579894326 +0000 UTC m=+0.090939395 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:40:20 np0005538513.localdomain podman[251551]: 2025-11-28 09:40:20.592483389 +0000 UTC m=+0.103528468 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:40:20 np0005538513.localdomain python3.9[251550]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:40:20 np0005538513.localdomain systemd-sysv-generator[251598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:20 np0005538513.localdomain systemd-rc-local-generator[251594]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:21 np0005538513.localdomain sudo[251548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:21 np0005538513.localdomain sudo[251718]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trrpileokiihxuyorncdszvsweybzjxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322821.3901625-608-38387508916314/AnsiballZ_stat.py
Nov 28 09:40:21 np0005538513.localdomain sudo[251718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:21 np0005538513.localdomain python3.9[251720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:21 np0005538513.localdomain sudo[251718]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:22 np0005538513.localdomain sudo[251775]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqeqgrvopfgarlkwbaecewucldcuakrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322821.3901625-608-38387508916314/AnsiballZ_file.py
Nov 28 09:40:22 np0005538513.localdomain sudo[251775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:22 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:22.422 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:22 np0005538513.localdomain python3.9[251777]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:22 np0005538513.localdomain sudo[251775]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:40:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:40:22 np0005538513.localdomain podman[251841]: 2025-11-28 09:40:22.857178461 +0000 UTC m=+0.085555962 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 09:40:22 np0005538513.localdomain podman[251847]: 2025-11-28 09:40:22.90304032 +0000 UTC m=+0.128405155 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 09:40:22 np0005538513.localdomain podman[251847]: 2025-11-28 09:40:22.933209587 +0000 UTC m=+0.158574422 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:40:22 np0005538513.localdomain sudo[251927]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmkcfthjbwufsndqdqvbxcpninerenur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322822.658669-644-88803141716364/AnsiballZ_stat.py
Nov 28 09:40:22 np0005538513.localdomain sudo[251927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:22 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:40:22 np0005538513.localdomain podman[251841]: 2025-11-28 09:40:22.987357921 +0000 UTC m=+0.215735462 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:40:22 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:40:23 np0005538513.localdomain python3.9[251929]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:23 np0005538513.localdomain sudo[251927]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:23 np0005538513.localdomain sudo[251984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwtwoikqdjkowmqkgaokxofqbwkjkuqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322822.658669-644-88803141716364/AnsiballZ_file.py
Nov 28 09:40:23 np0005538513.localdomain sudo[251984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:23 np0005538513.localdomain python3.9[251986]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:23 np0005538513.localdomain sudo[251984]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:24 np0005538513.localdomain sudo[252094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouyfcukzmbnmbstzivogbowciqesschv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322823.8115697-680-259353940301563/AnsiballZ_systemd.py
Nov 28 09:40:24 np0005538513.localdomain sudo[252094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:24 np0005538513.localdomain python3.9[252096]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:40:24 np0005538513.localdomain podman[252098]: 2025-11-28 09:40:24.698075146 +0000 UTC m=+0.098140155 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:40:24 np0005538513.localdomain systemd-rc-local-generator[252135]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:24 np0005538513.localdomain systemd-sysv-generator[252141]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:24 np0005538513.localdomain podman[252098]: 2025-11-28 09:40:24.742560971 +0000 UTC m=+0.142625970 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:40:25 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:25.289 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:26 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:40:26 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:40:26 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:40:26 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:40:26 np0005538513.localdomain sudo[252094]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:26 np0005538513.localdomain sudo[252265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gswizpuqokhjuznqgpaanrtnmfhgzyel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322826.5226562-710-174936808512677/AnsiballZ_file.py
Nov 28 09:40:26 np0005538513.localdomain sudo[252265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:27 np0005538513.localdomain python3.9[252267]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:27 np0005538513.localdomain sudo[252265]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:27 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:27.425 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:27 np0005538513.localdomain sudo[252375]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pldtassjrmadnzvvdjdghvxgpdxlcxmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322827.250702-734-117592992469740/AnsiballZ_stat.py
Nov 28 09:40:27 np0005538513.localdomain sudo[252375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:27 np0005538513.localdomain python3.9[252377]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:27 np0005538513.localdomain sudo[252375]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:28 np0005538513.localdomain sudo[252463]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjytrpcfmksjfnhnkyyhoufqcjwgxmnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322827.250702-734-117592992469740/AnsiballZ_copy.py
Nov 28 09:40:28 np0005538513.localdomain sudo[252463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:28 np0005538513.localdomain python3.9[252465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322827.250702-734-117592992469740/.source.json _original_basename=.40kk4yn5 follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:28 np0005538513.localdomain sudo[252463]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:28 np0005538513.localdomain sudo[252573]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvhojvdkxeghqmiqhomrcwocxkzvtkrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322828.454183-779-180293696729152/AnsiballZ_file.py
Nov 28 09:40:28 np0005538513.localdomain sudo[252573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:28 np0005538513.localdomain python3.9[252575]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:28 np0005538513.localdomain sudo[252573]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:29 np0005538513.localdomain sudo[252683]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-accypfxjcnvdovgitxyrwooytztapifd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322829.1786597-803-24530812686152/AnsiballZ_stat.py
Nov 28 09:40:29 np0005538513.localdomain sudo[252683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:29 np0005538513.localdomain sudo[252683]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:30 np0005538513.localdomain sudo[252771]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyletmpycbvxocojpjvdilvcapafbouc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322829.1786597-803-24530812686152/AnsiballZ_copy.py
Nov 28 09:40:30 np0005538513.localdomain sudo[252771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:30 np0005538513.localdomain sudo[252771]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:30 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:30.331 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:31 np0005538513.localdomain sudo[252881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpbiwfpacuitwfpmtscibjnrqdvtxjcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322830.6835861-854-276524362636363/AnsiballZ_container_config_data.py
Nov 28 09:40:31 np0005538513.localdomain sudo[252881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:31 np0005538513.localdomain python3.9[252883]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Nov 28 09:40:31 np0005538513.localdomain sudo[252881]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:32 np0005538513.localdomain sudo[252991]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azjvvmhcithdzodufmjyanjcouwngcdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322831.575327-881-243470004867029/AnsiballZ_container_config_hash.py
Nov 28 09:40:32 np0005538513.localdomain sudo[252991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:40:32 np0005538513.localdomain systemd[1]: tmp-crun.S5aXP1.mount: Deactivated successfully.
Nov 28 09:40:32 np0005538513.localdomain podman[252994]: 2025-11-28 09:40:32.133294751 +0000 UTC m=+0.081613025 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:40:32 np0005538513.localdomain podman[252994]: 2025-11-28 09:40:32.167454445 +0000 UTC m=+0.115772689 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:40:32 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:40:32 np0005538513.localdomain python3.9[252993]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:40:32 np0005538513.localdomain sudo[252991]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22064 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB249DA50000000001030307) 
Nov 28 09:40:32 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:32.461 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:32 np0005538513.localdomain sudo[253124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzftqxarnpndmsfvyppksejcijlbosqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322832.5090232-908-99557445247081/AnsiballZ_podman_container_info.py
Nov 28 09:40:32 np0005538513.localdomain sudo[253124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:33 np0005538513.localdomain python3.9[253126]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:40:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22065 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB24A1C30000000001030307) 
Nov 28 09:40:33 np0005538513.localdomain sudo[253124]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32718 DF PROTO=TCP SPT=36404 DPT=9102 SEQ=1545759116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB24A5820000000001030307) 
Nov 28 09:40:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:40:34 np0005538513.localdomain podman[253171]: 2025-11-28 09:40:34.843627128 +0000 UTC m=+0.076043977 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:40:34 np0005538513.localdomain podman[253171]: 2025-11-28 09:40:34.860332234 +0000 UTC m=+0.092749063 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Nov 28 09:40:34 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:40:35 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:35.335 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22066 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB24A9C20000000001030307) 
Nov 28 09:40:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25740 DF PROTO=TCP SPT=34254 DPT=9102 SEQ=480635340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB24AD820000000001030307) 
Nov 28 09:40:37 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:37.466 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:38 np0005538513.localdomain sudo[253282]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkibaenqumbkapljzpaxmslaohnuenst ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322836.7285266-947-103925836384158/AnsiballZ_edpm_container_manage.py
Nov 28 09:40:38 np0005538513.localdomain sudo[253282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:38 np0005538513.localdomain python3[253284]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:40:38 np0005538513.localdomain podman[253322]: 
Nov 28 09:40:38 np0005538513.localdomain podman[253322]: 2025-11-28 09:40:38.7551576 +0000 UTC m=+0.089316950 container create d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 09:40:38 np0005538513.localdomain podman[253322]: 2025-11-28 09:40:38.705615921 +0000 UTC m=+0.039775281 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 28 09:40:38 np0005538513.localdomain python3[253284]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 28 09:40:38 np0005538513.localdomain sudo[253282]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:39 np0005538513.localdomain sudo[253467]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmmteegjrljaiyrcpdmnmounxjbhjslz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322839.1650994-971-276699862468671/AnsiballZ_stat.py
Nov 28 09:40:39 np0005538513.localdomain sudo[253467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22067 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB24B9820000000001030307) 
Nov 28 09:40:39 np0005538513.localdomain python3.9[253469]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:40:39 np0005538513.localdomain sudo[253467]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:40:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:40:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:40:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146306 "" "Go-http-client/1.1"
Nov 28 09:40:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:40:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16664 "" "Go-http-client/1.1"
Nov 28 09:40:40 np0005538513.localdomain sudo[253579]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjcqppmvenanddjjhbetbiqxfwaplsyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322839.9788609-998-258600062947025/AnsiballZ_file.py
Nov 28 09:40:40 np0005538513.localdomain sudo[253579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:40 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:40.377 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:40 np0005538513.localdomain python3.9[253581]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:40 np0005538513.localdomain sudo[253579]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:40 np0005538513.localdomain sudo[253634]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aurklpsgupazkaomxaxjjtltklbxqquv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322839.9788609-998-258600062947025/AnsiballZ_stat.py
Nov 28 09:40:40 np0005538513.localdomain sudo[253634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:40 np0005538513.localdomain python3.9[253636]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:40:40 np0005538513.localdomain sudo[253634]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:41 np0005538513.localdomain sudo[253743]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofhnkpyzhuwfprfdwgqnvojzrszaypvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.9776845-998-185668980442825/AnsiballZ_copy.py
Nov 28 09:40:41 np0005538513.localdomain sudo[253743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:41 np0005538513.localdomain python3.9[253745]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322840.9776845-998-185668980442825/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:41 np0005538513.localdomain sudo[253743]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:41 np0005538513.localdomain sudo[253798]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqvcbbvvuqoykcjhdzutrqytayiqgwqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.9776845-998-185668980442825/AnsiballZ_systemd.py
Nov 28 09:40:41 np0005538513.localdomain sudo[253798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:42 np0005538513.localdomain python3.9[253800]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:40:42 np0005538513.localdomain systemd-rc-local-generator[253821]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:42 np0005538513.localdomain systemd-sysv-generator[253825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:42.469 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:42 np0005538513.localdomain sudo[253798]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:42 np0005538513.localdomain sudo[253888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvpcdnugthucpahpfyaqlynymnytbhiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.9776845-998-185668980442825/AnsiballZ_systemd.py
Nov 28 09:40:42 np0005538513.localdomain sudo[253888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:43 np0005538513.localdomain python3.9[253890]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:40:43 np0005538513.localdomain systemd-rc-local-generator[253916]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:43 np0005538513.localdomain systemd-sysv-generator[253919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: Starting neutron_sriov_agent container...
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:40:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94d6fef9e693b65b9b1947b90220142a43ebab030c99d7a015e183c1f5152e1a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94d6fef9e693b65b9b1947b90220142a43ebab030c99d7a015e183c1f5152e1a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:43 np0005538513.localdomain podman[253931]: 2025-11-28 09:40:43.829003131 +0000 UTC m=+0.126917970 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 28 09:40:43 np0005538513.localdomain podman[253931]: 2025-11-28 09:40:43.846342983 +0000 UTC m=+0.144257842 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:40:43 np0005538513.localdomain podman[253932]: 2025-11-28 09:40:43.862354823 +0000 UTC m=+0.160981226 container init d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:40:43 np0005538513.localdomain podman[253932]: 2025-11-28 09:40:43.87213391 +0000 UTC m=+0.170760313 container start d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=neutron_sriov_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:40:43 np0005538513.localdomain podman[253932]: neutron_sriov_agent
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + sudo -E kolla_set_configs
Nov 28 09:40:43 np0005538513.localdomain systemd[1]: Started neutron_sriov_agent container.
Nov 28 09:40:43 np0005538513.localdomain sudo[253888]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Validating config file
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Copying service configuration files
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Writing out command to execute
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: ++ cat /run_command
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + ARGS=
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + sudo kolla_copy_cacerts
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + [[ ! -n '' ]]
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + . kolla_extend_start
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + umask 0022
Nov 28 09:40:43 np0005538513.localdomain neutron_sriov_agent[253957]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:44 np0005538513.localdomain sudo[254085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vegjctkiwkaplktsmhfxfssibdhmtghb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322844.1585383-1082-102418186365671/AnsiballZ_systemd.py
Nov 28 09:40:44 np0005538513.localdomain sudo[254085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:44 np0005538513.localdomain python3.9[254088]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:40:44 np0005538513.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Nov 28 09:40:44 np0005538513.localdomain systemd[1]: tmp-crun.0OW84F.mount: Deactivated successfully.
Nov 28 09:40:44 np0005538513.localdomain systemd[1]: libpod-d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6.scope: Deactivated successfully.
Nov 28 09:40:44 np0005538513.localdomain podman[254092]: 2025-11-28 09:40:44.861617594 +0000 UTC m=+0.088301487 container died d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:40:44 np0005538513.localdomain podman[254092]: 2025-11-28 09:40:44.95486471 +0000 UTC m=+0.181548653 container cleanup d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.schema-version=1.0, config_id=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, tcib_managed=true)
Nov 28 09:40:44 np0005538513.localdomain podman[254092]: neutron_sriov_agent
Nov 28 09:40:44 np0005538513.localdomain podman[254105]: 2025-11-28 09:40:44.956988179 +0000 UTC m=+0.093702582 container cleanup d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:40:45 np0005538513.localdomain podman[254119]: 2025-11-28 09:40:45.025313676 +0000 UTC m=+0.037429095 container cleanup d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_managed=true, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent)
Nov 28 09:40:45 np0005538513.localdomain podman[254119]: neutron_sriov_agent
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: Starting neutron_sriov_agent container...
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:40:45 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94d6fef9e693b65b9b1947b90220142a43ebab030c99d7a015e183c1f5152e1a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:45 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94d6fef9e693b65b9b1947b90220142a43ebab030c99d7a015e183c1f5152e1a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:45 np0005538513.localdomain podman[254131]: 2025-11-28 09:40:45.15205383 +0000 UTC m=+0.099646515 container init d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=neutron_sriov_agent, org.label-schema.vendor=CentOS)
Nov 28 09:40:45 np0005538513.localdomain podman[254131]: 2025-11-28 09:40:45.159379348 +0000 UTC m=+0.106972063 container start d32a1c6ff574ca4f6aa0b39a9fbf199eefa3a2abe55188a744806476ffe220c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2960208a5229c9e98585e4d0f71ed9dd7186c4f5218764069851d2bb4586b424'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:40:45 np0005538513.localdomain podman[254131]: neutron_sriov_agent
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + sudo -E kolla_set_configs
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: Started neutron_sriov_agent container.
Nov 28 09:40:45 np0005538513.localdomain sudo[254085]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Validating config file
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Copying service configuration files
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Writing out command to execute
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: ++ cat /run_command
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + ARGS=
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + sudo kolla_copy_cacerts
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + [[ ! -n '' ]]
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + . kolla_extend_start
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + umask 0022
Nov 28 09:40:45 np0005538513.localdomain neutron_sriov_agent[254147]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:45.381 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:45.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:45 np0005538513.localdomain sshd[249013]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: session-57.scope: Consumed 23.315s CPU time.
Nov 28 09:40:45 np0005538513.localdomain systemd-logind[764]: Session 57 logged out. Waiting for processes to exit.
Nov 28 09:40:45 np0005538513.localdomain systemd-logind[764]: Removed session 57.
Nov 28 09:40:45 np0005538513.localdomain systemd[1]: tmp-crun.SEbZvl.mount: Deactivated successfully.
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.843 2 INFO neutron.common.config [-] Logging enabled!
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.843 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.844 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.845 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.845 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.845 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.845 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005538513.localdomain'}
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.846 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cda9fd6b-1ea8-487a-ad30-e5dcfe9d4ec2 - - - - - -] RPC agent_id: nic-switch-agent.np0005538513.localdomain
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.850 2 INFO neutron.agent.agent_extensions_manager [None req-cda9fd6b-1ea8-487a-ad30-e5dcfe9d4ec2 - - - - - -] Loaded agent extensions: ['qos']
Nov 28 09:40:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:46.850 2 INFO neutron.agent.agent_extensions_manager [None req-cda9fd6b-1ea8-487a-ad30-e5dcfe9d4ec2 - - - - - -] Initializing agent extension 'qos'
Nov 28 09:40:47 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:47.246 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cda9fd6b-1ea8-487a-ad30-e5dcfe9d4ec2 - - - - - -] Agent initialized successfully, now running... 
Nov 28 09:40:47 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:47.247 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cda9fd6b-1ea8-487a-ad30-e5dcfe9d4ec2 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Nov 28 09:40:47 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 09:40:47.248 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-cda9fd6b-1ea8-487a-ad30-e5dcfe9d4ec2 - - - - - -] Agent out of sync with plugin!
Nov 28 09:40:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:47.470 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22068 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB24D9820000000001030307) 
Nov 28 09:40:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:40:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:40:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:40:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:40:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:40:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.424 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.699 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.701 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:40:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:50.701 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:40:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:40:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:40:50.817 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:40:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:40:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:40:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:40:50.819 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:40:50 np0005538513.localdomain systemd[1]: tmp-crun.KOwvYD.mount: Deactivated successfully.
Nov 28 09:40:50 np0005538513.localdomain podman[254181]: 2025-11-28 09:40:50.863547946 +0000 UTC m=+0.093915760 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:40:50 np0005538513.localdomain podman[254181]: 2025-11-28 09:40:50.876345661 +0000 UTC m=+0.106713485 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:40:50 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.230 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.316 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.317 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.519 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.521 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12172MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.522 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.523 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.591 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.592 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.592 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:40:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:51.644 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:40:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:52.113 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:40:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:52.120 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:40:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:52.137 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:40:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:52.140 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:40:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:52.141 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:40:52 np0005538513.localdomain sshd[254248]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:40:52 np0005538513.localdomain sshd[254248]: Accepted publickey for zuul from 192.168.122.30 port 38254 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:40:52 np0005538513.localdomain systemd-logind[764]: New session 58 of user zuul.
Nov 28 09:40:52 np0005538513.localdomain systemd[1]: Started Session 58 of User zuul.
Nov 28 09:40:52 np0005538513.localdomain sshd[254248]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:40:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:52.498 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:53 np0005538513.localdomain python3.9[254359]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:40:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:40:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:40:53 np0005538513.localdomain systemd[1]: tmp-crun.GaC8rn.mount: Deactivated successfully.
Nov 28 09:40:53 np0005538513.localdomain podman[254382]: 2025-11-28 09:40:53.864451729 +0000 UTC m=+0.091469449 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:40:53 np0005538513.localdomain podman[254382]: 2025-11-28 09:40:53.871884561 +0000 UTC m=+0.098902271 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:40:53 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:40:53 np0005538513.localdomain systemd[1]: tmp-crun.cedtuU.mount: Deactivated successfully.
Nov 28 09:40:53 np0005538513.localdomain podman[254381]: 2025-11-28 09:40:53.963395761 +0000 UTC m=+0.192548510 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 28 09:40:54 np0005538513.localdomain podman[254381]: 2025-11-28 09:40:54.037462885 +0000 UTC m=+0.266615684 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 09:40:54 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.140 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.143 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.168 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.168 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.169 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:40:54 np0005538513.localdomain sudo[254511]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blrotwrjpydavhvwiepnnlkrjawmiecb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322854.1841128-65-124441729461107/AnsiballZ_setup.py
Nov 28 09:40:54 np0005538513.localdomain sudo[254511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.552 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.553 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.553 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:40:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:54.554 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:40:54 np0005538513.localdomain python3.9[254513]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.012 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:40:55 np0005538513.localdomain sudo[254511]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.163 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.164 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.165 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.165 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.166 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.166 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:40:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:55.427 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:55 np0005538513.localdomain sudo[254574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glwswjvdzuwkbpfmzrajlxtmtvqqcgfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322854.1841128-65-124441729461107/AnsiballZ_dnf.py
Nov 28 09:40:55 np0005538513.localdomain sudo[254574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:40:55 np0005538513.localdomain podman[254577]: 2025-11-28 09:40:55.652245322 +0000 UTC m=+0.083286613 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS)
Nov 28 09:40:55 np0005538513.localdomain podman[254577]: 2025-11-28 09:40:55.663324211 +0000 UTC m=+0.094365582 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:40:55 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:40:55 np0005538513.localdomain python3.9[254576]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:40:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:57.502 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:40:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:40:58.683 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:58 np0005538513.localdomain sudo[254574]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:59 np0005538513.localdomain sudo[254706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swhfmsdvcfytpjzzohtynursgtovcvve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322859.1221461-101-127119650357654/AnsiballZ_systemd.py
Nov 28 09:40:59 np0005538513.localdomain sudo[254706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:00 np0005538513.localdomain python3.9[254708]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:41:00 np0005538513.localdomain sudo[254706]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:00 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:00.462 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:00 np0005538513.localdomain sudo[254819]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmdkgjbthmjodxabjzrtebsjfxgtdbdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322860.2999852-128-216608877011957/AnsiballZ_file.py
Nov 28 09:41:00 np0005538513.localdomain sudo[254819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:00 np0005538513.localdomain python3.9[254821]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:00 np0005538513.localdomain sudo[254819]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:01 np0005538513.localdomain sudo[254929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmlllxmnaoyehiurmobynidemyboqqjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322861.1182818-128-2278634595838/AnsiballZ_file.py
Nov 28 09:41:01 np0005538513.localdomain sudo[254929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:01 np0005538513.localdomain python3.9[254931]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:01 np0005538513.localdomain sudo[254929]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:02 np0005538513.localdomain sudo[255039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwvpagvedxkvupdemhyozckrivmhtxtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322861.7519732-128-277165250000040/AnsiballZ_file.py
Nov 28 09:41:02 np0005538513.localdomain sudo[255039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:02 np0005538513.localdomain python3.9[255041]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:02 np0005538513.localdomain sudo[255039]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40556 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2512D40000000001030307) 
Nov 28 09:41:02 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:02.533 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:02 np0005538513.localdomain sudo[255149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idairunqwujzmvsceuazkmvhtidssxrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322862.3772094-128-36602724735653/AnsiballZ_file.py
Nov 28 09:41:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:41:02 np0005538513.localdomain sudo[255149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:02 np0005538513.localdomain systemd[1]: tmp-crun.O8jC6j.mount: Deactivated successfully.
Nov 28 09:41:02 np0005538513.localdomain podman[255151]: 2025-11-28 09:41:02.787841347 +0000 UTC m=+0.098037053 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:41:02 np0005538513.localdomain podman[255151]: 2025-11-28 09:41:02.800308581 +0000 UTC m=+0.110504287 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:41:02 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:41:02 np0005538513.localdomain python3.9[255152]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:02 np0005538513.localdomain sudo[255149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40557 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2516C20000000001030307) 
Nov 28 09:41:04 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22069 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2519820000000001030307) 
Nov 28 09:41:04 np0005538513.localdomain sudo[255283]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whszjtsovfywkwtteqjwxhjdhnerqfoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322863.8313823-128-86858516941573/AnsiballZ_file.py
Nov 28 09:41:04 np0005538513.localdomain sudo[255283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:04 np0005538513.localdomain python3.9[255285]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:04 np0005538513.localdomain sudo[255283]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:04 np0005538513.localdomain sudo[255393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fykeirrccxfvcojqgwmybxllirjxhhbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322864.4863398-128-62304766411023/AnsiballZ_file.py
Nov 28 09:41:04 np0005538513.localdomain sudo[255393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:04 np0005538513.localdomain python3.9[255395]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:04 np0005538513.localdomain sudo[255393]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:05 np0005538513.localdomain sudo[255503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qikatrwxrwoxhfigbeeoenpliisckdcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322865.0905702-128-70604602955937/AnsiballZ_file.py
Nov 28 09:41:05 np0005538513.localdomain sudo[255503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:41:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40558 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB251EC20000000001030307) 
Nov 28 09:41:05 np0005538513.localdomain podman[255506]: 2025-11-28 09:41:05.459755283 +0000 UTC m=+0.096811733 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:41:05 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:05.466 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:05 np0005538513.localdomain podman[255506]: 2025-11-28 09:41:05.474322506 +0000 UTC m=+0.111378936 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:41:05 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:41:05 np0005538513.localdomain python3.9[255505]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:05 np0005538513.localdomain sudo[255503]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32719 DF PROTO=TCP SPT=36404 DPT=9102 SEQ=1545759116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2523820000000001030307) 
Nov 28 09:41:06 np0005538513.localdomain sudo[255632]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxuuzmrtunbsubnonxnsbmeaambssicb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322866.4977038-278-176888053733553/AnsiballZ_stat.py
Nov 28 09:41:06 np0005538513.localdomain sudo[255632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:07 np0005538513.localdomain python3.9[255634]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:07 np0005538513.localdomain sudo[255632]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:07 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:07.536 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:07 np0005538513.localdomain sudo[255720]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmwaifoederonxysxykpovbxltadjoln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322866.4977038-278-176888053733553/AnsiballZ_copy.py
Nov 28 09:41:07 np0005538513.localdomain sudo[255720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:07 np0005538513.localdomain python3.9[255722]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322866.4977038-278-176888053733553/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:07 np0005538513.localdomain sudo[255720]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:08 np0005538513.localdomain python3.9[255830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:09 np0005538513.localdomain python3.9[255916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322868.1180243-323-134386153477879/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40559 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB252E820000000001030307) 
Nov 28 09:41:09 np0005538513.localdomain python3.9[256024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:41:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:41:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:41:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146307 "" "Go-http-client/1.1"
Nov 28 09:41:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:41:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16784 "" "Go-http-client/1.1"
Nov 28 09:41:10 np0005538513.localdomain python3.9[256110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322869.266755-323-210375084791812/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:10 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:10.505 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:10 np0005538513.localdomain python3.9[256218]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:11 np0005538513.localdomain python3.9[256304]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322870.4484112-323-52438634521212/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=9ed181e64410eef7eb78d577152a610102a86592 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:12 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:12.539 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:12 np0005538513.localdomain python3.9[256412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:13 np0005538513.localdomain python3.9[256498]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322872.3160498-497-258555538557523/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=ecea6b6701c9be6f1d83be82edd3c16fe40b7bb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:14 np0005538513.localdomain python3.9[256606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:14 np0005538513.localdomain rsyslogd[759]: imjournal: 6598 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Nov 28 09:41:14 np0005538513.localdomain python3.9[256692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322873.582973-542-34120828442781/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:41:14 np0005538513.localdomain sudo[256754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:41:14 np0005538513.localdomain sudo[256754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:41:14 np0005538513.localdomain sudo[256754]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:14 np0005538513.localdomain podman[256748]: 2025-11-28 09:41:14.850005722 +0000 UTC m=+0.083166410 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Nov 28 09:41:14 np0005538513.localdomain podman[256748]: 2025-11-28 09:41:14.888185131 +0000 UTC m=+0.121345809 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm)
Nov 28 09:41:14 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:41:14 np0005538513.localdomain sudo[256809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:41:14 np0005538513.localdomain sudo[256809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:41:15 np0005538513.localdomain python3.9[256856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:15 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:15.507 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:15 np0005538513.localdomain sudo[256809]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:15 np0005538513.localdomain python3.9[256961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322874.6889603-542-233357954995091/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:16 np0005538513.localdomain python3.9[257081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:17 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:17.540 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:17 np0005538513.localdomain python3.9[257136]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40560 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB254F820000000001030307) 
Nov 28 09:41:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:41:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:41:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:41:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:41:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:41:18 np0005538513.localdomain python3.9[257244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:18 np0005538513.localdomain sudo[257245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:41:18 np0005538513.localdomain sudo[257245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:41:18 np0005538513.localdomain sudo[257245]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:18 np0005538513.localdomain python3.9[257348]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322877.7987523-629-127520689412943/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:20 np0005538513.localdomain python3.9[257456]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:41:20 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:20.540 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:20 np0005538513.localdomain sudo[257566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrkquvbklihoodgxxqophqcapbtkazkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322880.390138-734-7013538152441/AnsiballZ_file.py
Nov 28 09:41:20 np0005538513.localdomain sudo[257566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:20 np0005538513.localdomain python3.9[257568]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:20 np0005538513.localdomain sudo[257566]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:21 np0005538513.localdomain sudo[257676]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmlroxnfhlucfaiwaakxmliausfvpqqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322881.0984495-758-147467373019950/AnsiballZ_stat.py
Nov 28 09:41:21 np0005538513.localdomain sudo[257676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:41:21 np0005538513.localdomain podman[257679]: 2025-11-28 09:41:21.474649704 +0000 UTC m=+0.083098658 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:41:21 np0005538513.localdomain podman[257679]: 2025-11-28 09:41:21.482681155 +0000 UTC m=+0.091130159 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:41:21 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:41:21 np0005538513.localdomain python3.9[257678]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:21 np0005538513.localdomain sudo[257676]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:21 np0005538513.localdomain sudo[257756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjbavoccsbtdrgjeytwgomjkypcamvow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322881.0984495-758-147467373019950/AnsiballZ_file.py
Nov 28 09:41:21 np0005538513.localdomain sudo[257756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:22 np0005538513.localdomain python3.9[257758]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:22 np0005538513.localdomain sudo[257756]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:22 np0005538513.localdomain sudo[257866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sklchoslsynpmpbifzitjfamwbloeurl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322882.2040532-758-261864202362507/AnsiballZ_stat.py
Nov 28 09:41:22 np0005538513.localdomain sudo[257866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:22 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:22.562 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:22 np0005538513.localdomain python3.9[257868]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:22 np0005538513.localdomain sudo[257866]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:22 np0005538513.localdomain sudo[257923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efqdoglfdutqszuvqhetjyecvyirvvjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322882.2040532-758-261864202362507/AnsiballZ_file.py
Nov 28 09:41:22 np0005538513.localdomain sudo[257923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:23 np0005538513.localdomain python3.9[257925]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:23 np0005538513.localdomain sudo[257923]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:23 np0005538513.localdomain sudo[258033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oronguwbiiivydjmazvvoxfoxfawujgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322883.4555962-827-113158109273327/AnsiballZ_file.py
Nov 28 09:41:23 np0005538513.localdomain sudo[258033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:23 np0005538513.localdomain python3.9[258035]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:23 np0005538513.localdomain sudo[258033]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:24 np0005538513.localdomain sudo[258143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qilwznqscybqiyzpiojzwettzunvwxdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322884.1726575-851-210159405676804/AnsiballZ_stat.py
Nov 28 09:41:24 np0005538513.localdomain sudo[258143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:41:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:41:24 np0005538513.localdomain podman[258146]: 2025-11-28 09:41:24.559183002 +0000 UTC m=+0.082307431 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:41:24 np0005538513.localdomain podman[258147]: 2025-11-28 09:41:24.629496754 +0000 UTC m=+0.145790952 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:41:24 np0005538513.localdomain podman[258146]: 2025-11-28 09:41:24.639010684 +0000 UTC m=+0.162135123 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 09:41:24 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:41:24 np0005538513.localdomain podman[258147]: 2025-11-28 09:41:24.660351616 +0000 UTC m=+0.176645754 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 09:41:24 np0005538513.localdomain python3.9[258145]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:24 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:41:24 np0005538513.localdomain sudo[258143]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:24 np0005538513.localdomain sudo[258244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olonvgibywinpfnghrklentakwmkxdni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322884.1726575-851-210159405676804/AnsiballZ_file.py
Nov 28 09:41:24 np0005538513.localdomain sudo[258244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:25 np0005538513.localdomain python3.9[258246]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:25 np0005538513.localdomain sudo[258244]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:25 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:25.543 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:25 np0005538513.localdomain sudo[258354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eblyousqmwbeibgdyoqoczocmjrqzdnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322885.342551-887-254378569960856/AnsiballZ_stat.py
Nov 28 09:41:25 np0005538513.localdomain sudo[258354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:41:25 np0005538513.localdomain python3.9[258356]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:25 np0005538513.localdomain podman[258357]: 2025-11-28 09:41:25.85812528 +0000 UTC m=+0.086672024 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:41:25 np0005538513.localdomain sudo[258354]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:25 np0005538513.localdomain podman[258357]: 2025-11-28 09:41:25.89849027 +0000 UTC m=+0.127037014 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:41:25 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:41:26 np0005538513.localdomain sudo[258430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shfiutakfdtfkjeicpyxaxyoyrhbbliz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322885.342551-887-254378569960856/AnsiballZ_file.py
Nov 28 09:41:26 np0005538513.localdomain sudo[258430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:26 np0005538513.localdomain python3.9[258432]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:26 np0005538513.localdomain sudo[258430]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:26 np0005538513.localdomain sudo[258540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rowuleaohwqdlioyurcwbhrsyzygrldh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322886.4981422-923-231004244583577/AnsiballZ_systemd.py
Nov 28 09:41:26 np0005538513.localdomain sudo[258540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:27 np0005538513.localdomain python3.9[258542]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:41:27 np0005538513.localdomain systemd-rc-local-generator[258564]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:27 np0005538513.localdomain systemd-sysv-generator[258567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538513.localdomain sudo[258540]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:27 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:27.601 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:28 np0005538513.localdomain sudo[258688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biedbxfitnykksvefcwkwswhfknxbwvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322887.7671838-947-88891446970965/AnsiballZ_stat.py
Nov 28 09:41:28 np0005538513.localdomain sudo[258688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:28 np0005538513.localdomain python3.9[258690]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:28 np0005538513.localdomain sudo[258688]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:28 np0005538513.localdomain sudo[258745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cddpgkkewbcntrigeetvynewulvuxmrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322887.7671838-947-88891446970965/AnsiballZ_file.py
Nov 28 09:41:28 np0005538513.localdomain sudo[258745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:28 np0005538513.localdomain python3.9[258747]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:28 np0005538513.localdomain sudo[258745]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:29 np0005538513.localdomain sudo[258855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xebdrxitxejmvsugphxytutghhfehxmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322888.8896294-983-266444454759344/AnsiballZ_stat.py
Nov 28 09:41:29 np0005538513.localdomain sudo[258855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:29 np0005538513.localdomain python3.9[258857]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:29 np0005538513.localdomain sudo[258855]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:29 np0005538513.localdomain sudo[258912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syhtmysxkdatelftwernfbkwmnnfyyew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322888.8896294-983-266444454759344/AnsiballZ_file.py
Nov 28 09:41:29 np0005538513.localdomain sudo[258912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:29 np0005538513.localdomain python3.9[258914]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:29 np0005538513.localdomain sudo[258912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:30 np0005538513.localdomain sudo[259022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caywrwjykpivcaxpfsepnsipeoaneamk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322890.11546-1019-40424331568965/AnsiballZ_systemd.py
Nov 28 09:41:30 np0005538513.localdomain sudo[259022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:30 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:30.579 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:30 np0005538513.localdomain python3.9[259024]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:41:30 np0005538513.localdomain systemd-rc-local-generator[259046]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:30 np0005538513.localdomain systemd-sysv-generator[259053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:31 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:41:31 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:41:31 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:41:31 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:41:31 np0005538513.localdomain sudo[259022]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:32 np0005538513.localdomain sudo[259174]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wurgomiweehwhgcsootlddymqvqdgvte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322891.698-1049-130729111738370/AnsiballZ_file.py
Nov 28 09:41:32 np0005538513.localdomain sudo[259174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:32 np0005538513.localdomain python3.9[259176]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:32 np0005538513.localdomain sudo[259174]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41788 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2588040000000001030307) 
Nov 28 09:41:32 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:32.632 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:32 np0005538513.localdomain sudo[259284]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqbqwuzlrotehxvgntrtaxlehjlgwpaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322892.4152684-1073-276583804516189/AnsiballZ_stat.py
Nov 28 09:41:32 np0005538513.localdomain sudo[259284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:32 np0005538513.localdomain python3.9[259286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:32 np0005538513.localdomain sudo[259284]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:33 np0005538513.localdomain sudo[259372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-heqqmeyluyupjlbcwnkvfjxkutveiyzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322892.4152684-1073-276583804516189/AnsiballZ_copy.py
Nov 28 09:41:33 np0005538513.localdomain sudo[259372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:41:33 np0005538513.localdomain systemd[1]: tmp-crun.yUZ8UQ.mount: Deactivated successfully.
Nov 28 09:41:33 np0005538513.localdomain podman[259375]: 2025-11-28 09:41:33.370997439 +0000 UTC m=+0.094041953 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:41:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41789 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB258C020000000001030307) 
Nov 28 09:41:33 np0005538513.localdomain podman[259375]: 2025-11-28 09:41:33.419499463 +0000 UTC m=+0.142543927 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:41:33 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:41:33 np0005538513.localdomain python3.9[259374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322892.4152684-1073-276583804516189/.source.json _original_basename=.v9_fvik_ follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:33 np0005538513.localdomain sudo[259372]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:33 np0005538513.localdomain sudo[259504]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebnxskruoizcuvqtpcvuiicbsgwdglgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322893.6877885-1118-42678777982591/AnsiballZ_file.py
Nov 28 09:41:33 np0005538513.localdomain sudo[259504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40561 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB258F820000000001030307) 
Nov 28 09:41:34 np0005538513.localdomain python3.9[259506]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:34 np0005538513.localdomain sudo[259504]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:35 np0005538513.localdomain sudo[259614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utllqiwiwgredpvzvbaqtkniarksmhil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322894.8040862-1142-12960114205260/AnsiballZ_stat.py
Nov 28 09:41:35 np0005538513.localdomain sudo[259614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:35 np0005538513.localdomain sudo[259614]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41790 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2594020000000001030307) 
Nov 28 09:41:35 np0005538513.localdomain sudo[259702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aarpzanlrgkfccyzodykokcwxunzusoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322894.8040862-1142-12960114205260/AnsiballZ_copy.py
Nov 28 09:41:35 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:35.604 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:35 np0005538513.localdomain sudo[259702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:41:35 np0005538513.localdomain systemd[1]: tmp-crun.seBCVf.mount: Deactivated successfully.
Nov 28 09:41:35 np0005538513.localdomain podman[259705]: 2025-11-28 09:41:35.712491561 +0000 UTC m=+0.083478479 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:41:35 np0005538513.localdomain podman[259705]: 2025-11-28 09:41:35.748727307 +0000 UTC m=+0.119714205 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 09:41:35 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:41:35 np0005538513.localdomain sudo[259702]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22070 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2597820000000001030307) 
Nov 28 09:41:36 np0005538513.localdomain sudo[259831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szlwgyceippaocuhgfzxxtbyvjebaipw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322896.2753046-1193-199271117115567/AnsiballZ_container_config_data.py
Nov 28 09:41:36 np0005538513.localdomain sudo[259831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:36 np0005538513.localdomain python3.9[259833]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Nov 28 09:41:36 np0005538513.localdomain sudo[259831]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:37 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:37.644 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:37 np0005538513.localdomain sudo[259941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myihaeqwglnzfistqyalaajvojgzheqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322897.2447674-1220-186050922165894/AnsiballZ_container_config_hash.py
Nov 28 09:41:37 np0005538513.localdomain sudo[259941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:37 np0005538513.localdomain python3.9[259943]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:41:37 np0005538513.localdomain sudo[259941]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:38 np0005538513.localdomain sudo[260051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpmwxbetpuxasjxgkinkjqxnwqrsmywj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322898.2712924-1247-167632478180855/AnsiballZ_podman_container_info.py
Nov 28 09:41:38 np0005538513.localdomain sudo[260051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:38 np0005538513.localdomain python3.9[260053]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:41:39 np0005538513.localdomain sudo[260051]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41791 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB25A3C20000000001030307) 
Nov 28 09:41:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:41:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:41:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:41:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146307 "" "Go-http-client/1.1"
Nov 28 09:41:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:41:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16788 "" "Go-http-client/1.1"
Nov 28 09:41:40 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:40.633 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:42 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:42.659 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:44 np0005538513.localdomain sudo[260188]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfgpdbhzbqfyibjsbfcqfqmodouktevu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322903.6443117-1286-223617020933591/AnsiballZ_edpm_container_manage.py
Nov 28 09:41:44 np0005538513.localdomain sudo[260188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:44 np0005538513.localdomain python3[260190]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:41:44 np0005538513.localdomain podman[260229]: 
Nov 28 09:41:44 np0005538513.localdomain podman[260229]: 2025-11-28 09:41:44.698335756 +0000 UTC m=+0.094912111 container create 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 09:41:44 np0005538513.localdomain podman[260229]: 2025-11-28 09:41:44.648277841 +0000 UTC m=+0.044854296 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:41:44 np0005538513.localdomain python3[260190]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume 
/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:41:44 np0005538513.localdomain sudo[260188]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:45 np0005538513.localdomain sudo[260373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-comzkvlkurerfeprudfeabnfrobfvarp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322905.0703058-1310-4426655727578/AnsiballZ_stat.py
Nov 28 09:41:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:41:45 np0005538513.localdomain sudo[260373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:45 np0005538513.localdomain podman[260375]: 2025-11-28 09:41:45.461351819 +0000 UTC m=+0.083847851 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, name=ubi9-minimal, 
release=1755695350, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 28 09:41:45 np0005538513.localdomain podman[260375]: 2025-11-28 09:41:45.474806946 +0000 UTC m=+0.097302968 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Nov 28 09:41:45 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:41:45 np0005538513.localdomain python3.9[260376]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:41:45 np0005538513.localdomain sudo[260373]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:45.636 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:46 np0005538513.localdomain sudo[260505]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rypkinghquxcyoajjnngrprxxivsndhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322905.9335504-1337-85648442905521/AnsiballZ_file.py
Nov 28 09:41:46 np0005538513.localdomain sudo[260505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:46 np0005538513.localdomain python3.9[260507]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:46 np0005538513.localdomain sudo[260505]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:46 np0005538513.localdomain sudo[260560]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwbdzcwseuqvdichikmveiqbzurlxdoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322905.9335504-1337-85648442905521/AnsiballZ_stat.py
Nov 28 09:41:46 np0005538513.localdomain sudo[260560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:46.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:46 np0005538513.localdomain python3.9[260562]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:41:46 np0005538513.localdomain sudo[260560]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:47 np0005538513.localdomain sudo[260669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwsdbbebttvknrvafiswoacixvfmhzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322906.910816-1337-43812824345335/AnsiballZ_copy.py
Nov 28 09:41:47 np0005538513.localdomain sudo[260669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:47 np0005538513.localdomain python3.9[260671]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322906.910816-1337-43812824345335/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:47 np0005538513.localdomain sudo[260669]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41792 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB25C3820000000001030307) 
Nov 28 09:41:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:47.687 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:47 np0005538513.localdomain sudo[260724]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muuiyfjsxtardwjrbsibolmyzryxgzup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322906.910816-1337-43812824345335/AnsiballZ_systemd.py
Nov 28 09:41:47 np0005538513.localdomain sudo[260724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:41:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:41:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:41:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:41:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:41:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:41:48 np0005538513.localdomain python3.9[260726]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:41:48 np0005538513.localdomain systemd-rc-local-generator[260748]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:48 np0005538513.localdomain systemd-sysv-generator[260753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538513.localdomain sudo[260724]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:48 np0005538513.localdomain sudo[260815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zokiqoqjyqlamcpfmknjhxnglsuyygfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322906.910816-1337-43812824345335/AnsiballZ_systemd.py
Nov 28 09:41:48 np0005538513.localdomain sudo[260815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:49 np0005538513.localdomain python3.9[260817]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:41:49 np0005538513.localdomain systemd-rc-local-generator[260845]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:49 np0005538513.localdomain systemd-sysv-generator[260850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: tmp-crun.XmGJKY.mount: Deactivated successfully.
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:41:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:49 np0005538513.localdomain podman[260859]: 2025-11-28 09:41:49.672295966 +0000 UTC m=+0.129017508 container init 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + sudo -E kolla_set_configs
Nov 28 09:41:49 np0005538513.localdomain podman[260859]: 2025-11-28 09:41:49.688652716 +0000 UTC m=+0.145374258 container start 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent)
Nov 28 09:41:49 np0005538513.localdomain podman[260859]: neutron_dhcp_agent
Nov 28 09:41:49 np0005538513.localdomain systemd[1]: Started neutron_dhcp_agent container.
Nov 28 09:41:49 np0005538513.localdomain sudo[260815]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Validating config file
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Copying service configuration files
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Writing out command to execute
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/a99cc267a5c3ade03c88b3bb0a43299c9bb62825df6f4ca0c30c03cccfac55c1
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: ++ cat /run_command
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + CMD=/usr/bin/neutron-dhcp-agent
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + ARGS=
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + sudo kolla_copy_cacerts
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + [[ ! -n '' ]]
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + . kolla_extend_start
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: Running command: '/usr/bin/neutron-dhcp-agent'
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + umask 0022
Nov 28 09:41:49 np0005538513.localdomain neutron_dhcp_agent[260873]: + exec /usr/bin/neutron-dhcp-agent
Nov 28 09:41:50 np0005538513.localdomain sudo[260994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rohxmsyypatuxslndpggbzinngtbnbgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322909.9416823-1421-209264318840967/AnsiballZ_systemd.py
Nov 28 09:41:50 np0005538513.localdomain sudo[260994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:50 np0005538513.localdomain python3.9[260997]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:41:50 np0005538513.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Nov 28 09:41:50 np0005538513.localdomain systemd[1]: tmp-crun.ZIc1A9.mount: Deactivated successfully.
Nov 28 09:41:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:50.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:50.693 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:50 np0005538513.localdomain systemd[1]: libpod-41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85.scope: Deactivated successfully.
Nov 28 09:41:50 np0005538513.localdomain systemd[1]: libpod-41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85.scope: Consumed 1.010s CPU time.
Nov 28 09:41:50 np0005538513.localdomain podman[261001]: 2025-11-28 09:41:50.703578996 +0000 UTC m=+0.119188020 container died 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, org.label-schema.vendor=CentOS)
Nov 28 09:41:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:50.716 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:41:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:50.716 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:41:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:50.716 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:41:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:50.717 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:41:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:50.717 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:41:50 np0005538513.localdomain podman[261001]: 2025-11-28 09:41:50.799208299 +0000 UTC m=+0.214817323 container cleanup 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent)
Nov 28 09:41:50 np0005538513.localdomain podman[261001]: neutron_dhcp_agent
Nov 28 09:41:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:41:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:41:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:41:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:41:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:41:50.819 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:41:50 np0005538513.localdomain podman[261065]: error opening file `/run/crun/41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85/status`: No such file or directory
Nov 28 09:41:50 np0005538513.localdomain podman[261035]: 2025-11-28 09:41:50.912798325 +0000 UTC m=+0.070798328 container cleanup 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Nov 28 09:41:50 np0005538513.localdomain podman[261035]: neutron_dhcp_agent
Nov 28 09:41:50 np0005538513.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Nov 28 09:41:50 np0005538513.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Nov 28 09:41:50 np0005538513.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Nov 28 09:41:51 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:41:51 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:51 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:51 np0005538513.localdomain podman[261067]: 2025-11-28 09:41:51.063743484 +0000 UTC m=+0.112593645 container init 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:41:51 np0005538513.localdomain podman[261067]: 2025-11-28 09:41:51.072659733 +0000 UTC m=+0.121509854 container start 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Nov 28 09:41:51 np0005538513.localdomain podman[261067]: neutron_dhcp_agent
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + sudo -E kolla_set_configs
Nov 28 09:41:51 np0005538513.localdomain systemd[1]: Started neutron_dhcp_agent container.
Nov 28 09:41:51 np0005538513.localdomain sudo[260994]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.143 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Validating config file
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Copying service configuration files
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Writing out command to execute
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/a99cc267a5c3ade03c88b3bb0a43299c9bb62825df6f4ca0c30c03cccfac55c1
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: ++ cat /run_command
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + CMD=/usr/bin/neutron-dhcp-agent
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + ARGS=
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + sudo kolla_copy_cacerts
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + [[ ! -n '' ]]
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + . kolla_extend_start
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: Running command: '/usr/bin/neutron-dhcp-agent'
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + umask 0022
Nov 28 09:41:51 np0005538513.localdomain neutron_dhcp_agent[261080]: + exec /usr/bin/neutron-dhcp-agent
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.213 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.213 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.384 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.385 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12201MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.385 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.385 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.459 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.459 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.460 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.492 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:41:51 np0005538513.localdomain sshd[254248]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:41:51 np0005538513.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Nov 28 09:41:51 np0005538513.localdomain systemd[1]: session-58.scope: Consumed 33.814s CPU time.
Nov 28 09:41:51 np0005538513.localdomain systemd-logind[764]: Session 58 logged out. Waiting for processes to exit.
Nov 28 09:41:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:41:51 np0005538513.localdomain systemd-logind[764]: Removed session 58.
Nov 28 09:41:51 np0005538513.localdomain systemd[1]: tmp-crun.h8bdlP.mount: Deactivated successfully.
Nov 28 09:41:51 np0005538513.localdomain podman[261114]: 2025-11-28 09:41:51.646525618 +0000 UTC m=+0.090058583 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:41:51 np0005538513.localdomain podman[261114]: 2025-11-28 09:41:51.657511195 +0000 UTC m=+0.101044200 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:41:51 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.949 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.955 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.972 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.975 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:41:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:51.975 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:41:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:41:52.344 261084 INFO neutron.common.config [-] Logging enabled!
Nov 28 09:41:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:41:52.344 261084 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Nov 28 09:41:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:41:52.704 261084 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Nov 28 09:41:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:52.734 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:52.976 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:41:53.592 261084 INFO neutron.agent.dhcp.agent [None req-bd797fc6-33fd-41fe-a146-138b235f669c - - - - - -] All active networks have been fetched through RPC.
Nov 28 09:41:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:41:53.592 261084 INFO neutron.agent.dhcp.agent [None req-bd797fc6-33fd-41fe-a146-138b235f669c - - - - - -] Synchronizing state complete
Nov 28 09:41:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:41:53.644 261084 INFO neutron.agent.dhcp.agent [None req-bd797fc6-33fd-41fe-a146-138b235f669c - - - - - -] DHCP agent started
Nov 28 09:41:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:53.677 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:41:54.258 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:41:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:41:54.259 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 09:41:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:41:54.261 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.293 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.744 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.744 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.745 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:41:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:54.745 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:41:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:41:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:41:54 np0005538513.localdomain podman[261161]: 2025-11-28 09:41:54.838270007 +0000 UTC m=+0.075929176 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 09:41:54 np0005538513.localdomain podman[261162]: 2025-11-28 09:41:54.894203522 +0000 UTC m=+0.127740607 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:41:54 np0005538513.localdomain podman[261162]: 2025-11-28 09:41:54.898406018 +0000 UTC m=+0.131943123 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:41:54 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:41:54 np0005538513.localdomain podman[261161]: 2025-11-28 09:41:54.950144857 +0000 UTC m=+0.187804036 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 09:41:54 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:41:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:55.695 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:41:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:41:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:56.791 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:41:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:56.818 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:41:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:56.819 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:41:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:56.819 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:56.820 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:56.820 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:56.820 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:41:56 np0005538513.localdomain podman[261205]: 2025-11-28 09:41:56.851286978 +0000 UTC m=+0.083880383 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 28 09:41:56 np0005538513.localdomain podman[261205]: 2025-11-28 09:41:56.860181647 +0000 UTC m=+0.092775032 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:41:56 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:41:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:41:57.738 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.669 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:00.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.685 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.685 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8397e059-e03f-4c2c-b93c-7efc7b229807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.670801', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d12023a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '7272ee69b6c3ad5790e25d857bf8f3f6cc3fb6d0bcaa1fb6cb6ab0b44bfb9c16'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.670801', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1215e0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '459e264602b02946342039ad59db6f28e22fcbafbc4631d6c0a4381332bd65ae'}]}, 'timestamp': '2025-11-28 09:42:00.686344', '_unique_id': '191fd58ed7b14f6abdb80306ca840bc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.688 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.692 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7940f4ba-5fd7-47f7-bb44-59f587d6da61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.689045', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1318be-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': 'a276e1fefc3092284babd1c473b142e802014933671d22ca806e0a9960554599'}]}, 'timestamp': '2025-11-28 09:42:00.692994', '_unique_id': 'e9df2e1c49904e73b2becdc2e2fd2a9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:00.725 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c25dc02-7029-4cc6-8760-2e6978bee3b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.695276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1ccc60-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '19bca1fb4f2f6a3b8fdabd527f6867aec17f712198346aa8c99f8bf82f090d93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.695276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1cdd18-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': 'dfc4e801fb6f2ed7745d687affcd094e865b1b0c94c65bd32c7f7489dbf31e90'}]}, 'timestamp': '2025-11-28 09:42:00.756967', '_unique_id': 'da07a98f340f4a07b990431baaa92efc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 89 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adcbcd3d-e864-43ca-949b-7446bb6d6d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 89, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.759214', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1d45be-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '12f318c3b9463c4b16a303c4dd4b2703d84ab1979855008e9c69e231d380b3f2'}]}, 'timestamp': '2025-11-28 09:42:00.759677', '_unique_id': '444249e46bb24d1b82a5ed885966ef20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b248eb55-a3dc-493d-afe1-a19c9eba1827', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.762177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1db92c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '9a3d065680b2c9102d520b2075b996404173d4d8cc76ef6f4b016f2eef1ba69d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.762177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1dc944-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '645de6d23ac77140be779fb1bc8039a0986d4af52387a37c2c6a4eadf8aec074'}]}, 'timestamp': '2025-11-28 09:42:00.763010', '_unique_id': '22bceb57404f4ac1881119ffc8ef9277'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba1bb1fe-174f-4e8f-add4-39db5d61820f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.765383', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1e36ea-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '37f2c2f172b0568edb7fff26cdcbd9d69ad80fd770c7d78df269aca6e8b89fe1'}]}, 'timestamp': '2025-11-28 09:42:00.765852', '_unique_id': '491b698a778f481ab38deec370663bc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae62639f-9bb6-43ae-af7c-9a49824ce051', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.768083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1ea0da-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '3e87c5c1b76b133f6b668ddf6ae1b4827d41c8c65d52e1aded3c79078d535598'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.768083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1eb16a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '25d7ed23ff8ba5d911fed983dbe68ddc66310d43056d62a27fd60cbd1c4f5dec'}]}, 'timestamp': '2025-11-28 09:42:00.768957', '_unique_id': '49706601d5554413b1cca1bda2482302'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.771 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6758c450-1ec0-4d51-a1d0-0ed94312bd40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.771405', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1f2276-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '9aad6023e07961ed1e05a3bddfe45581b3f733d616aa1632b8974e1fba53474f'}]}, 'timestamp': '2025-11-28 09:42:00.771912', '_unique_id': '2cca8b4ad6e847428515e5fb7dd1f7ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b30325f2-55ba-41dd-a08e-fbf478b8574e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.774089', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1f8a5e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '6cd83d7bfeca754f996321fe9539723382d4098c8944f3d02252cb2100d66dc6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.774089', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1f9a08-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '586db6e4d3271d181e4021788f149eeddee41f96535f7f0518691a16b17e34f9'}]}, 'timestamp': '2025-11-28 09:42:00.774909', '_unique_id': '2a29994f9fcc482f8a85a6541344e36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64cc9f67-46d6-4347-b9d4-d1f9f472db5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.777176', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d200312-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': 'a2df22ad6e6c948c653535a1a268bfd9e70a172e3eba4228880586f8d9cd74a9'}]}, 'timestamp': '2025-11-28 09:42:00.777627', '_unique_id': '149bc3c9a80549249689902f64043eb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 49930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21b4c679-f126-4515-b78d-f131268bbb0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49930000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:42:00.779756', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7d236d86-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.971296987, 'message_signature': '63d0ad2f14cdc76fde1ad58e9c7f56fb1539bfa45a8e0ab7576f5936f57f5445'}]}, 'timestamp': '2025-11-28 09:42:00.800128', '_unique_id': '502f7c8b5afc4f389ee2d2ab9b864b7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b91abf40-df2e-40e3-9785-ed2537c4af2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.802455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d23dec4-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '1f5b2327931a5ed5d67c3bbcda184724c6847584b3c8a8e2c1468c8eacf807a7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.802455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d23efb8-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '94c0db7d1b9e243c931870539d6e77a08309418cea8ce5a561e56472cd962e7c'}]}, 'timestamp': '2025-11-28 09:42:00.803317', '_unique_id': '60f00974a46741fc8632f0335d759007'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.805 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0801ef0-a08f-4721-866c-77ddf55bee09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.805468', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d245444-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': 'c05071d6702786a21acfce26e912183e9aa8354f506752f21f6f3de4f0f16134'}]}, 'timestamp': '2025-11-28 09:42:00.805918', '_unique_id': '4e4cf6ff60a64bb0884ae4bcc5bf2597'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.808 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.808 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '258ea80d-c571-4cd0-9b78-48efa0006e5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.807981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d24b754-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': 'eca16226cca8aefb6db81ba34beaf0aa136400d32fb766710725de53e110a3ca'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.807981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d24c71c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '075b1d9e137f1fc91a796d761bccb5e6583fb32cb2f18c9eae8dc8c28e639874'}]}, 'timestamp': '2025-11-28 09:42:00.808832', '_unique_id': '786a9e33d5e045979cf832bfc77aecfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.811 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18bfa783-aaaa-4d84-ad3c-4c0ae47aa724', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.811062', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d252ef0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '7604c64c510f31a6e92ebf85951f6a037eae55f3bd44cb897f25344068675bd7'}]}, 'timestamp': '2025-11-28 09:42:00.811520', '_unique_id': '28bbb5aa682f486cbae6f7d86874fd68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.814 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f32572f-b918-402b-be1b-5c7116282906', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.813797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d259ac0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': 'e9bff0c12ca938a667dd9907508ed7728c053746447e3cc097118ef3b14f49b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.813797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d25ac9a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '97d0010aad7cbc4f1cbe3c18902e26c67624e9738286584eabb7b6f422cbc507'}]}, 'timestamp': '2025-11-28 09:42:00.814713', '_unique_id': '67040096a6f143048b27e29cba94d3e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.816 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '971a4b5b-4e4e-4ca9-beaa-0602f4a2408b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:42:00.816888', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7d26170c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.971296987, 'message_signature': 'fe36d818741872586e848023535130a1086e8ad8b7fc677acc3604e930d05885'}]}, 'timestamp': '2025-11-28 09:42:00.817462', '_unique_id': '5bab9f13f0d34873981b6b300cdc74a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.819 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.819 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.819 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9441 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85cfd3aa-b732-46db-813e-74119f7f5860', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9441, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.819795', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d268764-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': 'b7fa53f9ef7963ba14cae7a6cb5247485f3d1c4f48aebd5680ad761b1d618703'}]}, 'timestamp': '2025-11-28 09:42:00.820387', '_unique_id': '0e3ad8da8cc94bf9bd33021015f6d9f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.822 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.822 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07886e67-231e-49dd-8abf-98ed5deea5de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.822801', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d26fe1a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '8557ff7eeb67763164e885aedf0d152eb40d6dda3548a930f699df87bc767518'}]}, 'timestamp': '2025-11-28 09:42:00.823382', '_unique_id': '8a232ed842984347b982adab06537e3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.825 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3037b332-8fcf-4c4f-9cfe-3728a863a9e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.825476', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d2761ac-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '01499f8464dc08e6c146e4e655092e02fb49227e182e825c28230ba22bed284f'}]}, 'timestamp': '2025-11-28 09:42:00.825926', '_unique_id': '2511ebb1bffa4f7ab3fbcb1b01de7943'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.827 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.828 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.828 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e943ed3b-6a6c-464a-96b0-75a34a704d31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.828001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d27c5ca-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': 'e45a48065a5fd2429393d25101914bade81280059035ac398066e454c6c7d509'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.828001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d27d574-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '54a318662de78279463810fd77e0d9584ee284c2191b132dba314091f7e49e74'}]}, 'timestamp': '2025-11-28 09:42:00.828858', '_unique_id': 'ffd126483f7a486ba521a608156f6c3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:42:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:42:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24392 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB25FD350000000001030307) 
Nov 28 09:42:02 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:02.772 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24393 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2601420000000001030307) 
Nov 28 09:42:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:42:03 np0005538513.localdomain podman[261226]: 2025-11-28 09:42:03.848982719 +0000 UTC m=+0.077224477 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:42:03 np0005538513.localdomain podman[261226]: 2025-11-28 09:42:03.862434366 +0000 UTC m=+0.090676164 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:42:03 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:42:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41793 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2603820000000001030307) 
Nov 28 09:42:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24394 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2609430000000001030307) 
Nov 28 09:42:05 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:05.729 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40562 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB260D820000000001030307) 
Nov 28 09:42:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:42:06 np0005538513.localdomain podman[261250]: 2025-11-28 09:42:06.837494262 +0000 UTC m=+0.077079574 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:42:06 np0005538513.localdomain podman[261250]: 2025-11-28 09:42:06.847897889 +0000 UTC m=+0.087483211 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:42:06 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:42:07 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:07.775 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24395 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2619020000000001030307) 
Nov 28 09:42:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:42:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:42:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:42:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:42:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:42:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1"
Nov 28 09:42:10 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:10.760 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:12 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:12.815 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:42:15 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:15.764 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:15 np0005538513.localdomain podman[261270]: 2025-11-28 09:42:15.845633973 +0000 UTC m=+0.079104339 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Nov 28 09:42:15 np0005538513.localdomain podman[261270]: 2025-11-28 09:42:15.86032962 +0000 UTC m=+0.093800026 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:42:15 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:42:17 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:17.817 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24396 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2639820000000001030307) 
Nov 28 09:42:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:42:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:42:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:42:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:42:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:42:18 np0005538513.localdomain sudo[261290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:42:18 np0005538513.localdomain sudo[261290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:18 np0005538513.localdomain sudo[261290]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:18 np0005538513.localdomain sudo[261308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:42:18 np0005538513.localdomain sudo[261308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:19 np0005538513.localdomain sudo[261308]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:19 np0005538513.localdomain sudo[261347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:42:19 np0005538513.localdomain sudo[261347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:19 np0005538513.localdomain sudo[261347]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:19 np0005538513.localdomain sudo[261365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:42:19 np0005538513.localdomain sudo[261365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:19 np0005538513.localdomain sudo[261365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:20 np0005538513.localdomain sudo[261416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:42:20 np0005538513.localdomain sudo[261416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:20 np0005538513.localdomain sudo[261416]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:20 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:20.794 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:42:21 np0005538513.localdomain podman[261434]: 2025-11-28 09:42:21.843291626 +0000 UTC m=+0.081878319 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:42:21 np0005538513.localdomain podman[261434]: 2025-11-28 09:42:21.856979249 +0000 UTC m=+0.095565922 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:42:21 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:42:22 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:22.865 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:24 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:42:24Z|00048|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Nov 28 09:42:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:42:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:42:25 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:25.797 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:25 np0005538513.localdomain podman[261456]: 2025-11-28 09:42:25.851377438 +0000 UTC m=+0.084276366 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:42:25 np0005538513.localdomain podman[261457]: 2025-11-28 09:42:25.907287673 +0000 UTC m=+0.135935582 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:42:25 np0005538513.localdomain podman[261457]: 2025-11-28 09:42:25.913592697 +0000 UTC m=+0.142240586 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 09:42:25 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:42:25 np0005538513.localdomain podman[261456]: 2025-11-28 09:42:25.963996343 +0000 UTC m=+0.196895241 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 09:42:25 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:42:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:42:27 np0005538513.localdomain podman[261498]: 2025-11-28 09:42:27.842013384 +0000 UTC m=+0.079830542 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:42:27 np0005538513.localdomain podman[261498]: 2025-11-28 09:42:27.857487466 +0000 UTC m=+0.095304664 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:42:27 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:27.867 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:27 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:42:30 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:30.829 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46571 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2672650000000001030307) 
Nov 28 09:42:32 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:32.895 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:33 np0005538513.localdomain sshd[261517]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:42:33 np0005538513.localdomain sshd[261517]: Accepted publickey for zuul from 192.168.122.30 port 50052 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:42:33 np0005538513.localdomain systemd-logind[764]: New session 59 of user zuul.
Nov 28 09:42:33 np0005538513.localdomain systemd[1]: Started Session 59 of User zuul.
Nov 28 09:42:33 np0005538513.localdomain sshd[261517]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:42:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46572 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2676820000000001030307) 
Nov 28 09:42:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24397 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2679820000000001030307) 
Nov 28 09:42:34 np0005538513.localdomain python3.9[261628]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:42:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:42:34 np0005538513.localdomain podman[261650]: 2025-11-28 09:42:34.854173122 +0000 UTC m=+0.090893271 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:42:34 np0005538513.localdomain podman[261650]: 2025-11-28 09:42:34.866343397 +0000 UTC m=+0.103063526 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:42:34 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:42:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46573 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB267E820000000001030307) 
Nov 28 09:42:35 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:35.831 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:36 np0005538513.localdomain python3.9[261764]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:42:36 np0005538513.localdomain network[261781]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:42:36 np0005538513.localdomain network[261782]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:42:36 np0005538513.localdomain network[261783]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:42:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41794 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2681820000000001030307) 
Nov 28 09:42:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:42:36 np0005538513.localdomain systemd[1]: tmp-crun.HffRkA.mount: Deactivated successfully.
Nov 28 09:42:36 np0005538513.localdomain podman[261792]: 2025-11-28 09:42:36.973251217 +0000 UTC m=+0.086735866 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:42:36 np0005538513.localdomain podman[261792]: 2025-11-28 09:42:36.988413199 +0000 UTC m=+0.101897848 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:42:37 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:42:37 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:42:37 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:37.898 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46574 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB268E420000000001030307) 
Nov 28 09:42:39 np0005538513.localdomain sudo[262034]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmxsrzueobisjcnbdsdnciovosvwmpgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322959.5947607-101-181540784935521/AnsiballZ_setup.py
Nov 28 09:42:39 np0005538513.localdomain sudo[262034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:42:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:42:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:42:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:42:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:42:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1"
Nov 28 09:42:40 np0005538513.localdomain python3.9[262036]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:42:40 np0005538513.localdomain sudo[262034]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:40 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:40.872 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:40 np0005538513.localdomain sudo[262097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhhjbhkortjerzgboduvlqsiadyzcwor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322959.5947607-101-181540784935521/AnsiballZ_dnf.py
Nov 28 09:42:40 np0005538513.localdomain sudo[262097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:41 np0005538513.localdomain python3.9[262099]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:42:42 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:42.938 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:44 np0005538513.localdomain sudo[262097]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:44 np0005538513.localdomain sudo[262209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roralhunblqrcmhepnftyglaqmviehot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322964.4825795-137-92796069846123/AnsiballZ_stat.py
Nov 28 09:42:44 np0005538513.localdomain sudo[262209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:45 np0005538513.localdomain python3.9[262211]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:42:45 np0005538513.localdomain sudo[262209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:45.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:45 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:45.874 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:45 np0005538513.localdomain sudo[262319]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amlbugsbonigmpysqyqgyypdfxuljxol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322965.4933665-167-166885250696011/AnsiballZ_command.py
Nov 28 09:42:45 np0005538513.localdomain sudo[262319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:42:46 np0005538513.localdomain podman[262322]: 2025-11-28 09:42:46.052212502 +0000 UTC m=+0.087073162 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:42:46 np0005538513.localdomain podman[262322]: 2025-11-28 09:42:46.066933694 +0000 UTC m=+0.101794374 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:42:46 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:42:46 np0005538513.localdomain python3.9[262321]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:42:46 np0005538513.localdomain sudo[262319]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:46.709 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:46 np0005538513.localdomain sudo[262450]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnacudmnbercxtapsfuvuraafxtzqsce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322966.5657585-197-74358008122631/AnsiballZ_stat.py
Nov 28 09:42:46 np0005538513.localdomain sudo[262450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:47 np0005538513.localdomain python3.9[262452]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:42:47 np0005538513.localdomain sudo[262450]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:47.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:47.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:42:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:47.710 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:42:47 np0005538513.localdomain sudo[262562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxdqksgrsvqfkrioocfvcpyzjmeknqnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322967.4813786-230-40311809951232/AnsiballZ_lineinfile.py
Nov 28 09:42:47 np0005538513.localdomain sudo[262562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:47.940 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:48 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46575 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26AF820000000001030307) 
Nov 28 09:42:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:42:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:42:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:42:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:42:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:42:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:42:48 np0005538513.localdomain python3.9[262564]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:42:48 np0005538513.localdomain sudo[262562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:49 np0005538513.localdomain sudo[262672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsrhatlranaafqciedrudrqgjomensni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322968.9172535-257-21984604567004/AnsiballZ_systemd_service.py
Nov 28 09:42:49 np0005538513.localdomain sudo[262672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:50 np0005538513.localdomain python3.9[262674]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:42:50 np0005538513.localdomain sudo[262672]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:42:50.819 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:42:50.820 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:42:50.821 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:50 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:50.906 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:51 np0005538513.localdomain sudo[262784]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffumdzdsrlvaozafbeshcuubnxjisenu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322970.8072188-281-111572745089060/AnsiballZ_systemd_service.py
Nov 28 09:42:51 np0005538513.localdomain sudo[262784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:51 np0005538513.localdomain python3.9[262786]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:42:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:51.711 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:51.739 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:51.739 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:51.739 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:51.740 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:42:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:51.740 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.203 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.267 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.267 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.428 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.429 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12074MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.429 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.429 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:42:52 np0005538513.localdomain sudo[262784]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:52 np0005538513.localdomain podman[262811]: 2025-11-28 09:42:52.550229459 +0000 UTC m=+0.078423179 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:42:52 np0005538513.localdomain podman[262811]: 2025-11-28 09:42:52.557771125 +0000 UTC m=+0.085964815 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:42:52 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.749 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.750 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.750 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.833 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.852 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.852 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.864 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.886 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_F16C,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.979 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:52.991 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:42:53 np0005538513.localdomain sudo[262960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykclfraafhpxkaqhmuvzxgcviuzbgskt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322973.0443933-314-104942128649510/AnsiballZ_service_facts.py
Nov 28 09:42:53 np0005538513.localdomain sudo[262960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.445 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.451 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.471 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.473 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.474 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:53 np0005538513.localdomain python3.9[262962]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:42:53 np0005538513.localdomain network[262981]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:42:53 np0005538513.localdomain network[262982]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:42:53 np0005538513.localdomain network[262983]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.880 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.881 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.907 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.907 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.931 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.932 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.933 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:53.984 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:54 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.707 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.707 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.707 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.908 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.909 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.910 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.910 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:42:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:55.912 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:42:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:42:56 np0005538513.localdomain podman[263089]: 2025-11-28 09:42:56.092688567 +0000 UTC m=+0.087720364 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:42:56 np0005538513.localdomain podman[263078]: 2025-11-28 09:42:56.0659089 +0000 UTC m=+0.096297785 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:42:56 np0005538513.localdomain podman[263089]: 2025-11-28 09:42:56.132681267 +0000 UTC m=+0.127713124 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 09:42:56 np0005538513.localdomain podman[263078]: 2025-11-28 09:42:56.146765018 +0000 UTC m=+0.177153913 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:42:56 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:42:56 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:42:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:56.377 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:42:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:56.399 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:42:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:56.399 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:42:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:56.400 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:56.400 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:56 np0005538513.localdomain sudo[262960]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:56.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:56.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:42:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:57.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:57.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:42:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:42:57.980 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:42:58 np0005538513.localdomain sudo[263259]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mljbdgnssydaaahzvuszboypwxwooolh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322978.0008216-344-119745500937073/AnsiballZ_file.py
Nov 28 09:42:58 np0005538513.localdomain sudo[263259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:42:58 np0005538513.localdomain systemd[1]: tmp-crun.CFCK2O.mount: Deactivated successfully.
Nov 28 09:42:58 np0005538513.localdomain podman[263261]: 2025-11-28 09:42:58.542715681 +0000 UTC m=+0.082145621 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Nov 28 09:42:58 np0005538513.localdomain podman[263261]: 2025-11-28 09:42:58.557299618 +0000 UTC m=+0.096729608 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:42:58 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:42:58 np0005538513.localdomain python3.9[263262]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:42:58 np0005538513.localdomain sudo[263259]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:59 np0005538513.localdomain sudo[263388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcnyhwxrxrizcopjxsfopfvorahibuvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322978.8315308-368-90908803345041/AnsiballZ_modprobe.py
Nov 28 09:42:59 np0005538513.localdomain sudo[263388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:59 np0005538513.localdomain python3.9[263390]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 09:42:59 np0005538513.localdomain sudo[263388]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:59 np0005538513.localdomain sudo[263498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dojmungudokujgcijefazyadfmkwefnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322979.6867485-392-23073863900949/AnsiballZ_stat.py
Nov 28 09:42:59 np0005538513.localdomain sudo[263498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:00 np0005538513.localdomain python3.9[263500]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:00 np0005538513.localdomain sudo[263498]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:00 np0005538513.localdomain sudo[263555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrqtshytjyjabqzmsmqopltswdevrckh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322979.6867485-392-23073863900949/AnsiballZ_file.py
Nov 28 09:43:00 np0005538513.localdomain sudo[263555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:00 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:00.696 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:00 np0005538513.localdomain python3.9[263557]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:00 np0005538513.localdomain sudo[263555]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:00 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:00.946 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:01 np0005538513.localdomain sudo[263665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhpgapbowqdyarjxtgaeplbqdaasdtbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322981.4758058-431-219086084000542/AnsiballZ_lineinfile.py
Nov 28 09:43:01 np0005538513.localdomain sudo[263665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:01 np0005538513.localdomain python3.9[263667]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:01 np0005538513.localdomain sudo[263665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62567 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26E7940000000001030307) 
Nov 28 09:43:02 np0005538513.localdomain sudo[263775]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmykmtpczqgoqovqheegorekwjnykvqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322982.3088815-458-238279991862319/AnsiballZ_file.py
Nov 28 09:43:02 np0005538513.localdomain sudo[263775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:02 np0005538513.localdomain python3.9[263777]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:02 np0005538513.localdomain sudo[263775]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:02 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:02.993 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62568 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26EB830000000001030307) 
Nov 28 09:43:04 np0005538513.localdomain sudo[263885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgyagorkzfuafdvrtcnimdnjcytpljbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322983.7850642-485-126935787383938/AnsiballZ_stat.py
Nov 28 09:43:04 np0005538513.localdomain sudo[263885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:04 np0005538513.localdomain python3.9[263887]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:04 np0005538513.localdomain sudo[263885]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:04 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46576 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26EF830000000001030307) 
Nov 28 09:43:04 np0005538513.localdomain sudo[263997]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipvrcgynoppwuaoaadboilqkotsemlnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322984.58483-512-168774243439613/AnsiballZ_stat.py
Nov 28 09:43:04 np0005538513.localdomain sudo[263997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:05 np0005538513.localdomain python3.9[263999]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:05 np0005538513.localdomain sudo[263997]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:43:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62569 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26F3820000000001030307) 
Nov 28 09:43:05 np0005538513.localdomain systemd[1]: tmp-crun.eEca3y.mount: Deactivated successfully.
Nov 28 09:43:05 np0005538513.localdomain podman[264057]: 2025-11-28 09:43:05.49427505 +0000 UTC m=+0.085378507 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:43:05 np0005538513.localdomain podman[264057]: 2025-11-28 09:43:05.532346157 +0000 UTC m=+0.123449574 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:43:05 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:43:05 np0005538513.localdomain sudo[264132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buntzyoxjvricvjjcyyswfqhkzypfodw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322985.3306599-539-251740005755292/AnsiballZ_command.py
Nov 28 09:43:05 np0005538513.localdomain sudo[264132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:05 np0005538513.localdomain python3.9[264134]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:43:05 np0005538513.localdomain sudo[264132]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:05 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:05.949 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24398 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26F7820000000001030307) 
Nov 28 09:43:06 np0005538513.localdomain sudo[264243]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sccnuinkzybalxqyohdeklfbwdfxbmew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322986.2816439-569-226003199356223/AnsiballZ_replace.py
Nov 28 09:43:06 np0005538513.localdomain sudo[264243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:06 np0005538513.localdomain python3.9[264245]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:06 np0005538513.localdomain sudo[264243]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:07 np0005538513.localdomain sudo[264353]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikirorkqkuvgqmbxqjiuqccfdfwppqug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322987.2775779-596-148046761825236/AnsiballZ_lineinfile.py
Nov 28 09:43:07 np0005538513.localdomain sudo[264353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:43:07 np0005538513.localdomain systemd[1]: tmp-crun.lZmpIu.mount: Deactivated successfully.
Nov 28 09:43:07 np0005538513.localdomain podman[264356]: 2025-11-28 09:43:07.663398434 +0000 UTC m=+0.091735525 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 09:43:07 np0005538513.localdomain podman[264356]: 2025-11-28 09:43:07.675293443 +0000 UTC m=+0.103630534 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:43:07 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:43:07 np0005538513.localdomain python3.9[264355]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:07 np0005538513.localdomain sudo[264353]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:07 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:07.996 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:08 np0005538513.localdomain sudo[264482]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzhhljgaihpurhfeykkuubhqpbgterzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322987.8852096-596-149912959025911/AnsiballZ_lineinfile.py
Nov 28 09:43:08 np0005538513.localdomain sudo[264482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:08 np0005538513.localdomain python3.9[264484]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:08 np0005538513.localdomain sudo[264482]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:08 np0005538513.localdomain sudo[264592]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftrojfeeyqlossgvnczqjfgrxvyksjsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322988.4912899-596-72988725478114/AnsiballZ_lineinfile.py
Nov 28 09:43:08 np0005538513.localdomain sudo[264592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:08 np0005538513.localdomain python3.9[264594]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:08 np0005538513.localdomain sudo[264592]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:09 np0005538513.localdomain sudo[264702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcilyrziecwpjvtdjezlhmhihtdjzzkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322989.1075819-596-159571079133472/AnsiballZ_lineinfile.py
Nov 28 09:43:09 np0005538513.localdomain sudo[264702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62570 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2703420000000001030307) 
Nov 28 09:43:09 np0005538513.localdomain python3.9[264704]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:09 np0005538513.localdomain sudo[264702]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:43:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:43:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:43:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:43:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:43:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1"
Nov 28 09:43:10 np0005538513.localdomain sudo[264812]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxltxxmykpknaxfdfezsjuofrxrttged ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322989.864071-683-99250449026789/AnsiballZ_stat.py
Nov 28 09:43:10 np0005538513.localdomain sudo[264812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:10 np0005538513.localdomain python3.9[264814]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:10 np0005538513.localdomain sudo[264812]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:10 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:10.986 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:11 np0005538513.localdomain sudo[264924]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnnxbytitwtctsnmezfniyxdowlgrbix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322990.9053006-713-266865630645206/AnsiballZ_file.py
Nov 28 09:43:11 np0005538513.localdomain sudo[264924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:11 np0005538513.localdomain python3.9[264926]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:11 np0005538513.localdomain sudo[264924]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:12 np0005538513.localdomain sudo[265034]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krqvlezdeuweplbiwfvjspxsljxuqtxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322991.8491879-737-96886765296151/AnsiballZ_stat.py
Nov 28 09:43:12 np0005538513.localdomain sudo[265034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:12 np0005538513.localdomain python3.9[265036]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:12 np0005538513.localdomain sudo[265034]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:13 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:13.039 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:13 np0005538513.localdomain sudo[265091]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akqjbyuqxwszsubouxzaruitpnodaiva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322991.8491879-737-96886765296151/AnsiballZ_file.py
Nov 28 09:43:13 np0005538513.localdomain sudo[265091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:13 np0005538513.localdomain python3.9[265093]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:13 np0005538513.localdomain sudo[265091]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:13 np0005538513.localdomain sudo[265201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wufuunetepdarplipusfvhdzqshsjgjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322993.5957298-737-86713732690825/AnsiballZ_stat.py
Nov 28 09:43:13 np0005538513.localdomain sudo[265201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:14 np0005538513.localdomain python3.9[265203]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:14 np0005538513.localdomain sudo[265201]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:14 np0005538513.localdomain sudo[265258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hykirzpmvbnostixyfdxqnurhhghqcid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322993.5957298-737-86713732690825/AnsiballZ_file.py
Nov 28 09:43:14 np0005538513.localdomain sudo[265258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:14 np0005538513.localdomain python3.9[265260]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:14 np0005538513.localdomain sudo[265258]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:15 np0005538513.localdomain sudo[265368]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zohumrsksdakcbennwfmejgpaarnyokl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322995.3693311-806-147044513728088/AnsiballZ_file.py
Nov 28 09:43:15 np0005538513.localdomain sudo[265368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:15 np0005538513.localdomain python3.9[265370]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:15 np0005538513.localdomain sudo[265368]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:15 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:15.988 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:16 np0005538513.localdomain sudo[265478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aplfpefxnsauahttrjfjxyhyoadxsayt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322996.0229683-830-34273178079280/AnsiballZ_stat.py
Nov 28 09:43:16 np0005538513.localdomain sudo[265478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:43:16 np0005538513.localdomain podman[265481]: 2025-11-28 09:43:16.372088176 +0000 UTC m=+0.080080622 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public)
Nov 28 09:43:16 np0005538513.localdomain podman[265481]: 2025-11-28 09:43:16.388351869 +0000 UTC m=+0.096344255 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., version=9.6, config_id=edpm)
Nov 28 09:43:16 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:43:16 np0005538513.localdomain python3.9[265480]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:16 np0005538513.localdomain sudo[265478]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:16 np0005538513.localdomain sudo[265554]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqucvtnpsccnbpregbuhnulisxumncja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322996.0229683-830-34273178079280/AnsiballZ_file.py
Nov 28 09:43:16 np0005538513.localdomain sudo[265554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:16 np0005538513.localdomain python3.9[265556]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:16 np0005538513.localdomain sudo[265554]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:17 np0005538513.localdomain sudo[265664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvwbrvustwppyghovhpklqmldbigptoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322997.273792-866-177139523514473/AnsiballZ_stat.py
Nov 28 09:43:17 np0005538513.localdomain sudo[265664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:17 np0005538513.localdomain python3.9[265666]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62571 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2723830000000001030307) 
Nov 28 09:43:17 np0005538513.localdomain sudo[265664]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:17 np0005538513.localdomain sudo[265721]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oczbjmalhncmlldbaccjlmkhyosdmkjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322997.273792-866-177139523514473/AnsiballZ_file.py
Nov 28 09:43:17 np0005538513.localdomain sudo[265721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:18 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:18.041 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:43:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:43:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:43:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:43:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:43:18 np0005538513.localdomain python3.9[265723]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:18 np0005538513.localdomain sudo[265721]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:18 np0005538513.localdomain sudo[265831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhtdsrfewiolfzegeudqqhlmsmmshmqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322998.3257368-902-78376922643749/AnsiballZ_systemd.py
Nov 28 09:43:18 np0005538513.localdomain sudo[265831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:19 np0005538513.localdomain python3.9[265833]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:43:19 np0005538513.localdomain systemd-sysv-generator[265859]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:43:19 np0005538513.localdomain systemd-rc-local-generator[265855]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538513.localdomain sudo[265831]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:20 np0005538513.localdomain sudo[265979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvqglvcrezkhaeibllxibrfzgjrpvppm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322999.8883529-926-94258705245852/AnsiballZ_stat.py
Nov 28 09:43:20 np0005538513.localdomain sudo[265979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:20 np0005538513.localdomain python3.9[265981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:20 np0005538513.localdomain sudo[265979]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:20 np0005538513.localdomain sudo[266036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atqqnsajgvgzwfcwscwhnqdnfuymatxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322999.8883529-926-94258705245852/AnsiballZ_file.py
Nov 28 09:43:20 np0005538513.localdomain sudo[266036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:20 np0005538513.localdomain python3.9[266038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:20 np0005538513.localdomain sudo[266036]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:20 np0005538513.localdomain sudo[266039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:43:20 np0005538513.localdomain sudo[266039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:43:20 np0005538513.localdomain sudo[266039]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:20 np0005538513.localdomain sudo[266073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:43:20 np0005538513.localdomain sudo[266073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:43:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:21.019 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:21 np0005538513.localdomain sudo[266203]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dekqdvyexiomwdidzwtxhzpxkljwzcpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323001.1490276-962-56740192103895/AnsiballZ_stat.py
Nov 28 09:43:21 np0005538513.localdomain sudo[266203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:21 np0005538513.localdomain sudo[266073]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:21 np0005538513.localdomain python3.9[266205]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:21 np0005538513.localdomain sudo[266203]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:21 np0005538513.localdomain sudo[266272]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivsdleoifrvyjkfilzrfhvamqsouvhwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323001.1490276-962-56740192103895/AnsiballZ_file.py
Nov 28 09:43:21 np0005538513.localdomain sudo[266272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:22 np0005538513.localdomain python3.9[266274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:22 np0005538513.localdomain sudo[266272]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:22 np0005538513.localdomain sudo[266292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:43:22 np0005538513.localdomain sudo[266292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:43:22 np0005538513.localdomain sudo[266292]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:43:22 np0005538513.localdomain sudo[266406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdkreoyqcmeofpbxyxqbqppaomkdvnir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323002.3515353-998-218248705016953/AnsiballZ_systemd.py
Nov 28 09:43:22 np0005538513.localdomain sudo[266406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:22 np0005538513.localdomain systemd[1]: tmp-crun.E8SDXQ.mount: Deactivated successfully.
Nov 28 09:43:22 np0005538513.localdomain podman[266382]: 2025-11-28 09:43:22.863505367 +0000 UTC m=+0.087513897 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:43:22 np0005538513.localdomain podman[266382]: 2025-11-28 09:43:22.874410494 +0000 UTC m=+0.098419024 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:43:22 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:43:23 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:23.081 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:23 np0005538513.localdomain python3.9[266414]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:43:23 np0005538513.localdomain systemd-sysv-generator[266454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:43:23 np0005538513.localdomain systemd-rc-local-generator[266451]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:43:23 np0005538513.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:43:23 np0005538513.localdomain sudo[266406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:24 np0005538513.localdomain sudo[266575]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncptaddyjtrkywfpdmbsnvudturfghjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323004.039371-1028-3630101820888/AnsiballZ_file.py
Nov 28 09:43:24 np0005538513.localdomain sudo[266575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:24 np0005538513.localdomain python3.9[266577]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:24 np0005538513.localdomain sudo[266575]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:24 np0005538513.localdomain sudo[266685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfgovgvkmtcmukxpzjxzmwardvswflkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323004.7413173-1052-145717439503005/AnsiballZ_stat.py
Nov 28 09:43:24 np0005538513.localdomain sudo[266685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:25 np0005538513.localdomain python3.9[266687]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:25 np0005538513.localdomain sudo[266685]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:25 np0005538513.localdomain sudo[266742]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nayesrdbnahmqyqihgoywhtasavoosjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323004.7413173-1052-145717439503005/AnsiballZ_file.py
Nov 28 09:43:25 np0005538513.localdomain sudo[266742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:25 np0005538513.localdomain python3.9[266744]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:25 np0005538513.localdomain sudo[266742]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:26.021 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:26 np0005538513.localdomain sudo[266852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zulnwvaryrkucdwbjteuibljajhdypqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323006.1695735-1094-160537136859826/AnsiballZ_file.py
Nov 28 09:43:26 np0005538513.localdomain sudo[266852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:43:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:43:26 np0005538513.localdomain systemd[1]: tmp-crun.UdwR2W.mount: Deactivated successfully.
Nov 28 09:43:26 np0005538513.localdomain podman[266855]: 2025-11-28 09:43:26.571046561 +0000 UTC m=+0.131262480 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:43:26 np0005538513.localdomain podman[266856]: 2025-11-28 09:43:26.542562458 +0000 UTC m=+0.102182838 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:43:26 np0005538513.localdomain podman[266856]: 2025-11-28 09:43:26.623402736 +0000 UTC m=+0.183023086 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:43:26 np0005538513.localdomain podman[266855]: 2025-11-28 09:43:26.63331345 +0000 UTC m=+0.193529369 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:43:26 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:43:26 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:43:26 np0005538513.localdomain python3.9[266854]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:26 np0005538513.localdomain sudo[266852]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:27 np0005538513.localdomain sudo[267002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngbxiglefnlekqjjswllnhbrdlbkrqkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323006.9410307-1118-95527621666560/AnsiballZ_stat.py
Nov 28 09:43:27 np0005538513.localdomain sudo[267002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:27 np0005538513.localdomain python3.9[267004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:27 np0005538513.localdomain sudo[267002]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:27 np0005538513.localdomain sudo[267059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjfojxcebzyqdijreeqdcxxecseqyeje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323006.9410307-1118-95527621666560/AnsiballZ_file.py
Nov 28 09:43:27 np0005538513.localdomain sudo[267059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:27 np0005538513.localdomain python3.9[267061]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.3pc5sw5c recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:27 np0005538513.localdomain sudo[267059]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:28 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:28.082 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:28 np0005538513.localdomain sudo[267169]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikemjsotukzaqinmbsoajxwcvykxaekc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323008.3153212-1154-239564973137087/AnsiballZ_file.py
Nov 28 09:43:28 np0005538513.localdomain sudo[267169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:43:28 np0005538513.localdomain python3.9[267171]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:28 np0005538513.localdomain sudo[267169]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:28 np0005538513.localdomain podman[267172]: 2025-11-28 09:43:28.859333407 +0000 UTC m=+0.094830687 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:43:28 np0005538513.localdomain podman[267172]: 2025-11-28 09:43:28.868216268 +0000 UTC m=+0.103713538 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 09:43:28 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:43:29 np0005538513.localdomain sudo[267298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giddnwsbhzklzajgpxethkansiqqluju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323009.0278025-1178-229027007829163/AnsiballZ_stat.py
Nov 28 09:43:29 np0005538513.localdomain sudo[267298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:29 np0005538513.localdomain sudo[267298]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:29 np0005538513.localdomain sudo[267355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktoixmgzumlwkxztlnwlsinfucxsolja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323009.0278025-1178-229027007829163/AnsiballZ_file.py
Nov 28 09:43:29 np0005538513.localdomain sudo[267355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:29 np0005538513.localdomain sudo[267355]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:30 np0005538513.localdomain sudo[267465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynkkgmymeefffijfnvgxqfoogxnliuyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323010.4221213-1220-267347738382754/AnsiballZ_container_config_data.py
Nov 28 09:43:30 np0005538513.localdomain sudo[267465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:31 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:31.046 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:31 np0005538513.localdomain python3.9[267467]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 28 09:43:31 np0005538513.localdomain sudo[267465]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5634 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB275CC50000000001030307) 
Nov 28 09:43:32 np0005538513.localdomain sudo[267575]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jasxdjvwymexxcotielywowzznlizvnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323012.1518004-1247-121070812399828/AnsiballZ_container_config_hash.py
Nov 28 09:43:32 np0005538513.localdomain sudo[267575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:32 np0005538513.localdomain python3.9[267577]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:43:32 np0005538513.localdomain sudo[267575]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:33 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:33.117 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5635 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2760C20000000001030307) 
Nov 28 09:43:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62572 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2763820000000001030307) 
Nov 28 09:43:34 np0005538513.localdomain sudo[267685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwiwamfelqmpddgdjluxtgzzlntcmgvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323013.128872-1274-225744509334033/AnsiballZ_podman_container_info.py
Nov 28 09:43:34 np0005538513.localdomain sudo[267685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:34 np0005538513.localdomain python3.9[267687]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:43:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5636 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2768C20000000001030307) 
Nov 28 09:43:35 np0005538513.localdomain sudo[267685]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:43:35 np0005538513.localdomain systemd[1]: tmp-crun.Ob0Jyl.mount: Deactivated successfully.
Nov 28 09:43:35 np0005538513.localdomain podman[267731]: 2025-11-28 09:43:35.857541205 +0000 UTC m=+0.089245304 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:43:35 np0005538513.localdomain podman[267731]: 2025-11-28 09:43:35.896580833 +0000 UTC m=+0.128284992 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:43:35 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:43:36 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:36.049 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46577 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB276D830000000001030307) 
Nov 28 09:43:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:43:37 np0005538513.localdomain podman[267755]: 2025-11-28 09:43:37.84986616 +0000 UTC m=+0.086187724 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:43:37 np0005538513.localdomain podman[267755]: 2025-11-28 09:43:37.89046252 +0000 UTC m=+0.126784064 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:43:37 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:43:38 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:38.119 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:38 np0005538513.localdomain sudo[267863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idhjsjsxnnpgiadbnrnkysctsoqyywzw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764323018.5049498-1313-54638361513359/AnsiballZ_edpm_container_manage.py
Nov 28 09:43:38 np0005538513.localdomain sudo[267863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:39 np0005538513.localdomain python3[267865]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:43:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5637 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2778820000000001030307) 
Nov 28 09:43:39 np0005538513.localdomain python3[267865]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f",
                                                                    "Digest": "sha256:6296d2d95faaeb90443ee98443b39aa81b5152414f9542335d72711bb15fefdd",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6296d2d95faaeb90443ee98443b39aa81b5152414f9542335d72711bb15fefdd"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:12:42.268223466Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 249482220,
                                                                    "VirtualSize": 249482220,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:135e1f5eea0bd6ac73fc43c122f58d5ed97cb8a56365c4a958c72d470055986b"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:12:01.186918094Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:12:42.000584504Z",
                                                                              "created_by": "/bin/sh -c dnf -y install device-mapper-multipath iscsi-initiator-utils && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:12:43.229019379Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:43:39 np0005538513.localdomain sudo[267863]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:43:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:43:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:43:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:43:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:43:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17219 "" "Go-http-client/1.1"
Nov 28 09:43:40 np0005538513.localdomain sudo[268035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbsdrchnwfyeoybeztwuddaxgxxcopra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323019.8463688-1337-70530271523313/AnsiballZ_stat.py
Nov 28 09:43:40 np0005538513.localdomain sudo[268035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:40 np0005538513.localdomain python3.9[268037]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:40 np0005538513.localdomain sudo[268035]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:40 np0005538513.localdomain sudo[268147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbsmaifxrzzroidwjpgkltciubtjklyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323020.7068624-1364-137032216357290/AnsiballZ_file.py
Nov 28 09:43:40 np0005538513.localdomain sudo[268147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:41 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:41.084 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:41 np0005538513.localdomain python3.9[268149]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:41 np0005538513.localdomain sudo[268147]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:41 np0005538513.localdomain sudo[268202]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glawrbiznqrcldurxtcrbduwhorhbepp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323020.7068624-1364-137032216357290/AnsiballZ_stat.py
Nov 28 09:43:41 np0005538513.localdomain sudo[268202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:41 np0005538513.localdomain python3.9[268204]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:41 np0005538513.localdomain sudo[268202]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:42 np0005538513.localdomain sudo[268311]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svytowbopdfnblgwqmdyqcobwdwvjkuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323021.6238728-1364-218007772854828/AnsiballZ_copy.py
Nov 28 09:43:42 np0005538513.localdomain sudo[268311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:42 np0005538513.localdomain python3.9[268313]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764323021.6238728-1364-218007772854828/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:42 np0005538513.localdomain sudo[268311]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:43 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:43.155 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:43 np0005538513.localdomain sudo[268366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znspssevmegntcfjyzvroaykqwbguygd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323021.6238728-1364-218007772854828/AnsiballZ_systemd.py
Nov 28 09:43:43 np0005538513.localdomain sudo[268366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:43 np0005538513.localdomain python3.9[268368]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:43:43 np0005538513.localdomain sudo[268366]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:44 np0005538513.localdomain python3.9[268478]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:45 np0005538513.localdomain sudo[268586]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aygbdawbbkpaigergaciooxdbhybfdrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323025.5574303-1466-206938888352961/AnsiballZ_file.py
Nov 28 09:43:45 np0005538513.localdomain sudo[268586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:46 np0005538513.localdomain python3.9[268588]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:46 np0005538513.localdomain sudo[268586]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:46.087 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:43:46 np0005538513.localdomain podman[268606]: 2025-11-28 09:43:46.844170222 +0000 UTC m=+0.082466192 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Nov 28 09:43:46 np0005538513.localdomain podman[268606]: 2025-11-28 09:43:46.889524477 +0000 UTC m=+0.127820447 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=)
Nov 28 09:43:46 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:43:47 np0005538513.localdomain sudo[268714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtsqojhclaoexttdokndgaulvmpgjbvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323026.8170536-1502-149203243624602/AnsiballZ_file.py
Nov 28 09:43:47 np0005538513.localdomain sudo[268714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:47 np0005538513.localdomain python3.9[268716]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:43:47 np0005538513.localdomain sudo[268714]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:47 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:47.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:47 np0005538513.localdomain sudo[268824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvdrzphvjotewatwtjfhclvspujgxvah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323027.4730587-1526-274044744312412/AnsiballZ_modprobe.py
Nov 28 09:43:47 np0005538513.localdomain sudo[268824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:47 np0005538513.localdomain python3.9[268826]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 28 09:43:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5638 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2799820000000001030307) 
Nov 28 09:43:47 np0005538513.localdomain sudo[268824]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:43:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:43:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:43:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:43:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:43:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:43:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:48.156 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:48 np0005538513.localdomain sudo[268934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shpqgxtqdfikolyzjxsbizwvwrhfyohx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323028.1830409-1550-38099109450025/AnsiballZ_stat.py
Nov 28 09:43:48 np0005538513.localdomain sudo[268934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:48 np0005538513.localdomain python3.9[268936]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:48 np0005538513.localdomain sudo[268934]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:48 np0005538513.localdomain sudo[268991]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwvxkhldsdgniwbtzypunwejledxhjdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323028.1830409-1550-38099109450025/AnsiballZ_file.py
Nov 28 09:43:48 np0005538513.localdomain sudo[268991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:49 np0005538513.localdomain python3.9[268993]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:49 np0005538513.localdomain sudo[268991]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:49 np0005538513.localdomain sudo[269101]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nayonhaowjlsagcqagketzzkdfygzviu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323029.489335-1589-265364868287251/AnsiballZ_lineinfile.py
Nov 28 09:43:49 np0005538513.localdomain sudo[269101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:49 np0005538513.localdomain python3.9[269103]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:49 np0005538513.localdomain sudo[269101]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:50 np0005538513.localdomain sudo[269211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdzhbtrvfajbixbqqvbttzvkjevcdmpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323030.3113105-1616-13736260383066/AnsiballZ_dnf.py
Nov 28 09:43:50 np0005538513.localdomain sudo[269211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:43:50.820 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:43:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:43:50.821 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:43:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:43:50.823 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:43:50 np0005538513.localdomain python3.9[269213]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:43:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:51.125 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:52.683 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:52.707 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:43:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:52.708 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:43:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:52.709 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:43:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:52.709 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:43:52 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:52.710 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.100 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.184 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.336 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.337 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.538 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.540 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11972MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.540 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.540 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.748 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.748 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.749 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:43:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:43:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:53.807 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:43:53 np0005538513.localdomain podman[269238]: 2025-11-28 09:43:53.843144754 +0000 UTC m=+0.077918653 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:43:53 np0005538513.localdomain podman[269238]: 2025-11-28 09:43:53.87937512 +0000 UTC m=+0.114148999 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:43:53 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:43:53 np0005538513.localdomain sudo[269211]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:54.247 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:43:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:54.254 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:43:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:54.272 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:43:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:54.274 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:43:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:54.275 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:43:54 np0005538513.localdomain python3.9[269389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:43:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:55.270 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:55.271 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:55 np0005538513.localdomain sudo[269501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhwgxqhnjzaptrlqjiawxjygejcpbgwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323035.3417284-1668-55737327760310/AnsiballZ_file.py
Nov 28 09:43:55 np0005538513.localdomain sudo[269501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:55.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:55 np0005538513.localdomain python3.9[269503]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:55 np0005538513.localdomain sudo[269501]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:56.128 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:56.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:56.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:43:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:56.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:43:56 np0005538513.localdomain sudo[269611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfovkpkushiuanonxhoevtnbllinkxrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323036.39129-1701-84416490479445/AnsiballZ_systemd_service.py
Nov 28 09:43:56 np0005538513.localdomain sudo[269611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:43:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:43:56 np0005538513.localdomain podman[269615]: 2025-11-28 09:43:56.807929615 +0000 UTC m=+0.086026718 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 09:43:56 np0005538513.localdomain systemd[1]: tmp-crun.q1NC72.mount: Deactivated successfully.
Nov 28 09:43:56 np0005538513.localdomain podman[269614]: 2025-11-28 09:43:56.871794766 +0000 UTC m=+0.150375525 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 09:43:56 np0005538513.localdomain podman[269615]: 2025-11-28 09:43:56.889541658 +0000 UTC m=+0.167638781 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:43:56 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:43:56 np0005538513.localdomain podman[269614]: 2025-11-28 09:43:56.914403392 +0000 UTC m=+0.192984151 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 28 09:43:56 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:43:57 np0005538513.localdomain python3.9[269613]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:43:57 np0005538513.localdomain systemd-rc-local-generator[269680]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:43:57 np0005538513.localdomain systemd-sysv-generator[269684]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538513.localdomain sudo[269611]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.528 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.529 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.529 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.529 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.958 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.972 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.973 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:43:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:57.973 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:58 np0005538513.localdomain python3.9[269799]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:43:58 np0005538513.localdomain network[269816]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:43:58 np0005538513.localdomain network[269817]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:43:58 np0005538513.localdomain network[269818]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:43:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:58.187 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:43:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:58.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:43:58.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:43:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:43:59 np0005538513.localdomain podman[269829]: 2025-11-28 09:43:59.857791881 +0000 UTC m=+0.091619571 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:43:59 np0005538513.localdomain podman[269829]: 2025-11-28 09:43:59.8714893 +0000 UTC m=+0.105316940 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:43:59 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:44:00 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.669 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.670 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.674 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3681520-0443-42ae-82ef-63f5ef41d4b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 180, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.670358', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c496ea30-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '2bf10f0406032fad95778c1ac565f464c12c5ac702761631d8f46331caf0d6a4'}]}, 'timestamp': '2025-11-28 09:44:00.675199', '_unique_id': 'cd6d995c4b374e58ae10f8ceef0c5160'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f894e56-a5ac-438d-96e3-a95720e37c28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.678433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c49e563a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '2db20201ee58e9682a4f779ab651af0414031bf04b8f8e603082388c9b423305'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.678433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c49e6d78-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '4c80a147b5bb079b7d3014cbf35c10d0bc2bccbd47ca5af4894b530589361b94'}]}, 'timestamp': '2025-11-28 09:44:00.724330', '_unique_id': 'df23fb4d5a1a4875bdc76b9936d65352'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a679b7-0ab1-49ee-81bd-cc57e3b0db30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.726995', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c49ee8fc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '9b208bad587d4a983038230e97acc9e6fdcd2dbc1cf5b3fd8956c357b9c04080'}]}, 'timestamp': '2025-11-28 09:44:00.727511', '_unique_id': '35e1ba9d9d144d94b8a07afd8877f76d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.729 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '895c0806-d43b-4a0c-8543-ed065502dc38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.729672', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c49f4fcc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '7b34a2303377bccfc263410b743681b83e8bc1b8ffb01bfd2676b34cccb62910'}]}, 'timestamp': '2025-11-28 09:44:00.730187', '_unique_id': '7fb96e37c74540de9a785b5ac460939c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.732 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33c92913-16ec-4256-815d-b51b8a6ce7fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.732357', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c49fb8ae-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '3428b4045689ee37f72fb148f80f0cfd0d0002b9b059d58383318c9fe806ca50'}]}, 'timestamp': '2025-11-28 09:44:00.732825', '_unique_id': 'ab618f8657bd48c6935c623cfed63301'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.734 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 50880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ccfbca5-e3cd-4ec6-8e20-e49134b203dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50880000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:44:00.734975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c4a3d6dc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.93104975, 'message_signature': '29f5d648590cd5575c613da41d5eced1a1982a4fd818c17d82cb583f78dfdc16'}]}, 'timestamp': '2025-11-28 09:44:00.759801', '_unique_id': 'c1ce1fd95c10456a83eab74aa6af75f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8014d84f-e943-45cc-834d-8ab1c3f3e4a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.761953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a64372-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': '9bc840e56cb787671ebfbdc75ad440039245138affe36ffe8026a2ee7ef4303d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.761953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a6542a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': '1d582f3400f71ec6bdfecf08533dfc41bf78f61e14b3e0a364d7f6d7e2d28e04'}]}, 'timestamp': '2025-11-28 09:44:00.776132', '_unique_id': '7cf8530e20ae443080eb33d6f7f52b58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fcb7df0-183f-4084-a34d-2c611cbf1963', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.778308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a6bb4a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': '346cc0d6612f0c9022e63f5dcd32b870c8aa9299ac6ada1a0ae5c69f2fac9da4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.778308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a6ccd4-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': 'a7a66c0fc4fc9e43d058926b41613adf731443e88de922664f021768ce02d365'}]}, 'timestamp': '2025-11-28 09:44:00.779212', '_unique_id': 'dc174723af8341bd91833ffc53c25dcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a27fc27-ad17-492b-b445-1697f21675ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.781358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a7326e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': 'e4bf4d54afa355ac42b92746eb103d7b4d85bae70ae8bd70a40b764dbcf3fedf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.781358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a7427c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': 'e94353dc43081b4398a6daeeba92ebc4e8bd8a1e179ea1a97821cf2a37024f14'}]}, 'timestamp': '2025-11-28 09:44:00.782224', '_unique_id': 'cf38a6f63fb9457ebc1da9b9d46cff1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2307aff1-52c3-4b8b-a252-b22dced9a7be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.784416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a7a9e2-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'c1d3fe4eb9f6690f60b197d273222152af4223b41f25b15fb00f81607321baad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.784416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a7bb26-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'c3b440ac4d74a826f387aac76af8bbcedef6e47c19d00fba13d58dcb11f687f0'}]}, 'timestamp': '2025-11-28 09:44:00.785289', '_unique_id': 'd89a09cde2834e929e027ff5d0928a8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.787 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67d19e6b-7141-4213-9c8c-c1c74bf08f6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.787639', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a8280e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '11ec7822c0fe2d089030de5ad6125f2bb6007e3f78447cc2c6375bf6bcf13b93'}]}, 'timestamp': '2025-11-28 09:44:00.788135', '_unique_id': '7b02fab328994d72be9a2cd0cb81df7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 91 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d80f67c-bede-417d-8843-cf16e965d476', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 91, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.790222', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a88ccc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': 'ab2cb6e580e42562e73a5b3355baa9cb0566fd1a77fad24c262c53a748d15c7c'}]}, 'timestamp': '2025-11-28 09:44:00.790876', '_unique_id': '7c4e7316010145888c8bf582b8f48ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.793 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.793 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18029ac1-c6c6-4c8b-8cd3-6f610542eeba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.792979', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a8f9be-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'b434e829766485e4610c43fffe5db840a02ff1403b99c5122e22a405183370df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.792979', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a909ae-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'a944736ffa854ccbe0c4a8508a05f167ce23aa6ba6d1c96ae688aaa02c37abd8'}]}, 'timestamp': '2025-11-28 09:44:00.793845', '_unique_id': '8685079441354bddbef2f8ad7eb7e59d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.795 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72548d52-276c-4487-bb5c-01b28b0d321a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.796110', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a972e0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '184c7775665b02b1910a3df2cd920b4253bfa74d83c4d02bb067302ceb32a0f0'}]}, 'timestamp': '2025-11-28 09:44:00.796570', '_unique_id': '3fc641693b0543aa9f4e48398d4b695d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16bd6754-9c90-4d27-bbc3-43220673c3c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.798680', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a9d712-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': 'c4662313073a4f387bab679b0dcdaa82fd4eef2b69073243d996774d74089365'}]}, 'timestamp': '2025-11-28 09:44:00.799165', '_unique_id': 'c5f4f20f278a406685df177e70ab521a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37df9261-6ca9-4e15-be7e-d603ea7e6db8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.801274', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4aa3c52-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'b80ae72e00a2b9d664f172ce2cdf43f7521ed0bbdf1b0dbf1628b64bbc68fc3b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.801274', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4aa4c7e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'de43b4de1ec1ae929898eb12c0758de7fee6b4adce2a585ea7b798a90ee73b3a'}]}, 'timestamp': '2025-11-28 09:44:00.802139', '_unique_id': 'd69416f51e88481f9fd2b299281d4aa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.804 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.804 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f640eb14-bea8-4dae-9ae0-1919acf3087c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:44:00.804790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c4aac6ea-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.93104975, 'message_signature': 'a71728d0aeb4b58e2bb30facddf52f1f963f9789aaf31612fc0bc122cfd05070'}]}, 'timestamp': '2025-11-28 09:44:00.805281', '_unique_id': 'a0119a9f9190435fbed9dfe4985f2c4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.807 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.807 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0377595-d671-4240-8423-d02f3a110586', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.807739', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4ab38dc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '885cebf6fd11564aeb74dd5ec3b964d9e41765762e42fa1ee341dbd50187dfdc'}]}, 'timestamp': '2025-11-28 09:44:00.808223', '_unique_id': '1a3cfa32cf9e4ca6b304f3edc4f3cbe8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.810 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9621 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3c0342d-f1df-4b28-af36-ba265779a962', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9621, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.810546', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4abaa38-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '9b05a9f42070b59a6fc09b0acc65b35835d5ec29f105d7250eaf15c0f8d6982c'}]}, 'timestamp': '2025-11-28 09:44:00.811131', '_unique_id': 'f940f747287344c087d2f3e2fcbbf917'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '963003f6-262d-4102-a394-56131c7f2126', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.813329', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4ac13a6-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'bee82a325234a6a44693dfecebbd74c49e2bddbccfe43bdd533aaba7e0112273'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.813329', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4ac247c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '3caaea39f62aedef229bc5e121f850f3f1d19be6d56cf35fb83a4ce3164fdbe6'}]}, 'timestamp': '2025-11-28 09:44:00.814250', '_unique_id': '05533cea7204499e846d7540ad24dd7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da07d8b8-7d3f-4b1b-91f0-ec2ac8b5f334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.816115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4ac7d28-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '52c1be4b61344d0711412091919fb607831dc4a023f90d0049c362cf07e7a91e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.816115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4ac898a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'cbb09714ffe6c0ea4c5f61a4770d27407666f57dbbdb97e28cbba7b4025f04d6'}]}, 'timestamp': '2025-11-28 09:44:00.816711', '_unique_id': 'a45cdde572f541a1b0ee2ff53ff91933'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.818 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:01.159 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:01.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15675 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27D1F50000000001030307) 
Nov 28 09:44:03 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:03.231 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15676 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27D6020000000001030307) 
Nov 28 09:44:03 np0005538513.localdomain sudo[270067]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osfycsylbsdiukgjqswsglucbzgjsecn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323043.5064476-1758-94578440472729/AnsiballZ_systemd_service.py
Nov 28 09:44:03 np0005538513.localdomain sudo[270067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:04 np0005538513.localdomain python3.9[270069]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:04 np0005538513.localdomain sudo[270067]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:04 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5639 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27D9820000000001030307) 
Nov 28 09:44:04 np0005538513.localdomain sudo[270178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyclktwqvvgqriftfkclksvsgfjodiwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323044.2789373-1758-147565718540868/AnsiballZ_systemd_service.py
Nov 28 09:44:04 np0005538513.localdomain sudo[270178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:04 np0005538513.localdomain python3.9[270180]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:04 np0005538513.localdomain sudo[270178]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:05 np0005538513.localdomain sudo[270289]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxrgznmcalepukbwbpdlyhcwmgwqkjkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323045.08868-1758-73208642652044/AnsiballZ_systemd_service.py
Nov 28 09:44:05 np0005538513.localdomain sudo[270289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15677 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27DE030000000001030307) 
Nov 28 09:44:05 np0005538513.localdomain python3.9[270291]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:05 np0005538513.localdomain sudo[270289]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:06 np0005538513.localdomain sudo[270400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mntmlwkezvuqtqcljlelvuokbjjqarsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323045.840954-1758-195621629369016/AnsiballZ_systemd_service.py
Nov 28 09:44:06 np0005538513.localdomain sudo[270400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:44:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:06.162 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:06 np0005538513.localdomain systemd[1]: tmp-crun.LxB72V.mount: Deactivated successfully.
Nov 28 09:44:06 np0005538513.localdomain podman[270403]: 2025-11-28 09:44:06.237050689 +0000 UTC m=+0.103965086 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:44:06 np0005538513.localdomain podman[270403]: 2025-11-28 09:44:06.245903329 +0000 UTC m=+0.112817726 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:44:06 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:44:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62573 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27E1820000000001030307) 
Nov 28 09:44:06 np0005538513.localdomain python3.9[270402]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:06 np0005538513.localdomain sudo[270400]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:06 np0005538513.localdomain sudo[270535]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfvzvetpmrvmzboobwicfliggrhquspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323046.5805397-1758-241262273596682/AnsiballZ_systemd_service.py
Nov 28 09:44:06 np0005538513.localdomain sudo[270535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:07 np0005538513.localdomain python3.9[270537]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:07 np0005538513.localdomain sudo[270535]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:07 np0005538513.localdomain sudo[270646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdykxrpwezrkiulajuajwpjlyuzdmzsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323047.3831415-1758-56828369293650/AnsiballZ_systemd_service.py
Nov 28 09:44:07 np0005538513.localdomain sudo[270646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:08 np0005538513.localdomain python3.9[270648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:44:08 np0005538513.localdomain sudo[270646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:08 np0005538513.localdomain podman[270650]: 2025-11-28 09:44:08.158549034 +0000 UTC m=+0.083476435 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 09:44:08 np0005538513.localdomain podman[270650]: 2025-11-28 09:44:08.17339242 +0000 UTC m=+0.098319851 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 28 09:44:08 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:44:08 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:08.234 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:08 np0005538513.localdomain sudo[270776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzehtkeggubdmjhnpfbhbkjzpppnpgqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323048.2084992-1758-121724955257173/AnsiballZ_systemd_service.py
Nov 28 09:44:08 np0005538513.localdomain sudo[270776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:08 np0005538513.localdomain python3.9[270778]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:08 np0005538513.localdomain sudo[270776]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:09 np0005538513.localdomain sudo[270887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bonfgtaeuhniqczrhzjdywxefrqfqzip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323048.9795122-1758-186804928203506/AnsiballZ_systemd_service.py
Nov 28 09:44:09 np0005538513.localdomain sudo[270887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15678 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27EDC20000000001030307) 
Nov 28 09:44:09 np0005538513.localdomain python3.9[270889]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:44:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:44:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:44:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:44:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:44:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17216 "" "Go-http-client/1.1"
Nov 28 09:44:10 np0005538513.localdomain sudo[270887]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:11.204 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:13 np0005538513.localdomain sudo[270998]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrylvkkzksktwmrmfipelhsotdjjuvnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323052.9441772-1935-35771866677648/AnsiballZ_file.py
Nov 28 09:44:13 np0005538513.localdomain sudo[270998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:13 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:13.274 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:13 np0005538513.localdomain python3.9[271000]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:13 np0005538513.localdomain sudo[270998]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:13 np0005538513.localdomain sudo[271108]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diojckepwggrhrfmmzwaejgibnnonrmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323053.6209311-1935-78423309865272/AnsiballZ_file.py
Nov 28 09:44:13 np0005538513.localdomain sudo[271108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:14 np0005538513.localdomain python3.9[271110]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:14 np0005538513.localdomain sudo[271108]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:14 np0005538513.localdomain sudo[271218]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilxzpdlnnbndluvnuvzhjcmrnhmvtpfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323054.2013803-1935-254243866520233/AnsiballZ_file.py
Nov 28 09:44:14 np0005538513.localdomain sudo[271218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:14 np0005538513.localdomain python3.9[271220]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:14 np0005538513.localdomain sudo[271218]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:14 np0005538513.localdomain sudo[271328]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnfjygkjzzkicrytwhamjqiiumdveuse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323054.7443457-1935-152615818061704/AnsiballZ_file.py
Nov 28 09:44:14 np0005538513.localdomain sudo[271328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:15 np0005538513.localdomain python3.9[271330]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:15 np0005538513.localdomain sudo[271328]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:16.205 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:16 np0005538513.localdomain sudo[271438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zelmnjfyuhhujbkzkbwvkbtaauscmdfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323056.155663-1935-8567177366233/AnsiballZ_file.py
Nov 28 09:44:16 np0005538513.localdomain sudo[271438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:16 np0005538513.localdomain python3.9[271440]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:16 np0005538513.localdomain sudo[271438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:17 np0005538513.localdomain sudo[271548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dilfedeiryiyvtchfyraxyvmpsvzgpmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323056.8307266-1935-279115388805705/AnsiballZ_file.py
Nov 28 09:44:17 np0005538513.localdomain sudo[271548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:44:17 np0005538513.localdomain podman[271551]: 2025-11-28 09:44:17.185793117 +0000 UTC m=+0.079047800 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 28 09:44:17 np0005538513.localdomain podman[271551]: 2025-11-28 09:44:17.202301798 +0000 UTC m=+0.095556511 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:44:17 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:44:17 np0005538513.localdomain python3.9[271550]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:17 np0005538513.localdomain sudo[271548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15679 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB280D830000000001030307) 
Nov 28 09:44:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:44:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:44:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:44:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:44:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:44:18 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:18.274 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:18 np0005538513.localdomain sudo[271679]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvqxuwtmdhbtitbnjyooelgevusnlwsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323058.1635373-1935-100110363255860/AnsiballZ_file.py
Nov 28 09:44:18 np0005538513.localdomain sudo[271679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:18 np0005538513.localdomain python3.9[271681]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:18 np0005538513.localdomain sudo[271679]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:19 np0005538513.localdomain sudo[271789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sosqmjphxsxzjurlkkakqiivsbzfkbcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323058.7548678-1935-168158074522634/AnsiballZ_file.py
Nov 28 09:44:19 np0005538513.localdomain sudo[271789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:19 np0005538513.localdomain python3.9[271791]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:19 np0005538513.localdomain sudo[271789]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:19 np0005538513.localdomain sudo[271899]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keolkhwewsuursbwlitzyeimuyyqbgmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323059.4093285-2106-31714958636966/AnsiballZ_file.py
Nov 28 09:44:19 np0005538513.localdomain sudo[271899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:19 np0005538513.localdomain python3.9[271901]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:19 np0005538513.localdomain sudo[271899]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:20 np0005538513.localdomain sudo[272009]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olubshoapwueejuizdlzfeoupkisdnfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323059.9931042-2106-122807923700532/AnsiballZ_file.py
Nov 28 09:44:20 np0005538513.localdomain sudo[272009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:20 np0005538513.localdomain python3.9[272011]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:20 np0005538513.localdomain sudo[272009]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:20 np0005538513.localdomain sudo[272119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqmyvqzhaixnxgmprjmydbwvvbomnvsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323060.5787802-2106-175016754722261/AnsiballZ_file.py
Nov 28 09:44:20 np0005538513.localdomain sudo[272119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:21 np0005538513.localdomain python3.9[272121]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:21 np0005538513.localdomain sudo[272119]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:21.229 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:21 np0005538513.localdomain sudo[272229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wiinupmwmuwfinyavnohhqznifcqovae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323061.141667-2106-234701088607441/AnsiballZ_file.py
Nov 28 09:44:21 np0005538513.localdomain sudo[272229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:21 np0005538513.localdomain python3.9[272231]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:21 np0005538513.localdomain sudo[272229]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:22 np0005538513.localdomain sudo[272339]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujkwoipkhmfuykvkouazvfhwpjhshqwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323061.7707582-2106-280595061609594/AnsiballZ_file.py
Nov 28 09:44:22 np0005538513.localdomain sudo[272339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:22 np0005538513.localdomain python3.9[272341]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:22 np0005538513.localdomain sudo[272339]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:22 np0005538513.localdomain sudo[272375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:44:22 np0005538513.localdomain sudo[272375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:22 np0005538513.localdomain sudo[272375]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:22 np0005538513.localdomain sudo[272426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:44:22 np0005538513.localdomain sudo[272426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:22 np0005538513.localdomain sudo[272485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyqiqwrpblefbjpxenjoplufdidderaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323062.3703673-2106-12286895217496/AnsiballZ_file.py
Nov 28 09:44:22 np0005538513.localdomain sudo[272485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:22 np0005538513.localdomain python3.9[272487]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:22 np0005538513.localdomain sudo[272485]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538513.localdomain sudo[272665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfqylpebtzdiezctdaxvfpohrlxgpfvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323062.9454868-2106-28164762834720/AnsiballZ_file.py
Nov 28 09:44:23 np0005538513.localdomain sudo[272665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:23 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:23.327 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:23 np0005538513.localdomain podman[272666]: 2025-11-28 09:44:23.335748204 +0000 UTC m=+0.134199885 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:44:23 np0005538513.localdomain python3.9[272679]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:23 np0005538513.localdomain podman[272666]: 2025-11-28 09:44:23.441407015 +0000 UTC m=+0.239858706 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12)
Nov 28 09:44:23 np0005538513.localdomain sudo[272665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538513.localdomain sudo[272426]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538513.localdomain sudo[272785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:44:23 np0005538513.localdomain sudo[272785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:23 np0005538513.localdomain sudo[272785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538513.localdomain sudo[272825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:44:23 np0005538513.localdomain sudo[272825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:23 np0005538513.localdomain sudo[272879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjtzkttwjqauosfjkkouhpobrccxwuri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323063.7226849-2106-276147316621742/AnsiballZ_file.py
Nov 28 09:44:23 np0005538513.localdomain sudo[272879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:44:24 np0005538513.localdomain podman[272882]: 2025-11-28 09:44:24.041784616 +0000 UTC m=+0.075356479 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:44:24 np0005538513.localdomain podman[272882]: 2025-11-28 09:44:24.057266373 +0000 UTC m=+0.090838236 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:44:24 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:44:24 np0005538513.localdomain python3.9[272881]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:24 np0005538513.localdomain sudo[272879]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:24 np0005538513.localdomain sudo[272825]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:24 np0005538513.localdomain sudo[273043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rylqjjttczbtkxmjdhwwffjzupzktpnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323064.4458725-2280-137886689736297/AnsiballZ_command.py
Nov 28 09:44:24 np0005538513.localdomain sudo[273043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:24 np0005538513.localdomain python3.9[273045]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:24 np0005538513.localdomain sudo[273043]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:24 np0005538513.localdomain sudo[273048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:44:24 np0005538513.localdomain sudo[273048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:24 np0005538513.localdomain sudo[273048]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:26 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:26.232 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:26 np0005538513.localdomain python3.9[273173]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:44:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:44:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:44:27 np0005538513.localdomain sudo[273304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cifvjsifxaiptehqilzufrhyslimmwgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323067.6043098-2334-254283076882113/AnsiballZ_systemd_service.py
Nov 28 09:44:27 np0005538513.localdomain podman[273261]: 2025-11-28 09:44:27.856574502 +0000 UTC m=+0.080169757 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:44:27 np0005538513.localdomain sudo[273304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:27 np0005538513.localdomain podman[273263]: 2025-11-28 09:44:27.945778023 +0000 UTC m=+0.166328188 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:44:27 np0005538513.localdomain podman[273261]: 2025-11-28 09:44:27.953486885 +0000 UTC m=+0.177082130 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 09:44:27 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:44:28 np0005538513.localdomain podman[273263]: 2025-11-28 09:44:28.004520477 +0000 UTC m=+0.225070612 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:44:28 np0005538513.localdomain python3.9[273311]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:44:28 np0005538513.localdomain systemd-rc-local-generator[273346]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:44:28 np0005538513.localdomain systemd-sysv-generator[273356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:28.330 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538513.localdomain sudo[273304]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:29 np0005538513.localdomain sudo[273469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnpofgyjtqnnhfgwdyzhtrvqadwxcttm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323069.0764544-2358-137962248279162/AnsiballZ_command.py
Nov 28 09:44:29 np0005538513.localdomain sudo[273469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:29 np0005538513.localdomain python3.9[273471]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:29 np0005538513.localdomain sudo[273469]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:30 np0005538513.localdomain sudo[273580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyoqcjipnkkpmsuzwgrcasbetkwihtre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323069.8747516-2358-138778219644216/AnsiballZ_command.py
Nov 28 09:44:30 np0005538513.localdomain sudo[273580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:44:30 np0005538513.localdomain systemd[1]: tmp-crun.0f2JIp.mount: Deactivated successfully.
Nov 28 09:44:30 np0005538513.localdomain podman[273583]: 2025-11-28 09:44:30.26952364 +0000 UTC m=+0.092708297 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 09:44:30 np0005538513.localdomain podman[273583]: 2025-11-28 09:44:30.283434256 +0000 UTC m=+0.106618923 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:44:30 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:44:30 np0005538513.localdomain python3.9[273582]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:30 np0005538513.localdomain sudo[273580]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:30 np0005538513.localdomain sudo[273713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxlejrpfkeunfyzgkwherlhmezfxdnyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323070.5410793-2358-63517656890528/AnsiballZ_command.py
Nov 28 09:44:30 np0005538513.localdomain sudo[273713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:31 np0005538513.localdomain python3.9[273715]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:31 np0005538513.localdomain sudo[273713]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:31 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:31.260 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:31 np0005538513.localdomain sudo[273824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oceynignaqwgihcsaovwougpwuzdexqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323071.1913285-2358-13640535788912/AnsiballZ_command.py
Nov 28 09:44:31 np0005538513.localdomain sudo[273824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:31 np0005538513.localdomain python3.9[273826]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:31 np0005538513.localdomain sudo[273824]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:32 np0005538513.localdomain sudo[273935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bontxosoakljyngryzzhtfpwhhycasrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323071.8138926-2358-20693300777850/AnsiballZ_command.py
Nov 28 09:44:32 np0005538513.localdomain sudo[273935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:32 np0005538513.localdomain python3.9[273937]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:32 np0005538513.localdomain sudo[273935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7598 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2847250000000001030307) 
Nov 28 09:44:32 np0005538513.localdomain sudo[274046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbalwnjbvhhfpcxnjkifujrmnwlvhjvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323072.4951677-2358-245202597436242/AnsiballZ_command.py
Nov 28 09:44:32 np0005538513.localdomain sudo[274046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:32 np0005538513.localdomain python3.9[274048]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:32 np0005538513.localdomain sudo[274046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:33 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:33.371 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:33 np0005538513.localdomain sudo[274157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcodrrgstsyeoagjxbeitxlmdmyksszz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323073.0987356-2358-53708933196687/AnsiballZ_command.py
Nov 28 09:44:33 np0005538513.localdomain sudo[274157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7599 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB284B420000000001030307) 
Nov 28 09:44:33 np0005538513.localdomain python3.9[274159]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:33 np0005538513.localdomain sudo[274157]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:33 np0005538513.localdomain sudo[274268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwahsmcvfrwmtjzefwaqamkyidlbkqxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323073.6980922-2358-67374866223213/AnsiballZ_command.py
Nov 28 09:44:33 np0005538513.localdomain sudo[274268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15680 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB284D820000000001030307) 
Nov 28 09:44:34 np0005538513.localdomain python3.9[274270]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:34 np0005538513.localdomain sudo[274268]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7600 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2853420000000001030307) 
Nov 28 09:44:36 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:36.264 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5640 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2857830000000001030307) 
Nov 28 09:44:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:44:36 np0005538513.localdomain podman[274343]: 2025-11-28 09:44:36.86779723 +0000 UTC m=+0.105020370 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:44:36 np0005538513.localdomain sudo[274390]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctfgigowvsjoysdqkzeiekcucprtafak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323076.6165614-2565-10167884893921/AnsiballZ_file.py
Nov 28 09:44:36 np0005538513.localdomain sudo[274390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:36 np0005538513.localdomain podman[274343]: 2025-11-28 09:44:36.90840491 +0000 UTC m=+0.145628000 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:44:36 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:44:37 np0005538513.localdomain python3.9[274402]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:37 np0005538513.localdomain sudo[274390]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:37 np0005538513.localdomain sudo[274510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dodsmhnuthtntzttfrgmshyyioesmlzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323077.2237241-2565-45879175447598/AnsiballZ_file.py
Nov 28 09:44:37 np0005538513.localdomain sudo[274510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:37 np0005538513.localdomain python3.9[274512]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:37 np0005538513.localdomain sudo[274510]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:38 np0005538513.localdomain sudo[274620]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izfcmojdprssjfkenmlwxkphywokmlzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323077.8106933-2565-83915122787122/AnsiballZ_file.py
Nov 28 09:44:38 np0005538513.localdomain sudo[274620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:38 np0005538513.localdomain python3.9[274622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:38 np0005538513.localdomain sudo[274620]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:38 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:38.373 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:44:38 np0005538513.localdomain sudo[274730]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czupqmjzxgbeeoxaybrhqlmnmisjsisw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323078.518903-2631-108962802957115/AnsiballZ_file.py
Nov 28 09:44:38 np0005538513.localdomain sudo[274730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:38 np0005538513.localdomain podman[274732]: 2025-11-28 09:44:38.852140984 +0000 UTC m=+0.084053614 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:44:38 np0005538513.localdomain podman[274732]: 2025-11-28 09:44:38.861228801 +0000 UTC m=+0.093141391 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 28 09:44:38 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:44:38 np0005538513.localdomain python3.9[274741]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:39 np0005538513.localdomain sudo[274730]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7601 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2863020000000001030307) 
Nov 28 09:44:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:44:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:44:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:44:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:44:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:44:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17223 "" "Go-http-client/1.1"
Nov 28 09:44:40 np0005538513.localdomain sudo[274859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riyittxlhgkiiygykwkjglmzdbzeicla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323079.1590009-2631-237391001388217/AnsiballZ_file.py
Nov 28 09:44:40 np0005538513.localdomain sudo[274859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:40 np0005538513.localdomain python3.9[274861]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:40 np0005538513.localdomain sudo[274859]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:41 np0005538513.localdomain sudo[274969]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvwentvbbwhkckndaykfrofaaacxfssj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323080.8430815-2631-220462515397585/AnsiballZ_file.py
Nov 28 09:44:41 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:41.301 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:41 np0005538513.localdomain sudo[274969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:41 np0005538513.localdomain python3.9[274971]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:41 np0005538513.localdomain sudo[274969]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:42 np0005538513.localdomain sudo[275079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izzakejevtwtcbbzfueyragizyuriryk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323082.4138746-2631-227316672502199/AnsiballZ_file.py
Nov 28 09:44:42 np0005538513.localdomain sudo[275079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:42 np0005538513.localdomain python3.9[275081]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:42 np0005538513.localdomain sudo[275079]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:43 np0005538513.localdomain sudo[275189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esxvybuhbjdlefbucjxsetsniooqzslu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323083.0294182-2631-139006272155900/AnsiballZ_file.py
Nov 28 09:44:43 np0005538513.localdomain sudo[275189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:43 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:43.403 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:43 np0005538513.localdomain python3.9[275191]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:43 np0005538513.localdomain sudo[275189]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:43 np0005538513.localdomain sudo[275299]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndkzqdgivkcnqcmjuuxfvntleprggsqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323083.6332006-2631-70423798355990/AnsiballZ_file.py
Nov 28 09:44:43 np0005538513.localdomain sudo[275299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:44 np0005538513.localdomain python3.9[275301]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:44 np0005538513.localdomain sudo[275299]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:44 np0005538513.localdomain sudo[275409]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuqhorhxdxhncdovsdjfpngcvnjmhdhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323084.2592497-2631-178929543930345/AnsiballZ_file.py
Nov 28 09:44:44 np0005538513.localdomain sudo[275409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:44 np0005538513.localdomain python3.9[275411]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:44 np0005538513.localdomain sudo[275409]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:46 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:46.303 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:44:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7602 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2883820000000001030307) 
Nov 28 09:44:47 np0005538513.localdomain podman[275429]: 2025-11-28 09:44:47.852191507 +0000 UTC m=+0.080073003 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=)
Nov 28 09:44:47 np0005538513.localdomain podman[275429]: 2025-11-28 09:44:47.894581596 +0000 UTC m=+0.122463102 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Nov 28 09:44:47 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:44:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:44:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:44:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:44:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:44:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:44:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:44:48 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:48.404 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:49 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:49.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:50 np0005538513.localdomain sudo[275539]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbpcxhhrldhqkdvxorkuokkvhkawgviw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323089.9497418-2956-260195975538534/AnsiballZ_getent.py
Nov 28 09:44:50 np0005538513.localdomain sudo[275539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:50 np0005538513.localdomain python3.9[275541]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 28 09:44:50 np0005538513.localdomain sudo[275539]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:44:50.823 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:44:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:44:50.824 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:44:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:44:50.826 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:44:51 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:51.334 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:51 np0005538513.localdomain sshd[275560]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:44:51 np0005538513.localdomain sshd[275560]: Accepted publickey for zuul from 192.168.122.30 port 45126 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:44:51 np0005538513.localdomain systemd-logind[764]: New session 60 of user zuul.
Nov 28 09:44:51 np0005538513.localdomain systemd[1]: Started Session 60 of User zuul.
Nov 28 09:44:51 np0005538513.localdomain sshd[275560]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:44:51 np0005538513.localdomain sshd[275563]: Received disconnect from 192.168.122.30 port 45126:11: disconnected by user
Nov 28 09:44:51 np0005538513.localdomain sshd[275563]: Disconnected from user zuul 192.168.122.30 port 45126
Nov 28 09:44:51 np0005538513.localdomain sshd[275560]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:44:51 np0005538513.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Nov 28 09:44:51 np0005538513.localdomain systemd-logind[764]: Session 60 logged out. Waiting for processes to exit.
Nov 28 09:44:51 np0005538513.localdomain systemd-logind[764]: Removed session 60.
Nov 28 09:44:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:53.448 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:53 np0005538513.localdomain python3.9[275671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:53 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:53.676 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:54 np0005538513.localdomain python3.9[275757]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323093.0572686-3037-135414968884635/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:54.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:54.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:44:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:54.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:44:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:54.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:44:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:54.701 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:44:54 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:54.702 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:44:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:44:54 np0005538513.localdomain python3.9[275865]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:54 np0005538513.localdomain podman[275867]: 2025-11-28 09:44:54.84593431 +0000 UTC m=+0.078574899 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:44:54 np0005538513.localdomain podman[275867]: 2025-11-28 09:44:54.879087446 +0000 UTC m=+0.111727995 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:44:54 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.164 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.250 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.251 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.449 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.451 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12117MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.451 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.451 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.542 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.543 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.543 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:44:55 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:55.587 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:44:55 np0005538513.localdomain python3.9[275965]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:56.080 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:44:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:56.087 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:44:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:56.111 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:44:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:56.114 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:44:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:56.114 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:44:56 np0005538513.localdomain python3.9[276093]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:56 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:56.336 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:56 np0005538513.localdomain python3.9[276181]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323095.7662373-3037-63201533351861/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:57.110 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:57.143 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:57 np0005538513.localdomain python3.9[276289]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:57.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:57.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:44:57 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:57.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:44:57 np0005538513.localdomain python3.9[276375]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323096.9453437-3037-205330200693193/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=534005c01c7af821d962fad87e973f668cecbdc9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:58.452 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:44:58 np0005538513.localdomain python3.9[276483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:58.675 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:44:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:58.675 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:44:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:58.676 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:44:58 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:58.676 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:44:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:44:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:44:58 np0005538513.localdomain podman[276521]: 2025-11-28 09:44:58.863817047 +0000 UTC m=+0.091729832 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:44:58 np0005538513.localdomain podman[276523]: 2025-11-28 09:44:58.917070409 +0000 UTC m=+0.141234449 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:44:58 np0005538513.localdomain podman[276523]: 2025-11-28 09:44:58.926282281 +0000 UTC m=+0.150446291 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:44:58 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:44:58 np0005538513.localdomain podman[276521]: 2025-11-28 09:44:58.982857485 +0000 UTC m=+0.210770250 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:44:58 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:44:59 np0005538513.localdomain python3.9[276609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323098.084483-3037-210506789598648/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:59.230 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:44:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:59.260 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:44:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:59.260 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:44:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:59.261 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:59.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:59.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:59 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:44:59.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:44:59 np0005538513.localdomain python3.9[276717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:45:00 np0005538513.localdomain python3.9[276803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323099.291147-3037-86133503458409/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:45:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:45:00 np0005538513.localdomain podman[276875]: 2025-11-28 09:45:00.846276859 +0000 UTC m=+0.083178440 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 28 09:45:00 np0005538513.localdomain podman[276875]: 2025-11-28 09:45:00.861271038 +0000 UTC m=+0.098172589 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:00 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:45:00 np0005538513.localdomain sudo[276930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caxnlyhncqzdijdangomnzcrjpujvpwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323100.6264932-3286-29358835376066/AnsiballZ_file.py
Nov 28 09:45:00 np0005538513.localdomain sudo[276930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:01 np0005538513.localdomain python3.9[276932]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:01 np0005538513.localdomain sudo[276930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:01 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:01.396 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:01 np0005538513.localdomain sudo[277040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hggifqtjxvfonrzzglrmheclhlpglebq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323101.4363856-3310-128834115177852/AnsiballZ_copy.py
Nov 28 09:45:01 np0005538513.localdomain sudo[277040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:01 np0005538513.localdomain python3.9[277042]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:01 np0005538513.localdomain sudo[277040]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2452 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28BC550000000001030307) 
Nov 28 09:45:02 np0005538513.localdomain sudo[277150]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kryksdjqspvdeohiifjbdphsyosjnszi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323102.1488934-3334-208767716092165/AnsiballZ_stat.py
Nov 28 09:45:02 np0005538513.localdomain sudo[277150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:02 np0005538513.localdomain python3.9[277152]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:02 np0005538513.localdomain sudo[277150]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:03 np0005538513.localdomain sudo[277262]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfndsaiputxqonpszwqonpafazppnyyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323103.0074105-3361-76436555466630/AnsiballZ_file.py
Nov 28 09:45:03 np0005538513.localdomain sudo[277262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2453 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28C0430000000001030307) 
Nov 28 09:45:03 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:03.478 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:03 np0005538513.localdomain python3.9[277264]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:03 np0005538513.localdomain sudo[277262]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:03 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:03.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:45:04 np0005538513.localdomain python3.9[277372]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:04 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7603 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28C3820000000001030307) 
Nov 28 09:45:04 np0005538513.localdomain python3.9[277482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:45:05 np0005538513.localdomain python3.9[277537]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:45:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2454 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28C8420000000001030307) 
Nov 28 09:45:05 np0005538513.localdomain python3.9[277645]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:45:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15681 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28CB820000000001030307) 
Nov 28 09:45:06 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:06.398 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:06 np0005538513.localdomain python3.9[277700]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:45:07 np0005538513.localdomain sudo[277808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agxomupxhkwuvxxbdmpwkcyshbyqcdhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323106.8751507-3490-184438537882051/AnsiballZ_container_config_data.py
Nov 28 09:45:07 np0005538513.localdomain sudo[277808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:45:07 np0005538513.localdomain podman[277811]: 2025-11-28 09:45:07.281444301 +0000 UTC m=+0.086959405 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:45:07 np0005538513.localdomain podman[277811]: 2025-11-28 09:45:07.290181609 +0000 UTC m=+0.095696673 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:45:07 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:45:07 np0005538513.localdomain python3.9[277810]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 28 09:45:07 np0005538513.localdomain sudo[277808]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:08 np0005538513.localdomain sudo[277940]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctqjaqczgvkjpwwdlzcpqxqgmemfpuqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323107.7568014-3517-94881817842595/AnsiballZ_container_config_hash.py
Nov 28 09:45:08 np0005538513.localdomain sudo[277940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:08 np0005538513.localdomain python3.9[277942]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:45:08 np0005538513.localdomain sudo[277940]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:08 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:08.481 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:08 np0005538513.localdomain sudo[278050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obgwyqhaacehjvlwrjuxsnpukmxhmvtx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764323108.715107-3547-41157719706204/AnsiballZ_edpm_container_manage.py
Nov 28 09:45:08 np0005538513.localdomain sudo[278050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:45:09 np0005538513.localdomain systemd[1]: tmp-crun.kG4RNn.mount: Deactivated successfully.
Nov 28 09:45:09 np0005538513.localdomain podman[278052]: 2025-11-28 09:45:09.053741534 +0000 UTC m=+0.068292864 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:09 np0005538513.localdomain podman[278052]: 2025-11-28 09:45:09.06242013 +0000 UTC m=+0.076971500 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:45:09 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:45:09 np0005538513.localdomain python3[278053]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:45:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2455 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28D8020000000001030307) 
Nov 28 09:45:09 np0005538513.localdomain python3[278053]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",
                                                                    "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:36:07.10279245Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211782527,
                                                                    "VirtualSize": 1211782527,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",
                                                                              "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.237322707Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.688296939Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:21.069367201Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:46.989417927Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:54.535170465Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:24.828469773Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.089054875Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.610811813Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.099939071Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.100032994Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:14.509959241Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:45:09 np0005538513.localdomain sudo[278050]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:45:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:45:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:45:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:45:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:45:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17227 "" "Go-http-client/1.1"
Nov 28 09:45:10 np0005538513.localdomain sudo[278239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggbekabrafaqzxlsfcifsjaieelewaud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323109.9128616-3571-77044790830403/AnsiballZ_stat.py
Nov 28 09:45:10 np0005538513.localdomain sudo[278239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:10 np0005538513.localdomain python3.9[278241]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:10 np0005538513.localdomain sudo[278239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:11 np0005538513.localdomain sudo[278351]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfowzrcgmttjpdsnyicqxniclablmnkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323111.0888188-3607-55387168614734/AnsiballZ_container_config_data.py
Nov 28 09:45:11 np0005538513.localdomain sudo[278351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:11 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:11.444 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:11 np0005538513.localdomain python3.9[278353]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 28 09:45:11 np0005538513.localdomain sudo[278351]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:12 np0005538513.localdomain sudo[278461]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmcdbowqkltuddilxxazpqlkadcdifva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323111.8830245-3634-73066240316936/AnsiballZ_container_config_hash.py
Nov 28 09:45:12 np0005538513.localdomain sudo[278461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:12 np0005538513.localdomain python3.9[278463]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:45:12 np0005538513.localdomain sudo[278461]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:13 np0005538513.localdomain sudo[278571]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gneejxyvfifipyytbbujqrdjzgrqxskz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764323112.8939447-3664-246052998846518/AnsiballZ_edpm_container_manage.py
Nov 28 09:45:13 np0005538513.localdomain sudo[278571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:13 np0005538513.localdomain python3[278573]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:45:13 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:13.512 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:13 np0005538513.localdomain python3[278573]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",
                                                                    "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:36:07.10279245Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211782527,
                                                                    "VirtualSize": 1211782527,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",
                                                                              "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.237322707Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.688296939Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:21.069367201Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:46.989417927Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:54.535170465Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:24.828469773Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.089054875Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.610811813Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.099939071Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.100032994Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:14.509959241Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:45:13 np0005538513.localdomain sudo[278571]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:14 np0005538513.localdomain sudo[278745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlytktpbkmkayxcsuqywxvdibjmhjoti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323114.036787-3688-167448759015612/AnsiballZ_stat.py
Nov 28 09:45:14 np0005538513.localdomain sudo[278745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:14 np0005538513.localdomain python3.9[278747]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:14 np0005538513.localdomain sudo[278745]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:15 np0005538513.localdomain sudo[278857]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndgvfosbncgwaknvgcrjkfrsibypkagp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323114.9032-3715-174997494084774/AnsiballZ_file.py
Nov 28 09:45:15 np0005538513.localdomain sudo[278857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:15 np0005538513.localdomain python3.9[278859]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:15 np0005538513.localdomain sudo[278857]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:15 np0005538513.localdomain sudo[278966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovikjwvmwozysupztgttlkcmohdnumky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323115.443625-3715-193584405947285/AnsiballZ_copy.py
Nov 28 09:45:15 np0005538513.localdomain sudo[278966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:16 np0005538513.localdomain python3.9[278968]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764323115.443625-3715-193584405947285/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:16 np0005538513.localdomain sudo[278966]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:16 np0005538513.localdomain sudo[279021]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enytbaprceejkuxobwkpbovhlbupjwas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323115.443625-3715-193584405947285/AnsiballZ_systemd.py
Nov 28 09:45:16 np0005538513.localdomain sudo[279021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:16 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:16.444 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:16 np0005538513.localdomain python3.9[279023]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:45:16 np0005538513.localdomain sudo[279021]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2456 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28F7830000000001030307) 
Nov 28 09:45:17 np0005538513.localdomain python3.9[279133]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:45:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:45:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:45:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:45:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:45:18 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:18.514 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:18 np0005538513.localdomain python3.9[279241]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:45:18 np0005538513.localdomain podman[279259]: 2025-11-28 09:45:18.847239295 +0000 UTC m=+0.080381535 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:45:18 np0005538513.localdomain podman[279259]: 2025-11-28 09:45:18.865966018 +0000 UTC m=+0.099108288 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:45:18 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:45:19 np0005538513.localdomain python3.9[279369]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:20 np0005538513.localdomain sudo[279477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omydhvbzrmmjfzgxkxpmtddddpgqivur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323119.991877-3883-104563857107707/AnsiballZ_podman_container.py
Nov 28 09:45:20 np0005538513.localdomain sudo[279477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:20 np0005538513.localdomain python3.9[279479]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:45:20 np0005538513.localdomain sudo[279477]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:20 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 119.8 (399 of 333 items), suggesting rotation.
Nov 28 09:45:20 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:45:20 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:45:20 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:45:21 np0005538513.localdomain sudo[279611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fptjubsupdbaahoelqzxpauvzrxibmtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323121.1151261-3907-2260335144826/AnsiballZ_systemd.py
Nov 28 09:45:21 np0005538513.localdomain sudo[279611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:21 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:21.446 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:21 np0005538513.localdomain python3.9[279613]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:45:22 np0005538513.localdomain systemd[1]: Stopping nova_compute container...
Nov 28 09:45:22 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:22.826 228337 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Nov 28 09:45:23 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:23.546 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:24.143 228337 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 28 09:45:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:24.145 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:45:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:24.146 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:45:24 np0005538513.localdomain nova_compute[228333]: 2025-11-28 09:45:24.146 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Deactivated successfully.
Nov 28 09:45:24 np0005538513.localdomain virtqemud[201490]: End of file while reading data: Input/output error
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Consumed 20.296s CPU time.
Nov 28 09:45:24 np0005538513.localdomain podman[279617]: 2025-11-28 09:45:24.513381954 +0000 UTC m=+1.757115059 container died 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf-userdata-shm.mount: Deactivated successfully.
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c-merged.mount: Deactivated successfully.
Nov 28 09:45:24 np0005538513.localdomain podman[279617]: 2025-11-28 09:45:24.676226093 +0000 UTC m=+1.919959158 container cleanup 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:45:24 np0005538513.localdomain podman[279617]: nova_compute
Nov 28 09:45:24 np0005538513.localdomain podman[279656]: error opening file `/run/crun/11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf/status`: No such file or directory
Nov 28 09:45:24 np0005538513.localdomain podman[279643]: 2025-11-28 09:45:24.762132955 +0000 UTC m=+0.055909874 container cleanup 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:24 np0005538513.localdomain podman[279643]: nova_compute
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: Stopped nova_compute container.
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: Starting nova_compute container...
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:45:24 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:45:24 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:24 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:24 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:24 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:24 np0005538513.localdomain podman[279658]: 2025-11-28 09:45:24.913878615 +0000 UTC m=+0.120021669 container init 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:45:24 np0005538513.localdomain podman[279658]: 2025-11-28 09:45:24.924568392 +0000 UTC m=+0.130711426 container start 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:45:24 np0005538513.localdomain podman[279658]: nova_compute
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: + sudo -E kolla_set_configs
Nov 28 09:45:24 np0005538513.localdomain systemd[1]: Started nova_compute container.
Nov 28 09:45:24 np0005538513.localdomain sudo[279611]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Validating config file
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying service configuration files
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:45:24 np0005538513.localdomain podman[279676]: 2025-11-28 09:45:24.98355735 +0000 UTC m=+0.069747108 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /etc/ceph
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Creating directory /etc/ceph
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Writing out command to execute
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: ++ cat /run_command
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: + CMD=nova-compute
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: + ARGS=
Nov 28 09:45:24 np0005538513.localdomain nova_compute[279673]: + sudo kolla_copy_cacerts
Nov 28 09:45:25 np0005538513.localdomain nova_compute[279673]: + [[ ! -n '' ]]
Nov 28 09:45:25 np0005538513.localdomain nova_compute[279673]: + . kolla_extend_start
Nov 28 09:45:25 np0005538513.localdomain nova_compute[279673]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 09:45:25 np0005538513.localdomain nova_compute[279673]: Running command: 'nova-compute'
Nov 28 09:45:25 np0005538513.localdomain nova_compute[279673]: + umask 0022
Nov 28 09:45:25 np0005538513.localdomain nova_compute[279673]: + exec nova-compute
Nov 28 09:45:25 np0005538513.localdomain podman[279676]: 2025-11-28 09:45:25.01946105 +0000 UTC m=+0.105650868 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:45:25 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:45:25 np0005538513.localdomain sudo[279725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:45:25 np0005538513.localdomain sudo[279725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:45:25 np0005538513.localdomain sudo[279725]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:25 np0005538513.localdomain sudo[279743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:45:25 np0005538513.localdomain sudo[279743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:45:25 np0005538513.localdomain sudo[279743]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:26 np0005538513.localdomain sudo[279793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:45:26 np0005538513.localdomain sudo[279793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:45:26 np0005538513.localdomain sudo[279793]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:26.610 279685 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:45:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:26.611 279685 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:45:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:26.611 279685 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:45:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:26.611 279685 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 09:45:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:26.738 279685 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:45:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:26.762 279685 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:45:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:26.762 279685 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.163 279685 INFO nova.virt.driver [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.295 279685 INFO nova.compute.provider_config [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.305 279685 DEBUG oslo_concurrency.lockutils [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.306 279685 DEBUG oslo_concurrency.lockutils [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.306 279685 DEBUG oslo_concurrency.lockutils [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.306 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] console_host                   = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] host                           = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.382 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.382 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.396 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.396 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.396 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.396 279685 WARNING oslo_config.cfg [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: and ``live_migration_inbound_addr`` respectively.
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: ).  Its value may be silently ignored in the future.
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_secret_uuid        = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.414 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.414 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.414 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.417 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.417 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.417 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.418 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.418 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.418 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.420 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.420 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.420 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.421 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.421 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.421 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.423 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.423 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.423 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.425 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.425 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.425 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.426 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.426 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.426 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.428 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.428 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.428 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.429 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.429 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.429 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.430 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.430 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.430 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.431 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.431 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.431 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.432 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.432 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.432 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.433 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.433 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.433 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.435 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.435 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.435 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.436 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.436 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.436 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.437 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.437 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.437 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.439 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.439 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.439 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.441 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.441 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.441 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.442 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.442 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.443 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.443 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.443 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.445 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.445 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.445 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.446 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.446 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.446 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.448 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.448 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.450 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.450 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.450 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.452 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.452 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.452 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.453 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.453 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.453 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.454 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.454 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.454 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.456 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.456 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.456 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.457 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.457 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.457 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.459 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.459 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.459 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.461 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.461 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.461 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.462 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.462 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.462 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.463 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.463 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.463 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.464 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.464 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.464 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.465 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.465 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.467 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.467 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.467 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.468 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.468 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.468 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.499 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.500 279685 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.517 279685 INFO nova.virt.node [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.518 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.518 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.519 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.519 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.532 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f37e37ce9a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.536 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f37e37ce9a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.537 279685 INFO nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Connection event '1' reason 'None'
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.542 279685 INFO nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <host>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <uuid>eb468aed-e0e9-4528-988f-9267a3530b7a</uuid>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <arch>x86_64</arch>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model>EPYC-Rome-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <vendor>AMD</vendor>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <microcode version='16777317'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <signature family='23' model='49' stepping='0'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='x2apic'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='tsc-deadline'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='osxsave'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='hypervisor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='tsc_adjust'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='spec-ctrl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='stibp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='arch-capabilities'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='cmp_legacy'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='topoext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='virt-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='lbrv'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='tsc-scale'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='vmcb-clean'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='pause-filter'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='pfthreshold'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='svme-addr-chk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='rdctl-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='mds-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature name='pschange-mc-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <pages unit='KiB' size='4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <pages unit='KiB' size='2048'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <pages unit='KiB' size='1048576'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <power_management>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <suspend_mem/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <suspend_disk/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <suspend_hybrid/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </power_management>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <iommu support='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <migration_features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <live/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <uri_transports>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <uri_transport>tcp</uri_transport>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <uri_transport>rdma</uri_transport>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </uri_transports>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </migration_features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <topology>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <cells num='1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <cell id='0'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           <memory unit='KiB'>16116612</memory>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           <distances>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <sibling id='0' value='10'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           </distances>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           <cpus num='8'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:           </cpus>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         </cell>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </cells>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </topology>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <cache>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </cache>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <secmodel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model>selinux</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <doi>0</doi>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </secmodel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <secmodel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model>dac</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <doi>0</doi>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </secmodel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </host>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <guest>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <os_type>hvm</os_type>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <arch name='i686'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <wordsize>32</wordsize>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <domain type='qemu'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <domain type='kvm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </arch>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <pae/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <nonpae/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <acpi default='on' toggle='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <apic default='on' toggle='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <cpuselection/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <deviceboot/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <externalSnapshot/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </guest>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <guest>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <os_type>hvm</os_type>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <arch name='x86_64'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <wordsize>64</wordsize>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <domain type='qemu'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <domain type='kvm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </arch>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <acpi default='on' toggle='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <apic default='on' toggle='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <cpuselection/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <deviceboot/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <externalSnapshot/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </guest>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: </capabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.549 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.554 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: <domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <domain>kvm</domain>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <arch>i686</arch>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <vcpu max='1024'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <iothreads supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <os supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='firmware'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <loader supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>rom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pflash</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='readonly'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>yes</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='secure'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </loader>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </os>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='maximumMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <vendor>AMD</vendor>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='succor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='custom' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-128'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-256'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-512'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <memoryBacking supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='sourceType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>anonymous</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>memfd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </memoryBacking>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <disk supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='diskDevice'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>disk</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cdrom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>floppy</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>lun</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>fdc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>sata</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <graphics supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vnc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egl-headless</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </graphics>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <video supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='modelType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vga</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cirrus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>none</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>bochs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ramfb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </video>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hostdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='mode'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>subsystem</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='startupPolicy'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>mandatory</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>requisite</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>optional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='subsysType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pci</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='capsType'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='pciBackend'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hostdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <rng supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>random</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </rng>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <filesystem supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='driverType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>path</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>handle</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtiofs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </filesystem>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <tpm supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-tis</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-crb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emulator</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>external</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendVersion'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>2.0</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </tpm>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <redirdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </redirdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <channel supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </channel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <crypto supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </crypto>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <interface supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>passt</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </interface>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <panic supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>isa</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>hyperv</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </panic>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <console supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>null</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dev</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pipe</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stdio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>udp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tcp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu-vdagent</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </console>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <gic supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <genid supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backup supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <async-teardown supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <ps2 supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sev supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sgx supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hyperv supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='features'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>relaxed</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vapic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>spinlocks</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vpindex</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>runtime</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>synic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stimer</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reset</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vendor_id</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>frequencies</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reenlightenment</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tlbflush</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ipi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>avic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emsr_bitmap</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>xmm_input</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hyperv>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <launchSecurity supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='sectype'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tdx</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </launchSecurity>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: </domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.560 279685 DEBUG nova.virt.libvirt.volume.mount [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.564 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: <domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <domain>kvm</domain>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <arch>i686</arch>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <vcpu max='240'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <iothreads supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <os supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='firmware'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <loader supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>rom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pflash</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='readonly'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>yes</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='secure'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </loader>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </os>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='maximumMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <vendor>AMD</vendor>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='succor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='custom' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-128'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-256'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-512'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <memoryBacking supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='sourceType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>anonymous</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>memfd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </memoryBacking>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <disk supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='diskDevice'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>disk</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cdrom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>floppy</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>lun</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ide</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>fdc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>sata</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <graphics supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vnc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egl-headless</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </graphics>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <video supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='modelType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vga</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cirrus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>none</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>bochs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ramfb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </video>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hostdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='mode'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>subsystem</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='startupPolicy'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>mandatory</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>requisite</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>optional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='subsysType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pci</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='capsType'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='pciBackend'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hostdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <rng supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>random</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </rng>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <filesystem supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='driverType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>path</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>handle</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtiofs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </filesystem>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <tpm supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-tis</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-crb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emulator</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>external</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendVersion'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>2.0</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </tpm>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <redirdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </redirdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <channel supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </channel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <crypto supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </crypto>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <interface supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>passt</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </interface>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <panic supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>isa</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>hyperv</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </panic>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <console supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>null</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dev</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pipe</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stdio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>udp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tcp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu-vdagent</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </console>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <gic supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <genid supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backup supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <async-teardown supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <ps2 supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sev supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sgx supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hyperv supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='features'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>relaxed</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vapic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>spinlocks</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vpindex</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>runtime</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>synic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stimer</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reset</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vendor_id</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>frequencies</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reenlightenment</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tlbflush</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ipi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>avic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emsr_bitmap</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>xmm_input</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hyperv>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <launchSecurity supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='sectype'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tdx</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </launchSecurity>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: </domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.598 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.603 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: <domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <domain>kvm</domain>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <arch>x86_64</arch>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <vcpu max='1024'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <iothreads supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <os supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='firmware'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>efi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <loader supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>rom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pflash</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='readonly'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>yes</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='secure'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>yes</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </loader>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </os>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='maximumMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <vendor>AMD</vendor>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='succor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='custom' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-128'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-256'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-512'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <memoryBacking supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='sourceType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>anonymous</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>memfd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </memoryBacking>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <disk supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='diskDevice'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>disk</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cdrom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>floppy</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>lun</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>fdc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>sata</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <graphics supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vnc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egl-headless</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </graphics>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <video supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='modelType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vga</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cirrus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>none</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>bochs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ramfb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </video>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hostdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='mode'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>subsystem</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='startupPolicy'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>mandatory</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>requisite</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>optional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='subsysType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pci</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='capsType'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='pciBackend'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hostdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <rng supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>random</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </rng>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <filesystem supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='driverType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>path</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>handle</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtiofs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </filesystem>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <tpm supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-tis</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-crb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emulator</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>external</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendVersion'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>2.0</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </tpm>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <redirdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </redirdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <channel supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </channel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <crypto supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </crypto>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <interface supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>passt</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </interface>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <panic supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>isa</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>hyperv</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </panic>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <console supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>null</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dev</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pipe</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stdio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>udp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tcp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu-vdagent</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </console>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <gic supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <genid supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backup supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <async-teardown supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <ps2 supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sev supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sgx supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hyperv supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='features'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>relaxed</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vapic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>spinlocks</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vpindex</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>runtime</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>synic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stimer</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reset</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vendor_id</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>frequencies</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reenlightenment</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tlbflush</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ipi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>avic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emsr_bitmap</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>xmm_input</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hyperv>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <launchSecurity supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='sectype'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tdx</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </launchSecurity>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: </domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.655 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: <domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <domain>kvm</domain>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <arch>x86_64</arch>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <vcpu max='240'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <iothreads supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <os supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='firmware'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <loader supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>rom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pflash</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='readonly'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>yes</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='secure'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>no</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </loader>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </os>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='maximumMigratable'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>on</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>off</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <vendor>AMD</vendor>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='succor'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <mode name='custom' supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Denverton-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='auto-ibrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amd-psfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='stibp-always-on'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='EPYC-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-128'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-256'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx10-512'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='prefetchiti'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Haswell-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512er'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512pf'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fma4'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tbm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xop'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='amx-tile'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-bf16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-fp16'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bitalg'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrc'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fzrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='la57'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='taa-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xfd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ifma'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cmpccxadd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fbsdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='fsrs'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ibrs-all'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mcdt-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pbrsb-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='psdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='serialize'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vaes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='hle'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='rtm'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512bw'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512cd'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512dq'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512f'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='avx512vl'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='invpcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pcid'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='pku'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='mpx'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='core-capability'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='split-lock-detect'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='cldemote'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='erms'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='gfni'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdir64b'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='movdiri'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='xsaves'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='athlon-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='core2duo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='coreduo-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='n270-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='ss'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <blockers model='phenom-v1'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnow'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <feature name='3dnowext'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </blockers>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </mode>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </cpu>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <memoryBacking supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <enum name='sourceType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>anonymous</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <value>memfd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </memoryBacking>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <disk supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='diskDevice'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>disk</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cdrom</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>floppy</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>lun</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ide</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>fdc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>sata</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <graphics supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vnc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egl-headless</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </graphics>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <video supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='modelType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vga</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>cirrus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>none</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>bochs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ramfb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </video>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hostdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='mode'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>subsystem</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='startupPolicy'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>mandatory</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>requisite</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>optional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='subsysType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pci</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>scsi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='capsType'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='pciBackend'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hostdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <rng supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtio-non-transitional</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>random</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>egd</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </rng>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <filesystem supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='driverType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>path</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>handle</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>virtiofs</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </filesystem>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <tpm supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-tis</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tpm-crb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emulator</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>external</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendVersion'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>2.0</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </tpm>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <redirdev supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='bus'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>usb</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </redirdev>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <channel supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </channel>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <crypto supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendModel'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>builtin</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </crypto>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <interface supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='backendType'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>default</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>passt</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </interface>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <panic supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='model'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>isa</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>hyperv</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </panic>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <console supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='type'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>null</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vc</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pty</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dev</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>file</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>pipe</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stdio</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>udp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tcp</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>unix</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>qemu-vdagent</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>dbus</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </console>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </devices>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   <features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <gic supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <genid supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <backup supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <async-teardown supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <ps2 supported='yes'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sev supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <sgx supported='no'/>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <hyperv supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='features'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>relaxed</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vapic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>spinlocks</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vpindex</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>runtime</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>synic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>stimer</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reset</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>vendor_id</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>frequencies</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>reenlightenment</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tlbflush</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>ipi</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>avic</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>emsr_bitmap</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>xmm_input</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </defaults>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </hyperv>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     <launchSecurity supported='yes'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       <enum name='sectype'>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:         <value>tdx</value>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:       </enum>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:     </launchSecurity>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:   </features>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: </domainCapabilities>
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.713 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.714 279685 INFO nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Secure Boot support detected
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.716 279685 INFO nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.717 279685 INFO nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.732 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.761 279685 INFO nova.virt.node [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.781 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Verified node 35fead26-0bad-4950-b646-987079d58a17 matches my host np0005538513.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.820 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.825 279685 DEBUG nova.virt.libvirt.vif [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005538513.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T08:33:07Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.825 279685 DEBUG nova.network.os_vif_util [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.826 279685 DEBUG nova.network.os_vif_util [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.827 279685 DEBUG os_vif [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.902 279685 DEBUG ovsdbapp.backend.ovs_idl [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.902 279685 DEBUG ovsdbapp.backend.ovs_idl [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.902 279685 DEBUG ovsdbapp.backend.ovs_idl [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.904 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.905 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.924 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.924 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.924 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:45:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:27.926 279685 INFO oslo.privsep.daemon [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpboyt0avz/privsep.sock']
Nov 28 09:45:28 np0005538513.localdomain sudo[279930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqrvyisyrqhhzxhyoymvxmlesaoitzhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323128.093699-3934-69751480491290/AnsiballZ_podman_container.py
Nov 28 09:45:28 np0005538513.localdomain sudo[279930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.519 279685 INFO oslo.privsep.daemon [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.426 279933 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.431 279933 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.434 279933 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.434 279933 INFO oslo.privsep.daemon [-] privsep daemon running as pid 279933
Nov 28 09:45:28 np0005538513.localdomain python3.9[279932]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.777 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.777 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.777 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.778 279685 INFO os_vif [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.778 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.782 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.782 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.896 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.896 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.896 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.897 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:45:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:28.897 279685 DEBUG oslo_concurrency.processutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: Started libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope.
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: tmp-crun.aqJ26p.mount: Deactivated successfully.
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:45:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:29 np0005538513.localdomain podman[279962]: 2025-11-28 09:45:29.07530636 +0000 UTC m=+0.206430156 container init f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:45:29 np0005538513.localdomain podman[279962]: 2025-11-28 09:45:29.087671029 +0000 UTC m=+0.218794825 container start f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:45:29 np0005538513.localdomain python3.9[279932]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Applying nova statedir ownership
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 already 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/console.log
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/469bc4441baff9216df986857f9ff45dbf25965a8d2f755a6449ac2645cb7191
Nov 28 09:45:29 np0005538513.localdomain nova_compute_init[280019]: INFO:nova_statedir:Nova statedir ownership complete
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: libpod-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully.
Nov 28 09:45:29 np0005538513.localdomain podman[279977]: 2025-11-28 09:45:29.17159748 +0000 UTC m=+0.138888807 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:45:29 np0005538513.localdomain podman[280034]: 2025-11-28 09:45:29.223763719 +0000 UTC m=+0.060993410 container died f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:45:29 np0005538513.localdomain podman[279988]: 2025-11-28 09:45:29.224697778 +0000 UTC m=+0.190091686 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:45:29 np0005538513.localdomain podman[279977]: 2025-11-28 09:45:29.241451971 +0000 UTC m=+0.208743308 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:45:29 np0005538513.localdomain podman[279988]: 2025-11-28 09:45:29.305109612 +0000 UTC m=+0.270503560 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:45:29 np0005538513.localdomain podman[280034]: 2025-11-28 09:45:29.365858702 +0000 UTC m=+0.203088353 container cleanup f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully.
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.371 279685 DEBUG oslo_concurrency.processutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.440 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.440 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:45:29 np0005538513.localdomain sudo[279930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.664 279685 WARNING nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.667 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12110MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.667 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.668 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.840 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.841 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.841 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.901 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.919 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.919 279685 DEBUG nova.compute.provider_tree [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.932 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a-merged.mount: Deactivated successfully.
Nov 28 09:45:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967-userdata-shm.mount: Deactivated successfully.
Nov 28 09:45:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:29.971 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.014 279685 DEBUG oslo_concurrency.processutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.492 279685 DEBUG oslo_concurrency.processutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.499 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.500 279685 INFO nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] kernel doesn't support AMD SEV
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.502 279685 DEBUG nova.compute.provider_tree [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.503 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.527 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.555 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.556 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.556 279685 DEBUG nova.service [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 28 09:45:30 np0005538513.localdomain sshd[261517]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:45:30 np0005538513.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Nov 28 09:45:30 np0005538513.localdomain systemd[1]: session-59.scope: Consumed 1min 29.582s CPU time.
Nov 28 09:45:30 np0005538513.localdomain systemd-logind[764]: Session 59 logged out. Waiting for processes to exit.
Nov 28 09:45:30 np0005538513.localdomain systemd-logind[764]: Removed session 59.
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.585 279685 DEBUG nova.service [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 28 09:45:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:30.586 279685 DEBUG nova.servicegroup.drivers.db [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] DB_Driver: join new ServiceGroup member np0005538513.localdomain to the compute group, service = <Service: host=np0005538513.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 28 09:45:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:31.483 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:45:31 np0005538513.localdomain systemd[1]: tmp-crun.ruTr97.mount: Deactivated successfully.
Nov 28 09:45:31 np0005538513.localdomain podman[280122]: 2025-11-28 09:45:31.863313794 +0000 UTC m=+0.097948032 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:45:31 np0005538513.localdomain podman[280122]: 2025-11-28 09:45:31.903780133 +0000 UTC m=+0.138414381 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:45:31 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:45:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49116 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2931840000000001030307) 
Nov 28 09:45:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:32.942 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49117 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2935830000000001030307) 
Nov 28 09:45:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2457 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2937830000000001030307) 
Nov 28 09:45:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49118 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB293D830000000001030307) 
Nov 28 09:45:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7604 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2941820000000001030307) 
Nov 28 09:45:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:36.486 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:45:37 np0005538513.localdomain podman[280141]: 2025-11-28 09:45:37.869775841 +0000 UTC m=+0.104046399 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:45:37 np0005538513.localdomain podman[280141]: 2025-11-28 09:45:37.906543607 +0000 UTC m=+0.140814165 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:45:37 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:45:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:37.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49119 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB294D420000000001030307) 
Nov 28 09:45:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:45:39 np0005538513.localdomain podman[280162]: 2025-11-28 09:45:39.841763952 +0000 UTC m=+0.077152865 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:45:39 np0005538513.localdomain podman[280162]: 2025-11-28 09:45:39.877500107 +0000 UTC m=+0.112888970 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:39 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:45:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:45:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:45:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:45:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:45:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:45:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1"
Nov 28 09:45:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:41.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:43.012 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:46.523 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:45:47.611 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:45:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:45:47.612 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 09:45:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:47.647 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49120 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB296D820000000001030307) 
Nov 28 09:45:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:48.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:45:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:45:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:45:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:45:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:45:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:45:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:45:49 np0005538513.localdomain podman[280181]: 2025-11-28 09:45:49.844451604 +0000 UTC m=+0.081804708 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm)
Nov 28 09:45:49 np0005538513.localdomain podman[280181]: 2025-11-28 09:45:49.881059585 +0000 UTC m=+0.118412709 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:45:49 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:45:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:45:50.824 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:45:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:45:50.824 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:45:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:45:50.825 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:45:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:51.525 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:45:52.614 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:45:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:53.017 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:45:55 np0005538513.localdomain systemd[1]: tmp-crun.CqC34Q.mount: Deactivated successfully.
Nov 28 09:45:55 np0005538513.localdomain podman[280199]: 2025-11-28 09:45:55.856682457 +0000 UTC m=+0.090565306 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:45:55 np0005538513.localdomain podman[280199]: 2025-11-28 09:45:55.864429054 +0000 UTC m=+0.098311853 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:45:55 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:45:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:56.530 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:45:58.053 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:45:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:45:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:45:59 np0005538513.localdomain podman[280224]: 2025-11-28 09:45:59.848895758 +0000 UTC m=+0.080584541 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:59 np0005538513.localdomain podman[280224]: 2025-11-28 09:45:59.857328506 +0000 UTC m=+0.089017299 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:45:59 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:45:59 np0005538513.localdomain podman[280223]: 2025-11-28 09:45:59.904238074 +0000 UTC m=+0.140737104 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:45:59 np0005538513.localdomain podman[280223]: 2025-11-28 09:45:59.945533878 +0000 UTC m=+0.182032908 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:45:59 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.671 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.713 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.714 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69795790-826f-4bef-a74b-eb3f7175364c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.672169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c237468-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '8b7ef05c8063d77fe61d50f6e9481e9000bd035efd5656c63336dbfb83da9490'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.672169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c238872-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '78d95ddc81d06d2747dc15175db1bbc24a6e6572e9ff8709400fbcbcc8a06e2c'}]}, 'timestamp': '2025-11-28 09:46:00.714879', '_unique_id': '0ce8be25a6fb40f099f3332d0a798b57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.721 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8daddac-88b0-4d90-8a22-bf55f6417c18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.717926', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c24a310-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '523bded1def110a37da60e221beb7ef334eab4dc793805692c44ea470b88f468'}]}, 'timestamp': '2025-11-28 09:46:00.722172', '_unique_id': '204814ff4323457ca860467356b9698f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.724 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 96 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86b838e6-9c7d-44d2-a208-3495cb2c814e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 96, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.724349', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c250bca-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '0b097dad012fbeeb49350ed6ee606448c1c13c2e8f5eb0b18a606ca545049281'}]}, 'timestamp': '2025-11-28 09:46:00.724844', '_unique_id': '73d1b305818f4770930368a08e11cd33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 13176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '135cf464-05ba-401a-b896-e83f20a13754', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13176, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.726977', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2573ee-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '01bc09f7c39a44f7d3d5b7b05fbc69788a8962f7ad8590931e334806d236a0df'}]}, 'timestamp': '2025-11-28 09:46:00.727477', '_unique_id': '5998fc29cbe04b2a89e71a2743c15f4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.729 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91e6f2da-48d7-465c-942b-556c4834ebd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.729586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c27cf40-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '386a3c2fd828ea846f4ad3e78f703b73fffc8b31bdfce2e06436586655f36e42'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.729586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c27e516-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': 'ead8d58a284c557a3d05df491689badecf643c8bbf2f22ca20258bfb3770a485'}]}, 'timestamp': '2025-11-28 09:46:00.743476', '_unique_id': '29b44f8bbcb64680987a3d56a4a1f0b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.745 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.745 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc69c382-3319-4ce6-9aa0-58968821ea84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.745708', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c284eb6-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '3258451ffdb1ccb82f61e4b50c6ec0b7658c7b9680113c8178273157093f70c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.745708', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c2860ae-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '2d14517fd5281626c14e578e009e6b2a458f93b86179f9ce674b8aa61f803360'}]}, 'timestamp': '2025-11-28 09:46:00.746616', '_unique_id': '431e20af5f0e492988b5d901faa69230'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.748 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.748 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f7dff75-0abd-4047-8337-d18e48d3d201', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.748762', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c28c558-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': 'd6b660a35bb0126b13d9b57b3e2833700d4cb51855de781a456ae36583eb8044'}]}, 'timestamp': '2025-11-28 09:46:00.749254', '_unique_id': '9d539671901a44ba90bfd0f7c8548da1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.751 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.751 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.751 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c7c9666-8c20-44f1-a894-e68a81711013', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.751330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c292944-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'ef9311d13208d97796d31b5547c6582d009cb9ef84cd04c3646af0abd8a316e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.751330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c29395c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '9c01dff6c6fc517c826e15727d864b6e62049c94194aee090e98ce30de956210'}]}, 'timestamp': '2025-11-28 09:46:00.752197', '_unique_id': '18da2b7e742e4610ae49c17ce8d0f546'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.754 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.754 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c78dc005-4d73-465b-b394-117f56e550a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.754342', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c299f1e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '90ba47ec2fcb6bb9494767464a50dbf3eb8a2cbe18a14cb22b74c3e761dea17b'}]}, 'timestamp': '2025-11-28 09:46:00.754796', '_unique_id': '739e6e622fe84f8ca736e22a404b122c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.756 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 434 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a1bd1b9-4f7f-4b23-acbb-e63eb9bfe398', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 434, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.757051', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2a09ae-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '085cf82ef6fddfb905e509bc13f0c60047366c3b89938156e9f72aa2742e05df'}]}, 'timestamp': '2025-11-28 09:46:00.757529', '_unique_id': '77e2696f88d74e49aaa5bfb27cc6322f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7c66d8-cabd-4d9d-83ce-7c5ccc404daa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.759592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c2a6bec-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '321b2f59f821755c10aed941ca6d150215d1def685e8f3e9b00ddd0fb03707c5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.759592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c2a7d62-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '56c337727df1cf2d47d45e12e61f2ee77524135afd3810e9753c9db6330adf8f'}]}, 'timestamp': '2025-11-28 09:46:00.760461', '_unique_id': '7873be90d23b40b2b8b64f064b903aaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '138178a3-31e3-44f8-8bd3-0ada4a9f7d3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.762603', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2ae1e4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '43ed3b5583aa0630e8e15ff57c1e60bc64e30d37eab8101126cb225968767ab6'}]}, 'timestamp': '2025-11-28 09:46:00.763104', '_unique_id': '01aab747f41d4fc3928edd4add67a1ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 10055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cfc7ae2-9b36-44fe-8586-ca26c39987c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10055, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.765297', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2b4b0c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': 'e793db2d08de9654940dd4947743122ec9dc2f078207c8e20eae19054dae3f95'}]}, 'timestamp': '2025-11-28 09:46:00.765753', '_unique_id': 'ee33acd72e534df99b03bc81a52ae093'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfbaeb64-9e2e-44ad-ad5a-af01ddb2ad42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 149, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.767822', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2baed0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '3b63c6e0de31a770bd415a42c4f26756d21758aa801d4f95c91bdba5d8bf4c86'}]}, 'timestamp': '2025-11-28 09:46:00.768310', '_unique_id': '44c0e3ce270f4dbb82720bbc4a3ec6d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc9dced6-5aaa-44e0-9187-e911f856fd82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.770410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c2c12e4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '12faf3e2fefc2de52e586590577ae527febaae7df5b9a199bebd5ddb92f791cc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.770410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c2c2414-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': 'e151b9572d2b55bad3e0bcaf00ed0a0b0b56f5047e94f5281708e4de1b351a5e'}]}, 'timestamp': '2025-11-28 09:46:00.771283', '_unique_id': '50f8c5ef8beb4d0187772979fe9e1a9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 51860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15209a9c-d1dd-475f-b66c-c99748c0cff1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51860000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:46:00.773457', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0c2ff616-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.96762273, 'message_signature': '0fc68ce59dd98b4d64c4e98689528022d6a466d5630b0ada185cd02cbfdff1e8'}]}, 'timestamp': '2025-11-28 09:46:00.796337', '_unique_id': '893573b226c44ee5877cfb6f66343a35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '352549ca-b39a-47a1-b588-077b2998cafe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.798691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c306358-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '203466ffe0361555e440b291c3eb28d23170ddf4e7c2fc7fe419273657149c0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.798691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c30762c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '63bff0a7175da3fd27af2c25e3cc9275eb9caf1dc5d95674e6d95881baf9c312'}]}, 'timestamp': '2025-11-28 09:46:00.799596', '_unique_id': '643908a4add3432aa4f948515d1457df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c11e75d3-0419-4a2c-bb6e-3251deb091d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:46:00.801750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0c30dacc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.96762273, 'message_signature': '0206c2771afd58b28758b00ac77c2fc5e47beeac372a012b0fa44549025ea87e'}]}, 'timestamp': '2025-11-28 09:46:00.802213', '_unique_id': '35fef72b6feb4fe1ae76afea5fa17f11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3152f87a-bba1-4041-adce-ff07282fbdb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.804561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c31499e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'b0fe6014b30681eb1274db46ed7f63d01c94f0b6877166cea8645953e2d89715'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.804561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c315b46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'bb97819670396f47e9cf4256d43bcb6e56978762efaac2e0d10c5619d7b2df5c'}]}, 'timestamp': '2025-11-28 09:46:00.805461', '_unique_id': 'a454df7b4c884cde883753e710a32300'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.808 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4cab442-3178-471e-ab10-31d1d80bcc27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.807770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c31c61c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '4e476332997d5399ad1a893f11b455ae6553d57ce26c81ca916d882575d49f6b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.807770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c31d79c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'e0a7338b9a9a234e642b96d422c18d37a62ae589975bf5932bc858b69a590e34'}]}, 'timestamp': '2025-11-28 09:46:00.808642', '_unique_id': 'b77a29afbf7d437b83876e12d67f66bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.810 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.810 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 434 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ab9d566-b352-4a24-8c62-5cd93c69ad65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 434, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.810917', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c32425e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '570ce8a0f62c3048d0663cd2c42eeae1829ce851dacf64f3c758c2e244d2812e'}]}, 'timestamp': '2025-11-28 09:46:00.811407', '_unique_id': 'e07ff4235af844439981cf48b30cd636'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:46:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.814 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:01.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52243 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29A6B40000000001030307) 
Nov 28 09:46:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:46:02 np0005538513.localdomain podman[280264]: 2025-11-28 09:46:02.85486874 +0000 UTC m=+0.082971553 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:46:02 np0005538513.localdomain podman[280264]: 2025-11-28 09:46:02.870569471 +0000 UTC m=+0.098672244 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:46:02 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:46:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:03.145 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52244 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29AAC20000000001030307) 
Nov 28 09:46:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:03.898 279685 DEBUG nova.compute.manager [None req-0ea6a6a1-b9ca-4efa-989e-4c0fd01ead00 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:03.904 279685 INFO nova.compute.manager [None req-0ea6a6a1-b9ca-4efa-989e-4c0fd01ead00 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Retrieving diagnostics
Nov 28 09:46:04 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49121 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29AD830000000001030307) 
Nov 28 09:46:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52245 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29B2C20000000001030307) 
Nov 28 09:46:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2458 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29B5820000000001030307) 
Nov 28 09:46:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:06.569 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:08.185 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:46:08 np0005538513.localdomain podman[280284]: 2025-11-28 09:46:08.858650814 +0000 UTC m=+0.087572234 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:46:08 np0005538513.localdomain podman[280284]: 2025-11-28 09:46:08.895963077 +0000 UTC m=+0.124884497 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:46:08 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:46:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52246 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29C2820000000001030307) 
Nov 28 09:46:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:09.551 279685 DEBUG oslo_concurrency.lockutils [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:09.552 279685 DEBUG oslo_concurrency.lockutils [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:09.552 279685 DEBUG nova.compute.manager [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:09.556 279685 DEBUG nova.compute.manager [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Nov 28 09:46:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:09.561 279685 DEBUG nova.objects.instance [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'flavor' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:09.610 279685 DEBUG nova.virt.libvirt.driver [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 28 09:46:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:46:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:46:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:46:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 28 09:46:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:46:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1"
Nov 28 09:46:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:46:10 np0005538513.localdomain podman[280307]: 2025-11-28 09:46:10.845769479 +0000 UTC m=+0.084023096 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:46:10 np0005538513.localdomain podman[280307]: 2025-11-28 09:46:10.881964358 +0000 UTC m=+0.120217985 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 28 09:46:10 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:46:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:11.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain kernel: device tap09612b07-51 left promiscuous mode
Nov 28 09:46:12 np0005538513.localdomain NetworkManager[5967]: <info>  [1764323172.1172] device (tap09612b07-51): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.131 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00049|binding|INFO|Releasing lport 09612b07-5142-4b0f-9dab-74bf4403f69f from this chassis (sb_readonly=0)
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00050|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f down in Southbound
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00051|binding|INFO|Removing iface tap09612b07-51 ovn-installed in OVS
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.134 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.146 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:fc:6c 192.168.0.142'], port_security=['fa:16:3e:f4:fc:6c 192.168.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.142/24', 'neutron:device_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005538513.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '7', 'neutron:security_group_ids': '6d2c5a31-c9e5-413a-bccf-f97c7687bd94 b3c60f08-3369-426b-b744-9cef04caaa7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3122580-f73f-40fa-a838-6bad2ff9da2f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=09612b07-5142-4b0f-9dab-74bf4403f69f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.148 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.148 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 09612b07-5142-4b0f-9dab-74bf4403f69f in datapath 40d5da59-6201-424a-8380-80ecc3d67c7e unbound from our chassis
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.150 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 40d5da59-6201-424a-8380-80ecc3d67c7e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-07900d-0
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-c3237d-0
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00054|ovn_bfd|INFO|Disabled BFD on interface ovn-11aa47-0
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00055|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.154 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 28 09:46:12 np0005538513.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 37.832s CPU time.
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.162 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.163 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4820d0d9-7e2f-41c0-b3ba-1697627cf918]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.167 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e namespace which is not needed anymore
Nov 28 09:46:12 np0005538513.localdomain systemd-machined[83422]: Machine qemu-1-instance-00000002 terminated.
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.197 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:12Z|00056|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.207 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain systemd[1]: libpod-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef.scope: Deactivated successfully.
Nov 28 09:46:12 np0005538513.localdomain podman[280352]: 2025-11-28 09:46:12.379196623 +0000 UTC m=+0.082845899 container died 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Nov 28 09:46:12 np0005538513.localdomain podman[280352]: 2025-11-28 09:46:12.567299387 +0000 UTC m=+0.270948603 container cleanup 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:46:12 np0005538513.localdomain podman[280374]: 2025-11-28 09:46:12.581625485 +0000 UTC m=+0.188969411 container cleanup 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, io.openshift.expose-services=)
Nov 28 09:46:12 np0005538513.localdomain systemd[1]: libpod-conmon-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef.scope: Deactivated successfully.
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.631 279685 INFO nova.virt.libvirt.driver [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance shutdown successfully after 3 seconds.
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.641 279685 INFO nova.virt.libvirt.driver [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance destroyed successfully.
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.641 279685 DEBUG nova.objects.instance [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'numa_topology' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.672 279685 DEBUG nova.compute.manager [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:12 np0005538513.localdomain podman[280393]: 2025-11-28 09:46:12.675165991 +0000 UTC m=+0.084346685 container remove 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, release=1761123044, distribution-scope=public, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.680 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c14db70d-78af-48a1-84fc-62e817cae636]: (4, ('Fri Nov 28 09:46:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e (9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef)\n9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef\nFri Nov 28 09:46:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e (9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef)\n9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.682 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[21a9fbfe-2952-42ee-bb21-3c7e39aa4540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.683 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40d5da59-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:12 np0005538513.localdomain kernel: device tap40d5da59-60 left promiscuous mode
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.731 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.735 279685 DEBUG nova.compute.manager [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-unplugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.736 279685 DEBUG oslo_concurrency.lockutils [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.736 279685 DEBUG oslo_concurrency.lockutils [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.737 279685 DEBUG oslo_concurrency.lockutils [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.737 279685 DEBUG nova.compute.manager [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-unplugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.738 279685 WARNING nova.compute.manager [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-unplugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state active and task_state powering-off.
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.743 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.749 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[15198271-8fdd-47bf-ab40-06127765ab2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.766 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bc040bb8-8af1-4cf2-88be-381b34075c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.768 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4cfc5912-5099-45ab-90ca-ae40740bb8b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.784 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[730baa0a-cdf1-4d49-8382-e17d4c69f412]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662040, 'reachable_time': 20124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280412, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.794 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 09:46:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:12.795 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[b58c1bb3-d57d-4548-9296-dd6fddf99439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:12.804 279685 DEBUG oslo_concurrency.lockutils [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:13.187 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b264a93705d5a28ba8f902d268499c1bea32890d992fb54a7c6890490d1eeb3f-merged.mount: Deactivated successfully.
Nov 28 09:46:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef-userdata-shm.mount: Deactivated successfully.
Nov 28 09:46:13 np0005538513.localdomain systemd[1]: run-netns-ovnmeta\x2d40d5da59\x2d6201\x2d424a\x2d8380\x2d80ecc3d67c7e.mount: Deactivated successfully.
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.588 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.617 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.618 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.619 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.619 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.652 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.758 279685 DEBUG nova.compute.manager [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.758 279685 DEBUG oslo_concurrency.lockutils [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.759 279685 DEBUG oslo_concurrency.lockutils [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.759 279685 DEBUG oslo_concurrency.lockutils [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.760 279685 DEBUG nova.compute.manager [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 09:46:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:14.760 279685 WARNING nova.compute.manager [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state stopped and task_state None.
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.009 279685 DEBUG nova.compute.manager [None req-1366b4cf-2d04-4af5-9fd8-70514cd5e7d7 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server [None req-1366b4cf-2d04-4af5-9fd8-70514cd5e7d7 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     raise self.value
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     raise self.value
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server 
Nov 28 09:46:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:16.606 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52247 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29E3820000000001030307) 
Nov 28 09:46:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:46:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:46:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:46:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:46:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:46:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:18.189 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:46:20 np0005538513.localdomain podman[280414]: 2025-11-28 09:46:20.850282336 +0000 UTC m=+0.086423149 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public)
Nov 28 09:46:20 np0005538513.localdomain podman[280414]: 2025-11-28 09:46:20.864379618 +0000 UTC m=+0.100520431 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 28 09:46:20 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:46:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:21.644 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:23.191 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:26.645 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:46:26 np0005538513.localdomain sudo[280434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:46:26 np0005538513.localdomain sudo[280434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:46:26 np0005538513.localdomain sudo[280434]: pam_unix(sudo:session): session closed for user root
Nov 28 09:46:26 np0005538513.localdomain podman[280440]: 2025-11-28 09:46:26.852283856 +0000 UTC m=+0.090404781 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:46:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:26.854 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:26.855 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:26.855 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:46:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:26.856 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:46:26 np0005538513.localdomain podman[280440]: 2025-11-28 09:46:26.86644893 +0000 UTC m=+0.104569895 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:46:26 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:46:26 np0005538513.localdomain sudo[280466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:46:26 np0005538513.localdomain sudo[280466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.362 279685 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764323172.361148, c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.363 279685 INFO nova.compute.manager [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] VM Stopped (Lifecycle Event)
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.388 279685 DEBUG nova.compute.manager [None req-c13de2fd-2c13-401e-8178-2bdaf5d565fa - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.393 279685 DEBUG nova.compute.manager [None req-c13de2fd-2c13-401e-8178-2bdaf5d565fa - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 09:46:27 np0005538513.localdomain sudo[280466]: pam_unix(sudo:session): session closed for user root
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.724 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.724 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.725 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:46:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:27.725 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:28 np0005538513.localdomain sudo[280524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:46:28 np0005538513.localdomain sudo[280524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:46:28 np0005538513.localdomain sudo[280524]: pam_unix(sudo:session): session closed for user root
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.244 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.278 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.278 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.279 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.280 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.280 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.280 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.281 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.282 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.282 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.283 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.301 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.302 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.302 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.303 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.303 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.811 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.896 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:46:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:28.897 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.105 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.108 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12568MB free_disk=41.83693313598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.108 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.109 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.202 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.202 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.203 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.246 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.763 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.771 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.793 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.818 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:46:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:29.819 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:46:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:46:30 np0005538513.localdomain podman[280586]: 2025-11-28 09:46:30.856244856 +0000 UTC m=+0.085370476 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:46:30 np0005538513.localdomain systemd[1]: tmp-crun.yJrWyr.mount: Deactivated successfully.
Nov 28 09:46:30 np0005538513.localdomain podman[280587]: 2025-11-28 09:46:30.927230572 +0000 UTC m=+0.156671372 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:46:30 np0005538513.localdomain podman[280587]: 2025-11-28 09:46:30.962327527 +0000 UTC m=+0.191768317 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 28 09:46:30 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:46:31 np0005538513.localdomain podman[280586]: 2025-11-28 09:46:31.012639478 +0000 UTC m=+0.241765088 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:46:31 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:46:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:31.648 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49395 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A1BE40000000001030307) 
Nov 28 09:46:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:33.216 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49396 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A20020000000001030307) 
Nov 28 09:46:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:46:33 np0005538513.localdomain podman[280630]: 2025-11-28 09:46:33.850912382 +0000 UTC m=+0.086450830 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:46:33 np0005538513.localdomain podman[280630]: 2025-11-28 09:46:33.888161894 +0000 UTC m=+0.123700312 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:46:33 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:46:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52248 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A23820000000001030307) 
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.165 279685 DEBUG nova.compute.manager [None req-071dfc59-4eda-4122-96c1-bdcf9d99ded5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server [None req-071dfc59-4eda-4122-96c1-bdcf9d99ded5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     raise self.value
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     raise self.value
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 28 09:46:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server 
Nov 28 09:46:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49397 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A28020000000001030307) 
Nov 28 09:46:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49122 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A2B820000000001030307) 
Nov 28 09:46:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:36.650 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:38.219 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49398 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A37C20000000001030307) 
Nov 28 09:46:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:46:39 np0005538513.localdomain systemd[1]: tmp-crun.RsYKlQ.mount: Deactivated successfully.
Nov 28 09:46:39 np0005538513.localdomain podman[280649]: 2025-11-28 09:46:39.858802854 +0000 UTC m=+0.092142184 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:46:39 np0005538513.localdomain podman[280649]: 2025-11-28 09:46:39.870341927 +0000 UTC m=+0.103681247 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:46:39 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:46:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:46:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:46:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:46:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146336 "" "Go-http-client/1.1"
Nov 28 09:46:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:46:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16747 "" "Go-http-client/1.1"
Nov 28 09:46:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:41.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:46:41 np0005538513.localdomain podman[280672]: 2025-11-28 09:46:41.840200353 +0000 UTC m=+0.079293320 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:46:41 np0005538513.localdomain podman[280672]: 2025-11-28 09:46:41.853420569 +0000 UTC m=+0.092513536 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:46:41 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:46:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:41.882 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'flavor' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:41.907 279685 DEBUG oslo_concurrency.lockutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:46:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:41.908 279685 DEBUG oslo_concurrency.lockutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:46:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:41.908 279685 DEBUG nova.network.neutron [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 09:46:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:41.908 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:42 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:42Z|00057|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.467 279685 DEBUG nova.network.neutron [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.492 279685 DEBUG oslo_concurrency.lockutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.531 279685 INFO nova.virt.libvirt.driver [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance destroyed successfully.
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.531 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'numa_topology' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.548 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'resources' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.562 279685 DEBUG nova.virt.libvirt.vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256
='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T09:46:12Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.563 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.564 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.565 279685 DEBUG os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.568 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09612b07-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.570 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.571 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.574 279685 INFO os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.577 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.578 279685 INFO nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] UEFI support detected
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.586 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Start _get_guest_xml network_info=[{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=391767f1-35f2-4b68-ae15-e0b29db66dcb,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'image_id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}], 'ephemerals': [{'encryption_format': None, 'size': 1, 'device_name': '/dev/vdb', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.591 279685 WARNING nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.593 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.594 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.595 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.596 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.597 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.597 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T08:32:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='f3c44237-060e-4213-a926-aa7fdb4bf902',id=2,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=391767f1-35f2-4b68-ae15-e0b29db66dcb,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.598 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.599 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.599 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.600 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.600 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.600 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.601 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.601 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.601 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.602 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.602 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.629 279685 DEBUG nova.privsep.utils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 28 09:46:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:42.630 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.110 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.113 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.518 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.521 279685 DEBUG nova.virt.libvirt.vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256
='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T09:46:12Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.522 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.523 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.525 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.546 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] End _get_guest_xml xml=<domain type="kvm">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <uuid>c2f0c7d6-df5f-4541-8b2c-bc1eaf805812</uuid>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <name>instance-00000002</name>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <memory>524288</memory>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <vcpu>1</vcpu>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <metadata>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <nova:name>test</nova:name>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <nova:creationTime>2025-11-28 09:46:42</nova:creationTime>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <nova:flavor name="m1.small">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:memory>512</nova:memory>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:disk>1</nova:disk>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:swap>0</nova:swap>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:ephemeral>1</nova:ephemeral>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:vcpus>1</nova:vcpus>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       </nova:flavor>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <nova:owner>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:user uuid="4d9169247d4447d0a8dd4c33f8b23dee">admin</nova:user>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:project uuid="9dda653c53224db086060962b0702694">admin</nova:project>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       </nova:owner>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <nova:root type="image" uuid="391767f1-35f2-4b68-ae15-e0b29db66dcb"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <nova:ports>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <nova:port uuid="09612b07-5142-4b0f-9dab-74bf4403f69f">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:           <nova:ip type="fixed" address="192.168.0.142" ipVersion="4"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         </nova:port>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       </nova:ports>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </nova:instance>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   </metadata>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <sysinfo type="smbios">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <system>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <entry name="manufacturer">RDO</entry>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <entry name="product">OpenStack Compute</entry>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <entry name="serial">c2f0c7d6-df5f-4541-8b2c-bc1eaf805812</entry>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <entry name="uuid">c2f0c7d6-df5f-4541-8b2c-bc1eaf805812</entry>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <entry name="family">Virtual Machine</entry>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </system>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   </sysinfo>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <os>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <type arch="x86_64" machine="pc-q35-rhel9.0.0">hvm</type>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <boot dev="hd"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <smbios mode="sysinfo"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   </os>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <features>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <acpi/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <apic/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <vmcoreinfo/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   </features>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <clock offset="utc">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <timer name="hpet" present="no"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   </clock>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <cpu mode="host-model" match="exact">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   </cpu>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   <devices>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <disk type="network" device="disk">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <driver type="raw" cache="none"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <source protocol="rbd" name="vms/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812_disk">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.103" port="6789"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.105" port="6789"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.104" port="6789"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       </source>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <auth username="openstack">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       </auth>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <target dev="vda" bus="virtio"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <disk type="network" device="disk">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <driver type="raw" cache="none"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <source protocol="rbd" name="vms/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812_disk.eph0">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.103" port="6789"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.105" port="6789"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.104" port="6789"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       </source>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <auth username="openstack">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       </auth>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <target dev="vdb" bus="virtio"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <interface type="ethernet">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <mac address="fa:16:3e:f4:fc:6c"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <model type="virtio"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <mtu size="1292"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <target dev="tap09612b07-51"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </interface>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <serial type="pty">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <log file="/var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/console.log" append="off"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </serial>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <video>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <model type="virtio"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </video>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <input type="tablet" bus="usb"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <input type="keyboard" bus="usb"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <rng model="virtio">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <backend model="random">/dev/urandom</backend>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </rng>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <controller type="usb" index="0"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     <memballoon model="virtio">
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:       <stats period="10"/>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:     </memballoon>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:   </devices>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: </domain>
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.549 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.550 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.551 279685 DEBUG nova.virt.libvirt.vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T09:46:12Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.552 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.553 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.553 279685 DEBUG os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.554 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.555 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.556 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.560 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.560 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.561 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.571 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.573 279685 INFO os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')
Nov 28 09:46:43 np0005538513.localdomain systemd[1]: Started libvirt secret daemon.
Nov 28 09:46:43 np0005538513.localdomain kernel: device tap09612b07-51 entered promiscuous mode
Nov 28 09:46:43 np0005538513.localdomain NetworkManager[5967]: <info>  [1764323203.6912] manager: (tap09612b07-51): new Tun device (/org/freedesktop/NetworkManager/Devices/15)
Nov 28 09:46:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:43Z|00058|binding|INFO|Claiming lport 09612b07-5142-4b0f-9dab-74bf4403f69f for this chassis.
Nov 28 09:46:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:43Z|00059|binding|INFO|09612b07-5142-4b0f-9dab-74bf4403f69f: Claiming fa:16:3e:f4:fc:6c 192.168.0.142
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.691 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.698 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain systemd-udevd[280765]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.701 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain NetworkManager[5967]: <info>  [1764323203.7160] device (tap09612b07-51): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 09:46:43 np0005538513.localdomain NetworkManager[5967]: <info>  [1764323203.7172] device (tap09612b07-51): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.717 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:fc:6c 192.168.0.142'], port_security=['fa:16:3e:f4:fc:6c 192.168.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.142/24', 'neutron:device_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6d2c5a31-c9e5-413a-bccf-f97c7687bd94 b3c60f08-3369-426b-b744-9cef04caaa7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3122580-f73f-40fa-a838-6bad2ff9da2f, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=09612b07-5142-4b0f-9dab-74bf4403f69f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.720 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 09612b07-5142-4b0f-9dab-74bf4403f69f in datapath 40d5da59-6201-424a-8380-80ecc3d67c7e bound to our chassis
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.722 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 40d5da59-6201-424a-8380-80ecc3d67c7e
Nov 28 09:46:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:43Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-07900d-0
Nov 28 09:46:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:43Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-c3237d-0
Nov 28 09:46:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:43Z|00062|ovn_bfd|INFO|Enabled BFD on interface ovn-11aa47-0
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.731 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.733 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ef84b72a-f9ed-4115-b95c-d071518ce55b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.733 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap40d5da59-61 in ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.735 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap40d5da59-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.736 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[53e62a2d-e84b-44e6-bed7-dcefa7cbafb5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.737 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d0c1a9-b0f6-447d-a13b-2ce0dff37065]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain systemd-machined[83422]: New machine qemu-2-instance-00000002.
Nov 28 09:46:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:43Z|00063|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f up in Southbound
Nov 28 09:46:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:43Z|00064|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f ovn-installed in OVS
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.762 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[047b5321-ea3d-4ef2-9fb0-2dad45cd3636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.777 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6b9de73e-6104-463b-8dff-ae0bda8fcf0c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.809 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.817 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9b13d9-5198-4ab7-8f36-f83fe2ae8cf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain systemd-udevd[280767]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.824 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bccbb675-c9f7-4aa2-862b-73318ccb8894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain NetworkManager[5967]: <info>  [1764323203.8261] manager: (tap40d5da59-60): new Veth device (/org/freedesktop/NetworkManager/Devices/16)
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.859 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[01b8fbfb-6a0d-48dd-bce5-3f959c135eb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:43.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.862 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[6277a64c-c063-4230-b8ff-384d81f75d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain NetworkManager[5967]: <info>  [1764323203.8848] device (tap40d5da59-60): carrier: link connected
Nov 28 09:46:43 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-61: link becomes ready
Nov 28 09:46:43 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-60: link becomes ready
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.889 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e09e5d-3270-4ff7-89ca-eb00f68816b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.903 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d2dc5514-10e3-4a8d-97d2-f63b3bbed7a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40d5da59-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:28:4d:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1103798, 'reachable_time': 31313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280804, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.914 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[369b69d3-031b-4f39-b076-a34e68ed6559]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:4d05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1103798, 'tstamp': 1103798}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280813, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain snmpd[66832]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB.
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.933 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b71b4f94-f1ab-4f3e-99e0-91629c354f09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40d5da59-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:28:4d:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1103798, 'reachable_time': 31313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280822, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:43.958 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[73c3e1d8-331b-4b95-b611-ab1814750803]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.009 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4e1026cb-9a3e-45a4-861b-c5d44395f604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.011 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40d5da59-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.011 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.011 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40d5da59-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.064 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:44 np0005538513.localdomain kernel: device tap40d5da59-60 entered promiscuous mode
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.065 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.070 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap40d5da59-60, col_values=(('external_ids', {'iface-id': '3ff57c88-06c6-4894-984a-80ce116d1456'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.071 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:44Z|00065|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.072 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.074 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.076 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[eb616e3d-b1fb-4d16-8059-556f4421a8ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.077 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.078 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: global
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     log         /dev/log local0 debug
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     log-tag     haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     user        root
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     group       root
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     maxconn     1024
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     pidfile     /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     daemon
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: defaults
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     log global
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     mode http
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     option httplog
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     option dontlognull
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     option http-server-close
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     option forwardfor
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     retries                 3
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-request    30s
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout connect         30s
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout client          32s
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout server          32s
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-keep-alive 30s
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: listen listener
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     bind 169.254.169.254:80
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:     http-request add-header X-OVN-Network-ID 40d5da59-6201-424a-8380-80ecc3d67c7e
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 09:46:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:44.079 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'env', 'PROCESS_TAG=haproxy-40d5da59-6201-424a-8380-80ecc3d67c7e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.145 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764323204.1455128, c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.146 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] VM Resumed (Lifecycle Event)
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.150 279685 DEBUG nova.compute.manager [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.165 279685 INFO nova.virt.libvirt.driver [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance rebooted successfully.
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.165 279685 DEBUG nova.compute.manager [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.170 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.173 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.201 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] During sync_power_state the instance has a pending task (powering-on). Skip.
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.201 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764323204.146727, c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.201 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] VM Started (Lifecycle Event)
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.218 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.221 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 09:46:44 np0005538513.localdomain podman[280882]: 
Nov 28 09:46:44 np0005538513.localdomain podman[280882]: 2025-11-28 09:46:44.494893253 +0000 UTC m=+0.078929519 container create 5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 09:46:44 np0005538513.localdomain podman[280882]: 2025-11-28 09:46:44.446603983 +0000 UTC m=+0.030640259 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:46:44 np0005538513.localdomain systemd[1]: Started libpod-conmon-5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264.scope.
Nov 28 09:46:44 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:46:44 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eda2c7888720a1ed258ad38c3fe5daf04876572a87ed50f81d219a10f47fcb7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:46:44 np0005538513.localdomain podman[280882]: 2025-11-28 09:46:44.611720183 +0000 UTC m=+0.195756439 container init 5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:46:44 np0005538513.localdomain podman[280882]: 2025-11-28 09:46:44.624647908 +0000 UTC m=+0.208684174 container start 5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:46:44 np0005538513.localdomain neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e[280896]: [NOTICE]   (280900) : New worker (280902) forked
Nov 28 09:46:44 np0005538513.localdomain neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e[280896]: [NOTICE]   (280900) : Loading success.
Nov 28 09:46:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:44Z|00066|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.642 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.801 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:44Z|00067|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:46:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:44Z|00068|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:46:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:44.809 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:45.591 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:45Z|00069|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 09:46:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:46.100 279685 DEBUG nova.compute.manager [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 09:46:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:46.101 279685 DEBUG oslo_concurrency.lockutils [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:46.101 279685 DEBUG oslo_concurrency.lockutils [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:46.102 279685 DEBUG oslo_concurrency.lockutils [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:46.102 279685 DEBUG nova.compute.manager [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 09:46:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:46.103 279685 WARNING nova.compute.manager [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state active and task_state None.
Nov 28 09:46:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:46.685 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49399 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A57830000000001030307) 
Nov 28 09:46:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:46:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:46:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:46:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:46:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:46:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:46:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:48.150 279685 DEBUG nova.compute.manager [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 09:46:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:48.151 279685 DEBUG oslo_concurrency.lockutils [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:48.152 279685 DEBUG oslo_concurrency.lockutils [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:48.152 279685 DEBUG oslo_concurrency.lockutils [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:48.153 279685 DEBUG nova.compute.manager [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 09:46:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:48.153 279685 WARNING nova.compute.manager [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state active and task_state None.
Nov 28 09:46:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:48.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:50.825 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:50.826 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:46:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:51.728 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:46:51 np0005538513.localdomain podman[280911]: 2025-11-28 09:46:51.864862847 +0000 UTC m=+0.092807434 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:46:51 np0005538513.localdomain podman[280911]: 2025-11-28 09:46:51.882355373 +0000 UTC m=+0.110299940 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6)
Nov 28 09:46:51 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:46:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:53.568 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:56.732 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:46:57 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:46:57Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:fc:6c 192.168.0.142
Nov 28 09:46:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:46:57 np0005538513.localdomain systemd[1]: tmp-crun.MqFHkw.mount: Deactivated successfully.
Nov 28 09:46:57 np0005538513.localdomain podman[280930]: 2025-11-28 09:46:57.848881157 +0000 UTC m=+0.084535281 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:46:57 np0005538513.localdomain podman[280930]: 2025-11-28 09:46:57.862786083 +0000 UTC m=+0.098440207 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:46:57 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:46:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 09:46:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 09:46:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:46:58.571 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:01.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:47:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:47:01 np0005538513.localdomain podman[280954]: 2025-11-28 09:47:01.896051291 +0000 UTC m=+0.089940937 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:47:01 np0005538513.localdomain podman[280954]: 2025-11-28 09:47:01.90480463 +0000 UTC m=+0.098694246 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:47:01 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:47:02 np0005538513.localdomain systemd[1]: tmp-crun.GGpsHG.mount: Deactivated successfully.
Nov 28 09:47:02 np0005538513.localdomain podman[280953]: 2025-11-28 09:47:02.013807559 +0000 UTC m=+0.210609084 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 09:47:02 np0005538513.localdomain podman[280953]: 2025-11-28 09:47:02.078946846 +0000 UTC m=+0.275748351 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:47:02 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:47:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45814 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A91150000000001030307) 
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:02.417 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:02.419 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:02 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45815 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A95020000000001030307) 
Nov 28 09:47:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:03.610 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.770 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.771 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.3527038
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45666 [28/Nov/2025:09:47:02.416] listener listener/metadata 0/0/0/1355/1355 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.790 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.791 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45676 [28/Nov/2025:09:47:03.789] listener listener/metadata 0/0/0/28/28 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.818 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404  len: 297 time: 0.0270157
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.835 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.836 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.852 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.853 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0169477
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45690 [28/Nov/2025:09:47:03.835] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.860 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.861 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.875 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45694 [28/Nov/2025:09:47:03.860] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.876 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0145340
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.883 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.884 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.897 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45710 [28/Nov/2025:09:47:03.883] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.898 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 143 time: 0.0136721
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.905 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.906 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.923 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45724 [28/Nov/2025:09:47:03.905] listener listener/metadata 0/0/0/18/18 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.923 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 149 time: 0.0174773
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.931 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.932 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.946 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45730 [28/Nov/2025:09:47:03.931] listener listener/metadata 0/0/0/15/15 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.946 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 150 time: 0.0141671
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.953 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.954 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.974 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:03 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45732 [28/Nov/2025:09:47:03.953] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.974 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0200076
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.982 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:03.983 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:03 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.003 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45748 [28/Nov/2025:09:47:03.981] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.004 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0208871
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.011 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.012 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49400 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A97830000000001030307) 
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45758 [28/Nov/2025:09:47:04.011] listener listener/metadata 0/0/0/20/20 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.031 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0188842
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.046 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.047 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.060 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45774 [28/Nov/2025:09:47:04.045] listener listener/metadata 0/0/0/15/15 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.061 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 155 time: 0.0140557
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.066 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.067 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.082 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45784 [28/Nov/2025:09:47:04.066] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.082 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0148098
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.088 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.089 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.102 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45788 [28/Nov/2025:09:47:04.087] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.103 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200  len: 143 time: 0.0142629
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.109 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.109 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.125 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45796 [28/Nov/2025:09:47:04.108] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.126 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0163682
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.133 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.133 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.146 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45806 [28/Nov/2025:09:47:04.132] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.147 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0137632
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.154 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.154 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Accept: */*
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Connection: close
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Content-Type: text/plain
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: Host: 169.254.169.254
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: User-Agent: curl/7.84.0
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.166 158228 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 28 09:47:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:04.166 158228 INFO eventlet.wsgi.server [-] 192.168.0.142,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0121791
Nov 28 09:47:04 np0005538513.localdomain haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45812 [28/Nov/2025:09:47:04.153] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Nov 28 09:47:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:04.494 279685 DEBUG nova.compute.manager [None req-85143bf4-3818-42c0-80fe-88de490e3309 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 09:47:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:04.500 279685 INFO nova.compute.manager [None req-85143bf4-3818-42c0-80fe-88de490e3309 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Retrieving diagnostics
Nov 28 09:47:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:47:04 np0005538513.localdomain podman[280999]: 2025-11-28 09:47:04.848727692 +0000 UTC m=+0.084535703 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 28 09:47:04 np0005538513.localdomain podman[280999]: 2025-11-28 09:47:04.864519298 +0000 UTC m=+0.100327309 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 09:47:04 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:47:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45816 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A9D020000000001030307) 
Nov 28 09:47:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52249 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2AA1820000000001030307) 
Nov 28 09:47:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:06.780 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:08.613 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45817 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2AACC20000000001030307) 
Nov 28 09:47:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:47:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:47:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:47:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 28 09:47:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:47:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17233 "" "Go-http-client/1.1"
Nov 28 09:47:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:47:10 np0005538513.localdomain systemd[1]: tmp-crun.WQyjkt.mount: Deactivated successfully.
Nov 28 09:47:10 np0005538513.localdomain podman[281021]: 2025-11-28 09:47:10.859626985 +0000 UTC m=+0.093104060 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:47:10 np0005538513.localdomain podman[281021]: 2025-11-28 09:47:10.869612944 +0000 UTC m=+0.103089979 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:47:10 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:47:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:11.811 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:47:12 np0005538513.localdomain podman[281044]: 2025-11-28 09:47:12.849612036 +0000 UTC m=+0.083471891 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:47:12 np0005538513.localdomain podman[281044]: 2025-11-28 09:47:12.865598038 +0000 UTC m=+0.099457933 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 09:47:12 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:47:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:13.653 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:13 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:47:13Z|00070|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Nov 28 09:47:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:16.815 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45818 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2ACD820000000001030307) 
Nov 28 09:47:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:47:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:47:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:47:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:47:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:47:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:18.655 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:21.842 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:47:22 np0005538513.localdomain podman[281063]: 2025-11-28 09:47:22.834202914 +0000 UTC m=+0.074650587 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:47:22 np0005538513.localdomain podman[281063]: 2025-11-28 09:47:22.847340053 +0000 UTC m=+0.087787706 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Nov 28 09:47:22 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:47:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:23.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:26.843 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:28 np0005538513.localdomain sudo[281082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:47:28 np0005538513.localdomain sudo[281082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:47:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:47:28 np0005538513.localdomain sudo[281082]: pam_unix(sudo:session): session closed for user root
Nov 28 09:47:28 np0005538513.localdomain sudo[281106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:47:28 np0005538513.localdomain sudo[281106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:47:28 np0005538513.localdomain podman[281100]: 2025-11-28 09:47:28.539121979 +0000 UTC m=+0.080847376 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:47:28 np0005538513.localdomain podman[281100]: 2025-11-28 09:47:28.547827901 +0000 UTC m=+0.089553228 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:47:28 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:47:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:28.683 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:29 np0005538513.localdomain sudo[281106]: pam_unix(sudo:session): session closed for user root
Nov 28 09:47:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:29.730 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:29.732 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:29.751 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:29.751 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:47:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:29.752 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:47:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:30.742 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:47:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:30.742 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:47:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:30.743 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:47:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:30.743 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:47:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:31.880 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29905 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B06440000000001030307) 
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.685 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.714 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.714 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.715 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.716 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.716 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.716 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.717 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.717 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.717 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.718 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.744 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.745 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.745 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.746 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:47:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:32.746 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:47:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:47:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:47:32 np0005538513.localdomain systemd[1]: tmp-crun.aNAU91.mount: Deactivated successfully.
Nov 28 09:47:32 np0005538513.localdomain podman[281175]: 2025-11-28 09:47:32.87423015 +0000 UTC m=+0.100353660 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:47:32 np0005538513.localdomain podman[281175]: 2025-11-28 09:47:32.878703519 +0000 UTC m=+0.104827049 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:47:32 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:47:32 np0005538513.localdomain podman[281174]: 2025-11-28 09:47:32.962255482 +0000 UTC m=+0.191742329 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:47:33 np0005538513.localdomain podman[281174]: 2025-11-28 09:47:33.030897815 +0000 UTC m=+0.260384702 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 09:47:33 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:47:33 np0005538513.localdomain sudo[281234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:47:33 np0005538513.localdomain sudo[281234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:47:33 np0005538513.localdomain sudo[281234]: pam_unix(sudo:session): session closed for user root
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.178 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.264 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.265 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:47:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29906 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B0A420000000001030307) 
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.483 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.486 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12296MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.486 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.487 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.576 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.577 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.578 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.645 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:47:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:33.687 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:34.097 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:47:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:34.104 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:47:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:34.132 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:47:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:34.160 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:47:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:34.160 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:47:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45819 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B0D820000000001030307) 
Nov 28 09:47:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:47:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29907 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B12420000000001030307) 
Nov 28 09:47:35 np0005538513.localdomain systemd[1]: tmp-crun.emdmCK.mount: Deactivated successfully.
Nov 28 09:47:35 np0005538513.localdomain podman[281277]: 2025-11-28 09:47:35.476068132 +0000 UTC m=+0.079936039 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Nov 28 09:47:35 np0005538513.localdomain podman[281277]: 2025-11-28 09:47:35.48947504 +0000 UTC m=+0.093342937 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 09:47:35 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:47:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49401 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B15820000000001030307) 
Nov 28 09:47:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:36.883 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:38.692 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29908 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B22020000000001030307) 
Nov 28 09:47:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:47:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:47:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:47:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 28 09:47:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:47:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17234 "" "Go-http-client/1.1"
Nov 28 09:47:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:47:41 np0005538513.localdomain podman[281296]: 2025-11-28 09:47:41.840398294 +0000 UTC m=+0.079385994 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:47:41 np0005538513.localdomain podman[281296]: 2025-11-28 09:47:41.851413532 +0000 UTC m=+0.090401272 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:47:41 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:47:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:41.933 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:43.738 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:47:43 np0005538513.localdomain podman[281319]: 2025-11-28 09:47:43.850772664 +0000 UTC m=+0.081123215 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:47:43 np0005538513.localdomain podman[281319]: 2025-11-28 09:47:43.869406401 +0000 UTC m=+0.099756962 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 09:47:43 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:47:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:46.935 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29909 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B41830000000001030307) 
Nov 28 09:47:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:47:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:47:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:47:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:47:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:47:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:47:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:48.740 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:50.826 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:47:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:47:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:47:50.829 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:47:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:51.969 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:53 np0005538513.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:b0:25:93 MACPROTO=0800 SRC=3.138.197.221 DST=38.102.83.64 LEN=52 TOS=0x00 PREC=0x00 TTL=50 ID=9338 PROTO=TCP SPT=36814 DPT=9090 SEQ=2185110305 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40103030801010402) 
Nov 28 09:47:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:53.777 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:47:53 np0005538513.localdomain systemd[1]: tmp-crun.cDRt2Q.mount: Deactivated successfully.
Nov 28 09:47:53 np0005538513.localdomain podman[281337]: 2025-11-28 09:47:53.896347538 +0000 UTC m=+0.095749477 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:47:53 np0005538513.localdomain podman[281337]: 2025-11-28 09:47:53.940535284 +0000 UTC m=+0.139937253 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:47:53 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:47:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:56.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:47:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:47:58.780 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:47:58 np0005538513.localdomain podman[281357]: 2025-11-28 09:47:58.849089257 +0000 UTC m=+0.083809592 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:47:58 np0005538513.localdomain podman[281357]: 2025-11-28 09:47:58.86238097 +0000 UTC m=+0.097101295 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:47:58 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.672 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.681 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c46340b1-66c6-4a9f-b3e2-d75718e49e6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.673830', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53a5095a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'e3fe45d8d1cd6bd6b6896740afa1c966716cf722a68096da1490d6257540418f'}]}, 'timestamp': '2025-11-28 09:48:00.681922', '_unique_id': '1eb1eb9761944874a6de602d4b289c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.697 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.698 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b1bbfd3-7753-4004-b171-eeabdd5c8d78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.685002', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53a79b98-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '10ca8ae81e4a0e0370cf5fd3a25092ecd228aeb70a92f07150af9b72ac381309'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.685002', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53a7af5c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '39a2ce96941ccfac835de018f1dcba103d2bbd35433a905780a51ae49e88ce4b'}]}, 'timestamp': '2025-11-28 09:48:00.699291', '_unique_id': '88413bcb86254ef4b85e28c21f3c7ae9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.729 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.730 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a319abdc-3948-4921-8098-9b13dcb8c562', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.702151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53ac67cc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'c7cb8e1b2dbd8b58b217ac158f8275815fbd1d495fe0599de03b5f391e940bbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.702151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53ac7cee-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '46b67abe0aa8ca2f3e503ce1a8f22c4a122761c21656b93fa0901df9bed04b6a'}]}, 'timestamp': '2025-11-28 09:48:00.730653', '_unique_id': 'e03c3d302aca469e88577d1a52168530'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.732 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a002305-d03e-4055-a28f-7c41f816158e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.733229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53acf2e6-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '32ed49363f41f127bbf58402cb77762fdc0c874e9dee7f5e50cf7f662edd5865'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.733229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53ad031c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'e7ae714adce65c6e9b5b83992903d403a1ebd4c7638152c81754e31ebe4af635'}]}, 'timestamp': '2025-11-28 09:48:00.734113', '_unique_id': '8a3299ae5429448caeedc5b8466e1e8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '033d9787-2d94-49c1-80a8-0245372a512a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.736321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53ad6b2c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': 'fd4365552d9007c23d9b6f7ecdaeee9d4d8a85b9b43eae33fa66cd0df2d5562a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.736321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53ad7b26-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '9107e4312eeaec2080beb3a38f9ce2d14d1db72fe15c5db2830fc31b5b0ba715'}]}, 'timestamp': '2025-11-28 09:48:00.737223', '_unique_id': '58f0706cabd24e10802faa5c8f3f7c69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.739 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f32ade6f-5ec4-4c84-90f5-d669f05d7e97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:48:00.739458', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '53b0b5ca-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.929620678, 'message_signature': '4b3230e3fc98f681a68f5848ed2ee4e1e52a888c84794d6d1fdcdea84176e8c6'}]}, 'timestamp': '2025-11-28 09:48:00.758375', '_unique_id': '714598eab8224e6cbbfff2c9cd4ab4b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76b732d2-6381-4561-9d33-1395c28f872c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.760667', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b12244-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'c89854f75f4cce161bb602af1d6735a285e41fdbb6a874e5ed80349d40ac4a14'}]}, 'timestamp': '2025-11-28 09:48:00.761156', '_unique_id': '1e71bd99007a4a2bb88615da43ea666c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e81a661-3dd2-4b74-9980-1d4782c470cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.763278', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b18838-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'eefe2a72069128421b01fc7687d9aaee7872a73e314d3764864a5fccbba82dce'}]}, 'timestamp': '2025-11-28 09:48:00.763733', '_unique_id': '76e328ef31524e0abaad51d7fc4cb2e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '493c50bb-4033-4984-bfa6-7ffffb959577', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.765803', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b1ebd4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '5df403eece4e73874ee1db7d1512e399e67b3ecf5f83557497cc27910e09941f'}]}, 'timestamp': '2025-11-28 09:48:00.766315', '_unique_id': 'bc4e3db95292429eba99b07ced437574'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dfc5792-340d-45cc-ad87-5fd1779569a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.768376', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b24f48-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '208021f6b01492c07f3c8322068ccab47ac5a48c7ed934d640dbd24360aad5b6'}]}, 'timestamp': '2025-11-28 09:48:00.768830', '_unique_id': 'd40d4730529a4694988e97ded02ccef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc154370-47a5-4900-b4d8-00f7f6415266', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.771060', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b2b87a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '4e84d4665bcbb0dbe14fd1e2e4183d920428fbbdfe584bd7ad041afa0172d075'}]}, 'timestamp': '2025-11-28 09:48:00.771524', '_unique_id': '5b3c8b7901954e54b2ecff31831fdc78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edd16667-d656-4547-98dd-729056766d2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.773590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b31acc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '2868c6b47e56d50aecbef58d1dd45668f99f1a1b5310784e50010d03aee7ab42'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.773590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b32d46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'abbfa07bfe38421041e6603ed60b7d3d6e5cb36596bb70970ecc374b06864390'}]}, 'timestamp': '2025-11-28 09:48:00.774491', '_unique_id': '9b2b15d96efc4fa8980a41894809b395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43301249-061b-4a0f-9508-a3007e47bbf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.775957', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b375e4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '2dfe1d438d6341eb5464ea3c8def6e88f96e3cd6e6ed0fad50e480c8ea1c3c92'}]}, 'timestamp': '2025-11-28 09:48:00.776322', '_unique_id': 'c968f0b6b7cc433bb56c04b6fdf90a01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '744ef230-6a73-4cbe-858d-7c6c308280e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.777607', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b3b4aa-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'c23620c8250cc41caa170358c1321b85585e981ab4d79394f1fa4f780ce05524'}]}, 'timestamp': '2025-11-28 09:48:00.777890', '_unique_id': '78f655d2c1d04042bd800e61dfbef61b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95da7934-ef29-41b2-9a7c-338924af02e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.779225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b3f3b6-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '78718bcd396310cada4eaad879db1b31e6ac8d401755b84df6003ef8faf285fe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.779225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b3fd52-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '8dd2114965aa3492344dd7b2a68f35268bb2178f92b3b89365cbce42a7508a1c'}]}, 'timestamp': '2025-11-28 09:48:00.779730', '_unique_id': '0ff0ed2284304d03adcfaea77dda0711'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c12295c-4bc0-4de3-a610-454ec0ecba46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.781045', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b43b46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'ddaf2344f2c5c65c4d7d62fcea9ff14bf0e85dfeae5cb2ddffce4970fd1da30b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.781045', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b444c4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '2c01ea7ecbad6f646fee96e1e941f44f0bd28fa7ae3246c00d1abfba4147fefc'}]}, 'timestamp': '2025-11-28 09:48:00.781558', '_unique_id': '80547a6af64c4ab4891edafb9174360f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae5a8963-698b-49a6-b1e3-375c42d89d3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.782903', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b48a4c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '8b09f3cff4e7c9ff396390e6981debd09d5719f564957d2a7a6db371cf6392d4'}]}, 'timestamp': '2025-11-28 09:48:00.783359', '_unique_id': '1c1ffebebb794788aee34c8710702da5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02884105-04b9-43ac-b6d8-7bb7eef3c4f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.784613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b4c624-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '174a28293dc57b809237c7cf689c6d82aedca0eb68b0af80ab228c8ceb783a00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.784613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b4cfa2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'd19e163bf7e00326197c663c92588a3dc23e1345d0c5499e061a1578ad6ffacc'}]}, 'timestamp': '2025-11-28 09:48:00.785138', '_unique_id': '2492bc05bcc74ac08026cb8481100be3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84d7028f-7a1c-45db-8d86-afa3b4282194', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.786649', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b51606-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '187cfc2cf54dabf82b68bcb88f607f182f826a1dda51c67b095f3d71451d6e83'}]}, 'timestamp': '2025-11-28 09:48:00.786937', '_unique_id': 'c53c71e7b2fc4eeb9e56c7a6d5b40a3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a6a9eec-2499-4bba-b145-e2a3c223d31f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.788311', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b556d4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '2793aa680aa5374685b0b08c166f553f17a51058ffd930a3f5c2a59c3de852c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.788311', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b56110-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '5c45b08baab989fd5606d796d7b763a7e6cb3afcc6a280daa745d7290cbb5089'}]}, 'timestamp': '2025-11-28 09:48:00.788839', '_unique_id': '2d30c103c3854b72853a6e63db158b6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 11560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea488f5e-e8da-4bf2-8dfc-1491679807c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11560000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:48:00.790161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '53b5a008-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.929620678, 'message_signature': 'e8b79152bddd8945ee2c34d6678805052db283344d688bc4186323b0fc23e0f9'}]}, 'timestamp': '2025-11-28 09:48:00.790478', '_unique_id': '6de4b0032969467b99166c2659751deb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:48:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:48:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:02.006 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:02 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48263 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B7B750000000001030307) 
Nov 28 09:48:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48264 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B7F830000000001030307) 
Nov 28 09:48:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:48:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:48:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:03.829 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:03 np0005538513.localdomain systemd[1]: tmp-crun.vevkIi.mount: Deactivated successfully.
Nov 28 09:48:03 np0005538513.localdomain podman[281381]: 2025-11-28 09:48:03.90065629 +0000 UTC m=+0.133876338 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:48:03 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29910 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B81830000000001030307) 
Nov 28 09:48:03 np0005538513.localdomain podman[281382]: 2025-11-28 09:48:03.949302455 +0000 UTC m=+0.177826247 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:48:03 np0005538513.localdomain podman[281381]: 2025-11-28 09:48:03.975126342 +0000 UTC m=+0.208346380 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:48:03 np0005538513.localdomain podman[281382]: 2025-11-28 09:48:03.983321318 +0000 UTC m=+0.211845090 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:48:03 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:48:04 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:48:05 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48265 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B87820000000001030307) 
Nov 28 09:48:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:48:05 np0005538513.localdomain systemd[1]: tmp-crun.q0rNuV.mount: Deactivated successfully.
Nov 28 09:48:05 np0005538513.localdomain podman[281425]: 2025-11-28 09:48:05.856241738 +0000 UTC m=+0.086268263 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Nov 28 09:48:05 np0005538513.localdomain podman[281425]: 2025-11-28 09:48:05.897700956 +0000 UTC m=+0.127727511 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:48:05 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:48:06 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45820 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B8B820000000001030307) 
Nov 28 09:48:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:07.009 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:08.834 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:09 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48266 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B97420000000001030307) 
Nov 28 09:48:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:48:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:48:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:48:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 28 09:48:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:48:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1"
Nov 28 09:48:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:12.050 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:48:12 np0005538513.localdomain podman[281443]: 2025-11-28 09:48:12.852282628 +0000 UTC m=+0.086540960 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:48:12 np0005538513.localdomain podman[281443]: 2025-11-28 09:48:12.861330771 +0000 UTC m=+0.095589063 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:48:12 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:48:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:13.876 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:48:14 np0005538513.localdomain systemd[1]: tmp-crun.M2i5Ls.mount: Deactivated successfully.
Nov 28 09:48:14 np0005538513.localdomain podman[281466]: 2025-11-28 09:48:14.852582118 +0000 UTC m=+0.079415764 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:48:14 np0005538513.localdomain sshd[281482]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:48:14 np0005538513.localdomain podman[281466]: 2025-11-28 09:48:14.867312954 +0000 UTC m=+0.094146640 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 28 09:48:14 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:48:14 np0005538513.localdomain sshd[281482]: Accepted publickey for zuul from 38.102.83.114 port 45730 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:48:14 np0005538513.localdomain systemd-logind[764]: New session 61 of user zuul.
Nov 28 09:48:15 np0005538513.localdomain systemd[1]: Started Session 61 of User zuul.
Nov 28 09:48:15 np0005538513.localdomain sshd[281482]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:48:15 np0005538513.localdomain sudo[281505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eotupifmtugrajekrzaahsqazbrmyupl ; /usr/bin/python3
Nov 28 09:48:15 np0005538513.localdomain sudo[281505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 09:48:15 np0005538513.localdomain python3[281507]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:48:15 np0005538513.localdomain subscription-manager[281508]: Unregistered machine with identity: f7b9b60d-6b81-4721-85a2-48be6d80ec8a
Nov 28 09:48:15 np0005538513.localdomain sudo[281505]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:17.052 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:17 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48267 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BB7820000000001030307) 
Nov 28 09:48:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:48:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:48:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:48:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:48:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:48:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:18.879 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:22.096 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:23.914 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:48:24 np0005538513.localdomain systemd[1]: tmp-crun.oa5hI7.mount: Deactivated successfully.
Nov 28 09:48:24 np0005538513.localdomain podman[281510]: 2025-11-28 09:48:24.864655815 +0000 UTC m=+0.094786408 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:48:24 np0005538513.localdomain podman[281510]: 2025-11-28 09:48:24.876800867 +0000 UTC m=+0.106931480 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 28 09:48:24 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:48:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:27.132 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:28.917 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:48:29 np0005538513.localdomain systemd[1]: tmp-crun.uFp5Ha.mount: Deactivated successfully.
Nov 28 09:48:29 np0005538513.localdomain podman[281531]: 2025-11-28 09:48:29.881251148 +0000 UTC m=+0.121040517 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:48:29 np0005538513.localdomain podman[281531]: 2025-11-28 09:48:29.890455744 +0000 UTC m=+0.130245113 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:48:29 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:48:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:32.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:32 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18443 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BF0A40000000001030307) 
Nov 28 09:48:33 np0005538513.localdomain sudo[281556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:48:33 np0005538513.localdomain sudo[281556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:33 np0005538513.localdomain sudo[281556]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:33 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18444 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BF4C20000000001030307) 
Nov 28 09:48:33 np0005538513.localdomain sudo[281574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:48:33 np0005538513.localdomain sudo[281574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:33.945 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:34 np0005538513.localdomain sudo[281574]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:34 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48268 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BF7830000000001030307) 
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.162 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.163 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.163 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.164 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:48:34 np0005538513.localdomain sudo[281624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:48:34 np0005538513.localdomain sudo[281624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:48:34 np0005538513.localdomain sudo[281624]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:48:34 np0005538513.localdomain sudo[281644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:48:34 np0005538513.localdomain sudo[281644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:48:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 4971 writes, 22K keys, 4971 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4971 writes, 644 syncs, 7.72 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 72 writes, 178 keys, 72 commit groups, 1.0 writes per commit group, ingest: 0.29 MB, 0.00 MB/s
                                                          Interval WAL: 72 writes, 36 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:48:34 np0005538513.localdomain systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 28 09:48:34 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:48:34 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:48:34 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.863 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.864 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.864 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:48:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:34.865 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:48:34 np0005538513.localdomain podman[281643]: 2025-11-28 09:48:34.885257296 +0000 UTC m=+0.367564197 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent)
Nov 28 09:48:34 np0005538513.localdomain podman[281642]: 2025-11-28 09:48:34.914199683 +0000 UTC m=+0.397791511 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:48:34 np0005538513.localdomain podman[281642]: 2025-11-28 09:48:34.941922774 +0000 UTC m=+0.425514582 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 09:48:34 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:48:34 np0005538513.localdomain sudo[281644]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:34 np0005538513.localdomain podman[281643]: 2025-11-28 09:48:34.972343792 +0000 UTC m=+0.454650683 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 09:48:34 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:48:35 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18445 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BFCC20000000001030307) 
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.012 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.039 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.039 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.040 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.041 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.041 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.041 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.042 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.042 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.043 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.043 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.067 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.068 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.068 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.069 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.069 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:48:36 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29911 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BFF830000000001030307) 
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.538 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.643 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.644 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:48:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:48:36 np0005538513.localdomain podman[281745]: 2025-11-28 09:48:36.860347966 +0000 UTC m=+0.093058138 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 09:48:36 np0005538513.localdomain podman[281745]: 2025-11-28 09:48:36.876410991 +0000 UTC m=+0.109121203 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.878 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.880 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12288MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.880 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.880 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:48:36 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.996 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.997 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:48:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:36.997 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:48:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:37.039 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:48:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:37.176 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:37.495 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:48:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:37.502 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:48:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:37.521 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:48:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:37.524 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:48:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:37.524 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:48:37 np0005538513.localdomain sudo[281785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:37 np0005538513.localdomain sudo[281785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:37 np0005538513.localdomain sudo[281785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:38.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:39 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18446 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2C0C830000000001030307) 
Nov 28 09:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:48:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.3 total, 600.0 interval
                                                          Cumulative writes: 5682 writes, 25K keys, 5682 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5682 writes, 779 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 66 writes, 243 keys, 66 commit groups, 1.0 writes per commit group, ingest: 0.31 MB, 0.00 MB/s
                                                          Interval WAL: 66 writes, 21 syncs, 3.14 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:48:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:48:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:48:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:48:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 28 09:48:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:48:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17239 "" "Go-http-client/1.1"
Nov 28 09:48:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:42.214 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:48:43 np0005538513.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 09:48:43 np0005538513.localdomain systemd[1]: tmp-crun.9VrhWg.mount: Deactivated successfully.
Nov 28 09:48:43 np0005538513.localdomain podman[281804]: 2025-11-28 09:48:43.848130506 +0000 UTC m=+0.081047583 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:48:43 np0005538513.localdomain podman[281804]: 2025-11-28 09:48:43.859516724 +0000 UTC m=+0.092433771 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:48:43 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:48:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:43.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:48:45 np0005538513.localdomain systemd[1]: tmp-crun.5I4Z2r.mount: Deactivated successfully.
Nov 28 09:48:45 np0005538513.localdomain podman[281827]: 2025-11-28 09:48:45.863814579 +0000 UTC m=+0.095374266 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 28 09:48:45 np0005538513.localdomain podman[281827]: 2025-11-28 09:48:45.875477817 +0000 UTC m=+0.107037504 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 09:48:45 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:48:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:47.213 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:47 np0005538513.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18447 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2C2D820000000001030307) 
Nov 28 09:48:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:48:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:48:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:48:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:48:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:48.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:49 np0005538513.localdomain sudo[281846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:49 np0005538513.localdomain sudo[281846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:49 np0005538513.localdomain sudo[281846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:50 np0005538513.localdomain sudo[281864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:50 np0005538513.localdomain sudo[281864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:50 np0005538513.localdomain sudo[281864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:48:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:48:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:48:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:48:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:48:50.829 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:48:51 np0005538513.localdomain sudo[281882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:51 np0005538513.localdomain sudo[281882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:51 np0005538513.localdomain sudo[281882]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:51 np0005538513.localdomain sshd[281900]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:48:52 np0005538513.localdomain sshd[281900]: Accepted publickey for tripleo-admin from 192.168.122.11 port 47732 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:48:52 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 28 09:48:52 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 28 09:48:52 np0005538513.localdomain systemd-logind[764]: New session 62 of user tripleo-admin.
Nov 28 09:48:52 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 28 09:48:52 np0005538513.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 28 09:48:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:52.246 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Queued start job for default target Main User Target.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Created slice User Application Slice.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Reached target Paths.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Reached target Timers.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Starting D-Bus User Message Bus Socket...
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Starting Create User's Volatile Files and Directories...
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Reached target Sockets.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Finished Create User's Volatile Files and Directories.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Reached target Basic System.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Reached target Main User Target.
Nov 28 09:48:52 np0005538513.localdomain systemd[281904]: Startup finished in 152ms.
Nov 28 09:48:52 np0005538513.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 28 09:48:52 np0005538513.localdomain systemd[1]: Started Session 62 of User tripleo-admin.
Nov 28 09:48:52 np0005538513.localdomain sshd[281900]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:48:53 np0005538513.localdomain sudo[282044]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zoxmvlmnscbxbtjgqggqlvkdofxpjnku ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323332.5563138-61416-256408045430304/AnsiballZ_blockinfile.py
Nov 28 09:48:53 np0005538513.localdomain sudo[282044]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:48:53 np0005538513.localdomain python3[282046]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:48:53 np0005538513.localdomain sudo[282044]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:54.034 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:54 np0005538513.localdomain sudo[282188]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrilrvllouxurmoykbmkrvgvvkgfhgaj ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323333.4139001-61430-82157287266072/AnsiballZ_systemd.py
Nov 28 09:48:54 np0005538513.localdomain sudo[282188]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:48:54 np0005538513.localdomain python3[282190]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:48:54 np0005538513.localdomain systemd[1]: Stopping Netfilter Tables...
Nov 28 09:48:54 np0005538513.localdomain systemd[1]: nftables.service: Deactivated successfully.
Nov 28 09:48:54 np0005538513.localdomain systemd[1]: Stopped Netfilter Tables.
Nov 28 09:48:54 np0005538513.localdomain systemd[1]: Starting Netfilter Tables...
Nov 28 09:48:54 np0005538513.localdomain systemd[1]: Finished Netfilter Tables.
Nov 28 09:48:54 np0005538513.localdomain sudo[282188]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:48:55 np0005538513.localdomain systemd[1]: tmp-crun.ATpAdQ.mount: Deactivated successfully.
Nov 28 09:48:55 np0005538513.localdomain podman[282215]: 2025-11-28 09:48:55.86318787 +0000 UTC m=+0.093581194 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Nov 28 09:48:55 np0005538513.localdomain podman[282215]: 2025-11-28 09:48:55.881483258 +0000 UTC m=+0.111876552 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 28 09:48:55 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:48:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:57.249 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:48:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:48:59.037 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:49:00 np0005538513.localdomain systemd[1]: tmp-crun.ncCjfe.mount: Deactivated successfully.
Nov 28 09:49:00 np0005538513.localdomain podman[282236]: 2025-11-28 09:49:00.856045936 +0000 UTC m=+0.090817553 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:49:00 np0005538513.localdomain podman[282236]: 2025-11-28 09:49:00.867415865 +0000 UTC m=+0.102187452 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:49:00 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:49:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:02.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:03 np0005538513.localdomain sudo[282260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:03 np0005538513.localdomain sudo[282260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:03 np0005538513.localdomain sudo[282260]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:04.065 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:04 np0005538513.localdomain sudo[282278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:04 np0005538513.localdomain sudo[282278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:04 np0005538513.localdomain sudo[282278]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:49:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:49:05 np0005538513.localdomain podman[282296]: 2025-11-28 09:49:05.479091693 +0000 UTC m=+0.084871842 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 09:49:05 np0005538513.localdomain podman[282297]: 2025-11-28 09:49:05.560103503 +0000 UTC m=+0.160652231 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:49:05 np0005538513.localdomain podman[282296]: 2025-11-28 09:49:05.565234061 +0000 UTC m=+0.171014220 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:49:05 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:49:05 np0005538513.localdomain podman[282297]: 2025-11-28 09:49:05.594429925 +0000 UTC m=+0.194978643 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 28 09:49:05 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:49:05 np0005538513.localdomain sudo[282340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:05 np0005538513.localdomain sudo[282340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:05 np0005538513.localdomain sudo[282340]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:06 np0005538513.localdomain sudo[282358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:06 np0005538513.localdomain sudo[282358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:49:06 np0005538513.localdomain sudo[282358]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:07 np0005538513.localdomain podman[282375]: 2025-11-28 09:49:07.076100102 +0000 UTC m=+0.082563805 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:49:07 np0005538513.localdomain podman[282375]: 2025-11-28 09:49:07.111891556 +0000 UTC m=+0.118355249 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2)
Nov 28 09:49:07 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:49:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:07.272 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:08 np0005538513.localdomain sudo[282396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:08 np0005538513.localdomain sudo[282396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:08 np0005538513.localdomain sudo[282396]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:09.069 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:09 np0005538513.localdomain sudo[282414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:09 np0005538513.localdomain sudo[282414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:09 np0005538513.localdomain sudo[282414]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:49:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:49:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:49:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 28 09:49:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:49:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17238 "" "Go-http-client/1.1"
Nov 28 09:49:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:12.307 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:14.107 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:49:14 np0005538513.localdomain systemd[1]: tmp-crun.nETkih.mount: Deactivated successfully.
Nov 28 09:49:14 np0005538513.localdomain podman[282432]: 2025-11-28 09:49:14.864720406 +0000 UTC m=+0.096746698 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:49:14 np0005538513.localdomain podman[282432]: 2025-11-28 09:49:14.876518462 +0000 UTC m=+0.108544754 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:49:14 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:49:15 np0005538513.localdomain sudo[282455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:15 np0005538513.localdomain sudo[282455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:15 np0005538513.localdomain sudo[282455]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:15 np0005538513.localdomain sudo[282473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:49:15 np0005538513.localdomain sudo[282473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:15 np0005538513.localdomain podman[282538]: 
Nov 28 09:49:15 np0005538513.localdomain podman[282538]: 2025-11-28 09:49:15.967562318 +0000 UTC m=+0.082701122 container create 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 28 09:49:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:49:15 np0005538513.localdomain sshd[281488]: Received disconnect from 38.102.83.114 port 45730:11: disconnected by user
Nov 28 09:49:15 np0005538513.localdomain sshd[281488]: Disconnected from user zuul 38.102.83.114 port 45730
Nov 28 09:49:15 np0005538513.localdomain sshd[281482]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:49:15 np0005538513.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Nov 28 09:49:15 np0005538513.localdomain systemd-logind[764]: Session 61 logged out. Waiting for processes to exit.
Nov 28 09:49:16 np0005538513.localdomain systemd-logind[764]: Removed session 61.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: Started libpod-conmon-4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b.scope.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:49:16 np0005538513.localdomain podman[282538]: 2025-11-28 09:49:15.92917838 +0000 UTC m=+0.044317214 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:16 np0005538513.localdomain podman[282538]: 2025-11-28 09:49:16.044003856 +0000 UTC m=+0.159142630 container init 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container)
Nov 28 09:49:16 np0005538513.localdomain podman[282538]: 2025-11-28 09:49:16.062233921 +0000 UTC m=+0.177372705 container start 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True, RELEASE=main)
Nov 28 09:49:16 np0005538513.localdomain podman[282538]: 2025-11-28 09:49:16.062521039 +0000 UTC m=+0.177659813 container attach 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, architecture=x86_64, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Nov 28 09:49:16 np0005538513.localdomain funny_jones[282559]: 167 167
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: libpod-4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b.scope: Deactivated successfully.
Nov 28 09:49:16 np0005538513.localdomain podman[282552]: 2025-11-28 09:49:16.101093984 +0000 UTC m=+0.099180232 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:49:16 np0005538513.localdomain podman[282552]: 2025-11-28 09:49:16.115630655 +0000 UTC m=+0.113716903 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:49:16 np0005538513.localdomain podman[282538]: 2025-11-28 09:49:16.167694008 +0000 UTC m=+0.282832792 container died 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=)
Nov 28 09:49:16 np0005538513.localdomain podman[282570]: 2025-11-28 09:49:16.265177267 +0000 UTC m=+0.187051215 container remove 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, version=7, RELEASE=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: libpod-conmon-4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b.scope: Deactivated successfully.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:49:16 np0005538513.localdomain systemd-rc-local-generator[282617]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:49:16 np0005538513.localdomain systemd-sysv-generator[282621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: tmp-crun.gUVNom.mount: Deactivated successfully.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b7efc582d40325b91c09e0286f90f723318a0cef27d1a8e7dfa2889477511945-merged.mount: Deactivated successfully.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:49:16 np0005538513.localdomain systemd-rc-local-generator[282665]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:49:16 np0005538513.localdomain systemd-sysv-generator[282668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:16 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:17 np0005538513.localdomain systemd[1]: Starting Ceph mds.mds.np0005538513.yljthc for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:49:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:17.311 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:17 np0005538513.localdomain podman[282726]: 
Nov 28 09:49:17 np0005538513.localdomain podman[282726]: 2025-11-28 09:49:17.43713096 +0000 UTC m=+0.054872891 container create be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553)
Nov 28 09:49:17 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:17 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:17 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:17 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/var/lib/ceph/mds/ceph-mds.np0005538513.yljthc supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:17 np0005538513.localdomain podman[282726]: 2025-11-28 09:49:17.49106302 +0000 UTC m=+0.108804951 container init be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, name=rhceph, distribution-scope=public)
Nov 28 09:49:17 np0005538513.localdomain podman[282726]: 2025-11-28 09:49:17.502517175 +0000 UTC m=+0.120259126 container start be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc, build-date=2025-09-24T08:57:55, version=7, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:49:17 np0005538513.localdomain bash[282726]: be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6
Nov 28 09:49:17 np0005538513.localdomain podman[282726]: 2025-11-28 09:49:17.413495608 +0000 UTC m=+0.031237519 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:17 np0005538513.localdomain systemd[1]: Started Ceph mds.mds.np0005538513.yljthc for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:49:17 np0005538513.localdomain sudo[282473]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:17 np0005538513.localdomain ceph-mds[282744]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:49:17 np0005538513.localdomain ceph-mds[282744]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Nov 28 09:49:17 np0005538513.localdomain ceph-mds[282744]: main not setting numa affinity
Nov 28 09:49:17 np0005538513.localdomain ceph-mds[282744]: pidfile_write: ignore empty --pid-file
Nov 28 09:49:17 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc[282740]: starting mds.mds.np0005538513.yljthc at 
Nov 28 09:49:17 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Updating MDS map to version 9 from mon.0
Nov 28 09:49:17 np0005538513.localdomain sudo[282763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:17 np0005538513.localdomain sudo[282763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:17 np0005538513.localdomain sudo[282763]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:49:18 np0005538513.localdomain systemd[1]: tmp-crun.uIcWZV.mount: Deactivated successfully.
Nov 28 09:49:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:49:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:49:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:49:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:49:18 np0005538513.localdomain sudo[282781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:18 np0005538513.localdomain sudo[282781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:18 np0005538513.localdomain sudo[282781]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:18 np0005538513.localdomain sudo[282799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:49:18 np0005538513.localdomain sudo[282799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:18 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Updating MDS map to version 10 from mon.0
Nov 28 09:49:18 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Monitors have assigned me to become a standby.
Nov 28 09:49:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:19.109 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:19 np0005538513.localdomain systemd[1]: tmp-crun.Kk87MB.mount: Deactivated successfully.
Nov 28 09:49:19 np0005538513.localdomain podman[282891]: 2025-11-28 09:49:19.144345993 +0000 UTC m=+0.110983098 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:49:19 np0005538513.localdomain podman[282891]: 2025-11-28 09:49:19.249122819 +0000 UTC m=+0.215759974 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container)
Nov 28 09:49:19 np0005538513.localdomain sudo[282799]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:20 np0005538513.localdomain sudo[282976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:20 np0005538513.localdomain sudo[282976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:20 np0005538513.localdomain sudo[282976]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:20 np0005538513.localdomain sudo[282994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:20 np0005538513.localdomain sudo[282994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:20 np0005538513.localdomain sudo[282994]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:22.361 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:24.152 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:49:26 np0005538513.localdomain systemd[1]: tmp-crun.c4Cjx6.mount: Deactivated successfully.
Nov 28 09:49:26 np0005538513.localdomain podman[283012]: 2025-11-28 09:49:26.85744918 +0000 UTC m=+0.091944529 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:49:26 np0005538513.localdomain podman[283012]: 2025-11-28 09:49:26.870726721 +0000 UTC m=+0.105222090 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc.)
Nov 28 09:49:26 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:49:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:27.365 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:29.155 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:49:31 np0005538513.localdomain podman[283032]: 2025-11-28 09:49:31.842518431 +0000 UTC m=+0.077902934 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:49:31 np0005538513.localdomain podman[283032]: 2025-11-28 09:49:31.855460862 +0000 UTC m=+0.090845265 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:49:31 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.128 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.129 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.154 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.156 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.156 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.157 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.178 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.179 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.179 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.179 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.180 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.394 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.658 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.732 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.732 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.948 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.949 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12263MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.950 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:49:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:32.950 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.052 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.053 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.101 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.568 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.577 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.605 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.609 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:49:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:33.610 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.227 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.228 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.228 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.779 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.780 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.780 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:49:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:34.780 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:49:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:35.300 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:49:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:35.322 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:49:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:35.323 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:49:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:49:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:49:35 np0005538513.localdomain systemd[1]: tmp-crun.je0iug.mount: Deactivated successfully.
Nov 28 09:49:35 np0005538513.localdomain podman[283100]: 2025-11-28 09:49:35.875100557 +0000 UTC m=+0.105220851 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:49:35 np0005538513.localdomain podman[283101]: 2025-11-28 09:49:35.971623307 +0000 UTC m=+0.195423195 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 09:49:35 np0005538513.localdomain podman[283100]: 2025-11-28 09:49:35.97690298 +0000 UTC m=+0.207023304 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:49:35 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:49:36 np0005538513.localdomain podman[283101]: 2025-11-28 09:49:36.027667533 +0000 UTC m=+0.251467421 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:49:36 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:49:36 np0005538513.localdomain systemd[1]: tmp-crun.zK7VQt.mount: Deactivated successfully.
Nov 28 09:49:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:37.397 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:49:37 np0005538513.localdomain podman[283142]: 2025-11-28 09:49:37.855240485 +0000 UTC m=+0.089090350 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:49:37 np0005538513.localdomain podman[283142]: 2025-11-28 09:49:37.864976987 +0000 UTC m=+0.098826842 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:49:37 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:49:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:39.198 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:49:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:49:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:49:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1"
Nov 28 09:49:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:49:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17719 "" "Go-http-client/1.1"
Nov 28 09:49:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:42.430 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:44.210 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:45 np0005538513.localdomain sudo[283162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:45 np0005538513.localdomain sudo[283162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:49:45 np0005538513.localdomain sudo[283162]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:45 np0005538513.localdomain sudo[283181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:49:45 np0005538513.localdomain sudo[283181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:45 np0005538513.localdomain podman[283180]: 2025-11-28 09:49:45.325299382 +0000 UTC m=+0.089126251 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:49:45 np0005538513.localdomain podman[283180]: 2025-11-28 09:49:45.339846243 +0000 UTC m=+0.103673082 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:49:45 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:49:45 np0005538513.localdomain sudo[283181]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:46 np0005538513.localdomain sudo[283251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:46 np0005538513.localdomain sudo[283251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:49:46 np0005538513.localdomain sudo[283251]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:46 np0005538513.localdomain sudo[283270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 09:49:46 np0005538513.localdomain sudo[283270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:46 np0005538513.localdomain systemd[1]: tmp-crun.mtRmt6.mount: Deactivated successfully.
Nov 28 09:49:46 np0005538513.localdomain podman[283269]: 2025-11-28 09:49:46.379375844 +0000 UTC m=+0.095108157 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:49:46 np0005538513.localdomain podman[283269]: 2025-11-28 09:49:46.415116131 +0000 UTC m=+0.130848454 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:49:46 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:49:47 np0005538513.localdomain podman[283346]: 
Nov 28 09:49:47 np0005538513.localdomain podman[283346]: 2025-11-28 09:49:47.047533002 +0000 UTC m=+0.071123285 container create 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: Started libpod-conmon-1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55.scope.
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:49:47 np0005538513.localdomain podman[283346]: 2025-11-28 09:49:47.11460842 +0000 UTC m=+0.138198713 container init 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-type=git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Nov 28 09:49:47 np0005538513.localdomain podman[283346]: 2025-11-28 09:49:47.019830344 +0000 UTC m=+0.043420687 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:47 np0005538513.localdomain podman[283346]: 2025-11-28 09:49:47.125205098 +0000 UTC m=+0.148795391 container start 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:49:47 np0005538513.localdomain podman[283346]: 2025-11-28 09:49:47.12722367 +0000 UTC m=+0.150813963 container attach 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, release=553, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:49:47 np0005538513.localdomain brave_shannon[283361]: 167 167
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: libpod-1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55.scope: Deactivated successfully.
Nov 28 09:49:47 np0005538513.localdomain podman[283346]: 2025-11-28 09:49:47.129075888 +0000 UTC m=+0.152666201 container died 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:49:47 np0005538513.localdomain podman[283366]: 2025-11-28 09:49:47.232409049 +0000 UTC m=+0.089153073 container remove 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: libpod-conmon-1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55.scope: Deactivated successfully.
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ec195e5df26607ad6d5618c97609d1a4a8af520c30e3b0fdd1937c921be2a8a4-merged.mount: Deactivated successfully.
Nov 28 09:49:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:47.434 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:47 np0005538513.localdomain podman[283388]: 
Nov 28 09:49:47 np0005538513.localdomain podman[283388]: 2025-11-28 09:49:47.453502387 +0000 UTC m=+0.064699785 container create a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, version=7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, io.buildah.version=1.33.12, GIT_CLEAN=True)
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: Started libpod-conmon-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope.
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: tmp-crun.HFbxsi.mount: Deactivated successfully.
Nov 28 09:49:47 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:49:47 np0005538513.localdomain podman[283388]: 2025-11-28 09:49:47.422730534 +0000 UTC m=+0.033927962 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:47 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538513.localdomain podman[283388]: 2025-11-28 09:49:47.532263587 +0000 UTC m=+0.143460985 container init a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:49:47 np0005538513.localdomain podman[283388]: 2025-11-28 09:49:47.544157215 +0000 UTC m=+0.155354613 container start a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:49:47 np0005538513.localdomain podman[283388]: 2025-11-28 09:49:47.544446754 +0000 UTC m=+0.155644222 container attach a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Nov 28 09:49:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:49:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:49:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:49:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:49:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:49:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]: [
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:     {
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "available": false,
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "ceph_device": false,
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "lsm_data": {},
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "lvs": [],
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "path": "/dev/sr0",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "rejected_reasons": [
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "Insufficient space (<5GB)",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "Has a FileSystem"
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         ],
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         "sys_api": {
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "actuators": null,
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "device_nodes": "sr0",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "human_readable_size": "482.00 KB",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "id_bus": "ata",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "model": "QEMU DVD-ROM",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "nr_requests": "2",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "partitions": {},
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "path": "/dev/sr0",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "removable": "1",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "rev": "2.5+",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "ro": "0",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "rotational": "1",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "sas_address": "",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "sas_device_handle": "",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "scheduler_mode": "mq-deadline",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "sectors": 0,
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "sectorsize": "2048",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "size": 493568.0,
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "support_discard": "0",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "type": "disk",
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:             "vendor": "QEMU"
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:         }
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]:     }
Nov 28 09:49:48 np0005538513.localdomain tender_poitras[283403]: ]
Nov 28 09:49:48 np0005538513.localdomain systemd[1]: libpod-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope: Deactivated successfully.
Nov 28 09:49:48 np0005538513.localdomain systemd[1]: libpod-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope: Consumed 1.000s CPU time.
Nov 28 09:49:48 np0005538513.localdomain podman[285428]: 2025-11-28 09:49:48.627187444 +0000 UTC m=+0.052871239 container died a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main)
Nov 28 09:49:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37-merged.mount: Deactivated successfully.
Nov 28 09:49:48 np0005538513.localdomain podman[285428]: 2025-11-28 09:49:48.666648166 +0000 UTC m=+0.092331911 container remove a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:49:48 np0005538513.localdomain systemd[1]: libpod-conmon-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope: Deactivated successfully.
Nov 28 09:49:48 np0005538513.localdomain sudo[283270]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:49.213 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:49:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:49:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:49:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:49:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:49:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:49:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:52.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:54.239 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:54 np0005538513.localdomain sshd[281919]: Received disconnect from 192.168.122.11 port 47732:11: disconnected by user
Nov 28 09:49:54 np0005538513.localdomain sshd[281919]: Disconnected from user tripleo-admin 192.168.122.11 port 47732
Nov 28 09:49:54 np0005538513.localdomain sshd[281900]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 28 09:49:54 np0005538513.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Nov 28 09:49:54 np0005538513.localdomain systemd[1]: session-62.scope: Consumed 1.310s CPU time.
Nov 28 09:49:54 np0005538513.localdomain systemd-logind[764]: Session 62 logged out. Waiting for processes to exit.
Nov 28 09:49:54 np0005538513.localdomain systemd-logind[764]: Removed session 62.
Nov 28 09:49:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:57.461 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:49:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:49:57 np0005538513.localdomain systemd[1]: tmp-crun.790DBn.mount: Deactivated successfully.
Nov 28 09:49:57 np0005538513.localdomain podman[285443]: 2025-11-28 09:49:57.890421696 +0000 UTC m=+0.126780468 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:49:57 np0005538513.localdomain podman[285443]: 2025-11-28 09:49:57.905506743 +0000 UTC m=+0.141865485 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Nov 28 09:49:57 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:49:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:49:59.241 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.672 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9da108e-d18b-426c-b3d0-ac1b048c79f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.673433', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b2b28b8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '2f816792d133ab023d05fdc934c8941cce6aaf9687a792c9e4a3c674328a9a85'}]}, 'timestamp': '2025-11-28 09:50:00.679161', '_unique_id': 'a1435dd80b0b417992982e631dae8bf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.682 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '032e9a4e-9ca1-4a8d-a3a4-c79c81e89424', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.682220', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b2bb652-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '776d34daa26623dac303c2b68bfc2d4149390c2a6ece8836f63eab1c04b20ba2'}]}, 'timestamp': '2025-11-28 09:50:00.682694', '_unique_id': 'a70ad2025bca47fcadf77eead58144fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.684 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bae2cfb-e46c-4931-bb04-e4468bd8b08c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.684855', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b2c1e08-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '6a98c8cd672c34c63e587a07ab47c4761032d7d642a33346825b2de0fe6a3d0c'}]}, 'timestamp': '2025-11-28 09:50:00.685420', '_unique_id': '96068efbf5a14f8581b33afc6f1b3029'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.701 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.702 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9adbb5b8-2a70-4838-97ee-f45bb9452bfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.688188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b2ebc62-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'cdb7cf2a20c852b624dce72308c18d2f1e9f3aa757a8f579d71870b80e4810a4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.688188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b2ed102-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'a5a95d08d950459e244d44f5bba03b161935c07e5949e555a22e98312bced0dc'}]}, 'timestamp': '2025-11-28 09:50:00.703050', '_unique_id': '3a224b2921834537a279d47de2695c39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.705 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.705 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.706 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4da6a30-056c-4912-9f8b-d05ddb669dab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.705748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b2f4cfe-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'e4da6ea206b8048b4e06ea6b8420067d39d9e68650b5055c5be196a74e712c66'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.705748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b2f5e42-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'ef952c436d165568875968fb0b2e49646bd75f1a64c765929e1069f8172dd833'}]}, 'timestamp': '2025-11-28 09:50:00.706619', '_unique_id': 'f319fa86d0d042b2b352d636cb32008e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.708 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2989b59a-fa23-4413-9c70-d1a56b7767ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.708758', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b342486-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'b197c5dfe69b6b38e95ef403506598b471fa4d1d50bca9a09e1e21e14a03261a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.708758', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3439b2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'f65dc7cf453d5fcc0ac189e2abe1222b58e63786dadcb35a893cb47ea3630a20'}]}, 'timestamp': '2025-11-28 09:50:00.738509', '_unique_id': '08e9e3256f024357bf45dfa92ae6741e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cb9c7e2-7279-45e7-8d64-053a37f409ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.741184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b34b608-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'd1abed28a311a0e43c0416db225afa555af92289d634eeb865c70a7d90e146f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.741184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b34c710-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '7a0f1c5f5d8501aab98fd5f864917a6309140ad9ff1b2ff66156b891597f45bb'}]}, 'timestamp': '2025-11-28 09:50:00.742129', '_unique_id': '0488478a8a0340d2bcff00431cd3e51d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f865c9d3-7e1c-42ba-8e9e-e1deb08b221d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.746299', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b357c46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'fdbae69f4be147b2e9eb07bcda84d2023041b5c47dcb379a69efb749be3ded1c'}]}, 'timestamp': '2025-11-28 09:50:00.746661', '_unique_id': '0d69877709f740a49c068889f61b6c2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e09e2ca8-7da6-4eef-be5c-75cbcbac34a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.748084', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b35bfb2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'f1c4add585ba9530ffce7a133a6b7b4bbc3454a4a9392b20e71eb89e54c3e441'}]}, 'timestamp': '2025-11-28 09:50:00.748375', '_unique_id': 'b20a20b37309401f970bc5abfacb0f86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3538317-2a8b-4dcc-9feb-94d130608a79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.749663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b35fd06-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'd9c575a505836dc89091b7d0adf5275527be6fdb9c0256baf848b22082281ce7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.749663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3607b0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'a1ea0fba68f9cdc02c7309a2aa6e7d852192814f8db514e24830b95cf7a7f8cb'}]}, 'timestamp': '2025-11-28 09:50:00.750200', '_unique_id': 'b43da796be9a47e9beb3c049d74bf66b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.751 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b96c30c1-428b-445a-9661-d164a94021fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.751498', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3644d2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '8e54378e8e63562454a989cb86435c81daafa8bc75c0a4875cf0047eff087af5'}]}, 'timestamp': '2025-11-28 09:50:00.751797', '_unique_id': 'f84b370dc03047ff8f596ac76a109480'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.753 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 12160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '245e5133-7ee7-4da5-9aa3-ff7ae3e920ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12160000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:50:00.753164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9b38e0c0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.94039008, 'message_signature': 'c032ba4c6c8189ceed2efb6dd64abde04021f37a4d85a2fb77755f276878460d'}]}, 'timestamp': '2025-11-28 09:50:00.768874', '_unique_id': '45aadbe7cf9b435e94f998cc7d2dae60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.770 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d552264-0826-4061-bddc-29631105a1bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.770270', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3921f2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '4c9d94d925167ef5fef4fede70b3291799cde19725083f2712182f773edfbe7d'}]}, 'timestamp': '2025-11-28 09:50:00.770551', '_unique_id': '4a2ae6029e854b32aca7c6ad9c2bca61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa843ff-9a81-42a9-9c06-114e5602226f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.771841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b395f5a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': '901c81c2c526b54829317c51d4b867d2198f87adda922cd041864ecc97a67f38'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.771841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3969c8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': '6b96afb9bfff66001850ed917513daf0f173b855164d796b8a1747e0ad1e5d5d'}]}, 'timestamp': '2025-11-28 09:50:00.772368', '_unique_id': '58de60eed40b41018bae2ae9938fa4ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dec807d3-c145-4fd5-91ec-402b01112d7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.773652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b39a604-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '2e450f28515143a5465be1d9b0888d5409373831a1a25faa7e684d632cdab2bf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.773652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b39b072-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'ce64b22e9ca99393aef34a23701f330b24d489e8ee55389fd3d35d75c882cc54'}]}, 'timestamp': '2025-11-28 09:50:00.774179', '_unique_id': '3258da311b654fefbc0e5bdb8c9e5e68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c17ae4d-6635-43f8-a3f7-3a3f8f820382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.775495', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b39eede-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'e915c487a0a8e0d0d11dd4ab8c6981b1d0745583739a7c893d0fcf0a45817cd4'}]}, 'timestamp': '2025-11-28 09:50:00.775794', '_unique_id': 'a02d24cd17a2452096915e2962ef87c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8e01b17-0e00-4890-a0a7-29ddb7de4d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:50:00.777084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9b3a2c14-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.94039008, 'message_signature': '18e2aef03d11fff9c43004724fc2a5b69d8baa675b4c1d702bc75ff8ff0680ef'}]}, 'timestamp': '2025-11-28 09:50:00.777351', '_unique_id': '2e9dcad99eca43e89c9247829f9f5240'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.778 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20cec515-6c19-4b6b-9a61-43851a8e28b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.778714', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3a6bc0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'bb834b268a74c5a8740a07a6f9f5ed628a371168647a59c7dd105af7e59d96e3'}]}, 'timestamp': '2025-11-28 09:50:00.779070', '_unique_id': '80ade852dfd6434292da53e3ccf720e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.780 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c74bdf13-bc1a-4c4f-9242-7641649a4aa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.780544', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3ab350-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '6ec176d3e269ba29a96ce237fdb0e5b391be5de5ce8d109ecfd3d50add56a4ae'}]}, 'timestamp': '2025-11-28 09:50:00.780824', '_unique_id': '04255758d50045198cbec5fde94c0a38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dff361a4-32db-45d9-87db-d7b86cdb633b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.782093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b3aefaa-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'b7739264cd3cf6b862dd1130231a123df135170b651c0098e6c3879cb782cf55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.782093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3af928-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '1ad92ac23166250bf4a60d078d0fffaa166b1cad6f1222d24223485e6778411c'}]}, 'timestamp': '2025-11-28 09:50:00.782593', '_unique_id': '2c276588e8a646c692216772d9c12deb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '630fef5c-3059-4f1a-a054-4e483fe587b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.783878', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b3b356e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '012c3a74059d0681cd020801c74b709d88edc5a830c64211b7ff7ead3b1fee22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.783878', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3b3fc8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '60d76ac905c28f19cb09df351f86ef67478bbffd43f1f89af654ca68c5468cdb'}]}, 'timestamp': '2025-11-28 09:50:00.784400', '_unique_id': '4d1a097a7f134730a5b998b4a9306d34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:50:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:50:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:02.482 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:50:02 np0005538513.localdomain podman[285463]: 2025-11-28 09:50:02.848683055 +0000 UTC m=+0.082036102 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:50:02 np0005538513.localdomain podman[285463]: 2025-11-28 09:50:02.860647377 +0000 UTC m=+0.094000464 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:50:02 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:50:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:04.274 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Activating special unit Exit the Session...
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped target Main User Target.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped target Basic System.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped target Paths.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped target Sockets.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped target Timers.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Closed D-Bus User Message Bus Socket.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Removed slice User Application Slice.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Reached target Shutdown.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Finished Exit the Session.
Nov 28 09:50:04 np0005538513.localdomain systemd[281904]: Reached target Exit the Session.
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 28 09:50:04 np0005538513.localdomain systemd[1]: user-1003.slice: Consumed 1.772s CPU time.
Nov 28 09:50:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:50:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:50:06 np0005538513.localdomain podman[285486]: 2025-11-28 09:50:06.844094788 +0000 UTC m=+0.077848451 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:50:06 np0005538513.localdomain podman[285487]: 2025-11-28 09:50:06.921231098 +0000 UTC m=+0.151841434 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:50:06 np0005538513.localdomain podman[285487]: 2025-11-28 09:50:06.952552128 +0000 UTC m=+0.183162444 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:50:06 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:50:06 np0005538513.localdomain podman[285486]: 2025-11-28 09:50:06.971847116 +0000 UTC m=+0.205600719 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:50:06 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:50:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:07.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:08 np0005538513.localdomain sudo[285528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:08 np0005538513.localdomain sudo[285528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:50:08 np0005538513.localdomain sudo[285528]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:08 np0005538513.localdomain podman[285546]: 2025-11-28 09:50:08.462033167 +0000 UTC m=+0.100078531 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:50:08 np0005538513.localdomain podman[285546]: 2025-11-28 09:50:08.472343907 +0000 UTC m=+0.110389291 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 09:50:08 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:50:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:09.277 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:50:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:50:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:50:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1"
Nov 28 09:50:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:50:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17727 "" "Go-http-client/1.1"
Nov 28 09:50:10 np0005538513.localdomain sudo[285565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:10 np0005538513.localdomain sudo[285565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:10 np0005538513.localdomain sudo[285565]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:11 np0005538513.localdomain sudo[285583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:11 np0005538513.localdomain sudo[285583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:11 np0005538513.localdomain sudo[285583]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:12.518 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:14.319 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:50:15 np0005538513.localdomain podman[285601]: 2025-11-28 09:50:15.858633758 +0000 UTC m=+0.087901314 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:50:15 np0005538513.localdomain podman[285601]: 2025-11-28 09:50:15.89549027 +0000 UTC m=+0.124757826 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:50:15 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:50:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:50:16 np0005538513.localdomain podman[285624]: 2025-11-28 09:50:16.850514793 +0000 UTC m=+0.093079774 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:50:16 np0005538513.localdomain podman[285624]: 2025-11-28 09:50:16.894407683 +0000 UTC m=+0.136972694 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:50:16 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:50:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:17.522 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:50:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:50:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:50:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:50:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:50:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:19.324 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:22.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:24.369 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:26.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:26.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:50:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:26.790 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:50:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:26.791 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:26.791 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:50:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:26.808 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:27.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:27.816 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:27.816 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:27.817 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:50:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:50:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:28.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:28 np0005538513.localdomain podman[285643]: 2025-11-28 09:50:28.851610374 +0000 UTC m=+0.084448257 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Nov 28 09:50:28 np0005538513.localdomain podman[285643]: 2025-11-28 09:50:28.86696261 +0000 UTC m=+0.099800493 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-type=git, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc.)
Nov 28 09:50:28 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:50:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:29.373 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:29.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:29.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:30.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:31.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:31.807 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:50:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:31.807 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:50:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:31.807 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:50:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:31.808 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:50:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:31.808 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.303 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.376 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.376 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.595 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.608 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.609 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12259MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.610 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.610 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.730 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.731 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.731 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.780 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.805 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.805 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.869 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.891 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:50:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:32.925 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:50:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:33.393 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:50:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:33.399 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:50:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:33.418 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:50:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:33.420 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:50:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:33.421 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:50:33 np0005538513.localdomain sshd[285707]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:50:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:50:33 np0005538513.localdomain sshd[285707]: Invalid user ubuntu from 161.35.116.38 port 43330
Nov 28 09:50:33 np0005538513.localdomain sshd[285707]: Received disconnect from 161.35.116.38 port 43330:11:  [preauth]
Nov 28 09:50:33 np0005538513.localdomain sshd[285707]: Disconnected from invalid user ubuntu 161.35.116.38 port 43330 [preauth]
Nov 28 09:50:33 np0005538513.localdomain systemd[1]: tmp-crun.Uzc9sY.mount: Deactivated successfully.
Nov 28 09:50:33 np0005538513.localdomain podman[285709]: 2025-11-28 09:50:33.846905091 +0000 UTC m=+0.085275153 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:50:33 np0005538513.localdomain podman[285709]: 2025-11-28 09:50:33.855321152 +0000 UTC m=+0.093691224 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:50:33 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:50:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:34.409 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:34.421 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:34.421 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:50:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:34.421 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:50:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:50:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:50:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:50:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:50:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:36.208 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:50:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:36.227 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:50:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:36.228 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:50:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:36.228 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:37 np0005538513.localdomain sudo[285733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:37 np0005538513.localdomain sudo[285733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:50:37 np0005538513.localdomain sudo[285733]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:50:37 np0005538513.localdomain systemd[1]: tmp-crun.UbH7NU.mount: Deactivated successfully.
Nov 28 09:50:37 np0005538513.localdomain systemd[1]: tmp-crun.SB8fHW.mount: Deactivated successfully.
Nov 28 09:50:37 np0005538513.localdomain podman[285751]: 2025-11-28 09:50:37.208321236 +0000 UTC m=+0.132707762 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:50:37 np0005538513.localdomain podman[285752]: 2025-11-28 09:50:37.178241275 +0000 UTC m=+0.102761755 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:50:37 np0005538513.localdomain podman[285752]: 2025-11-28 09:50:37.270498473 +0000 UTC m=+0.195018933 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 09:50:37 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:50:37 np0005538513.localdomain podman[285751]: 2025-11-28 09:50:37.312469483 +0000 UTC m=+0.236856059 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 28 09:50:37 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:50:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:37.599 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:38 np0005538513.localdomain sudo[285792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:38 np0005538513.localdomain sudo[285792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:38 np0005538513.localdomain sudo[285792]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:50:38 np0005538513.localdomain podman[285810]: 2025-11-28 09:50:38.852196457 +0000 UTC m=+0.082063123 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:50:38 np0005538513.localdomain podman[285810]: 2025-11-28 09:50:38.864377535 +0000 UTC m=+0.094244211 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 09:50:38 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:50:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:39.411 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:39 np0005538513.localdomain sudo[285829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:39 np0005538513.localdomain sudo[285829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:39 np0005538513.localdomain sudo[285829]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:50:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:50:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:50:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1"
Nov 28 09:50:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:50:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17724 "" "Go-http-client/1.1"
Nov 28 09:50:40 np0005538513.localdomain sudo[285847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:50:41 np0005538513.localdomain sudo[285847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:41 np0005538513.localdomain sudo[285847]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:41 np0005538513.localdomain sudo[285865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:50:41 np0005538513.localdomain sudo[285865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:41 np0005538513.localdomain podman[285925]: 
Nov 28 09:50:41 np0005538513.localdomain podman[285925]: 2025-11-28 09:50:41.655973839 +0000 UTC m=+0.075347826 container create 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, ceph=True, release=553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:50:41 np0005538513.localdomain systemd[1]: Started libpod-conmon-2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8.scope.
Nov 28 09:50:41 np0005538513.localdomain podman[285925]: 2025-11-28 09:50:41.623326927 +0000 UTC m=+0.042700944 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:50:41 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:50:41 np0005538513.localdomain podman[285925]: 2025-11-28 09:50:41.738528796 +0000 UTC m=+0.157902783 container init 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:50:41 np0005538513.localdomain podman[285925]: 2025-11-28 09:50:41.747107842 +0000 UTC m=+0.166481839 container start 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, distribution-scope=public, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph)
Nov 28 09:50:41 np0005538513.localdomain podman[285925]: 2025-11-28 09:50:41.747303078 +0000 UTC m=+0.166677105 container attach 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container)
Nov 28 09:50:41 np0005538513.localdomain nervous_knuth[285939]: 167 167
Nov 28 09:50:41 np0005538513.localdomain systemd[1]: libpod-2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8.scope: Deactivated successfully.
Nov 28 09:50:41 np0005538513.localdomain podman[285925]: 2025-11-28 09:50:41.749392392 +0000 UTC m=+0.168766399 container died 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=7)
Nov 28 09:50:41 np0005538513.localdomain podman[285944]: 2025-11-28 09:50:41.849671048 +0000 UTC m=+0.088097829 container remove 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:50:41 np0005538513.localdomain systemd[1]: libpod-conmon-2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8.scope: Deactivated successfully.
Nov 28 09:50:41 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:50:42 np0005538513.localdomain systemd-sysv-generator[285989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:50:42 np0005538513.localdomain systemd-rc-local-generator[285983]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bcd07bdf35e1bb658fa32b5ad0055ef57cca0afdc31ec71ba07544af3a3513aa-merged.mount: Deactivated successfully.
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:50:42 np0005538513.localdomain systemd-rc-local-generator[286027]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:50:42 np0005538513.localdomain systemd-sysv-generator[286031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:42.632 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:42 np0005538513.localdomain systemd[1]: Starting Ceph mgr.np0005538513.dsfdlx for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:50:43 np0005538513.localdomain podman[286087]: 
Nov 28 09:50:43 np0005538513.localdomain podman[286087]: 2025-11-28 09:50:43.067430111 +0000 UTC m=+0.064390676 container create 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True)
Nov 28 09:50:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/var/lib/ceph/mgr/ceph-np0005538513.dsfdlx supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:43 np0005538513.localdomain podman[286087]: 2025-11-28 09:50:43.133840378 +0000 UTC m=+0.130800943 container init 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:50:43 np0005538513.localdomain podman[286087]: 2025-11-28 09:50:43.036942286 +0000 UTC m=+0.033902881 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:50:43 np0005538513.localdomain podman[286087]: 2025-11-28 09:50:43.151646529 +0000 UTC m=+0.148607084 container start 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, GIT_CLEAN=True)
Nov 28 09:50:43 np0005538513.localdomain bash[286087]: 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830
Nov 28 09:50:43 np0005538513.localdomain systemd[1]: Started Ceph mgr.np0005538513.dsfdlx for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: pidfile_write: ignore empty --pid-file
Nov 28 09:50:43 np0005538513.localdomain sudo[285865]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'alerts'
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'balancer'
Nov 28 09:50:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:43.324+0000 7fc543976140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'cephadm'
Nov 28 09:50:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:43.393+0000 7fc543976140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:50:43 np0005538513.localdomain systemd[1]: tmp-crun.fC1J0g.mount: Deactivated successfully.
Nov 28 09:50:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'crash'
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'dashboard'
Nov 28 09:50:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.026+0000 7fc543976140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:44.445 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'devicehealth'
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'diskprediction_local'
Nov 28 09:50:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.653+0000 7fc543976140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 28 09:50:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 28 09:50:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]:   from numpy import show_config as show_numpy_config
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.801+0000 7fc543976140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'influx'
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'insights'
Nov 28 09:50:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.867+0000 7fc543976140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:50:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'iostat'
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'k8sevents'
Nov 28 09:50:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:45.002+0000 7fc543976140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'localpool'
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'mds_autoscaler'
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'mirroring'
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'nfs'
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'orchestrator'
Nov 28 09:50:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:45.796+0000 7fc543976140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'osd_perf_query'
Nov 28 09:50:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:45.945+0000 7fc543976140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'osd_support'
Nov 28 09:50:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.011+0000 7fc543976140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'pg_autoscaler'
Nov 28 09:50:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.068+0000 7fc543976140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.139+0000 7fc543976140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'progress'
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.197+0000 7fc543976140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'prometheus'
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'rbd_support'
Nov 28 09:50:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.490+0000 7fc543976140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'restful'
Nov 28 09:50:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.574+0000 7fc543976140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'rgw'
Nov 28 09:50:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:50:46 np0005538513.localdomain podman[286135]: 2025-11-28 09:50:46.84155068 +0000 UTC m=+0.077327546 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:50:46 np0005538513.localdomain podman[286135]: 2025-11-28 09:50:46.852384446 +0000 UTC m=+0.088161282 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:50:46 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:50:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'rook'
Nov 28 09:50:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.898+0000 7fc543976140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'selftest'
Nov 28 09:50:47 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.331+0000 7fc543976140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'snap_schedule'
Nov 28 09:50:47 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.390+0000 7fc543976140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'stats'
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'status'
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'telegraf'
Nov 28 09:50:47 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.582+0000 7fc543976140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:47.636 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'telemetry'
Nov 28 09:50:47 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.642+0000 7fc543976140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.779+0000 7fc543976140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'test_orchestrator'
Nov 28 09:50:47 np0005538513.localdomain podman[286158]: 2025-11-28 09:50:47.846866191 +0000 UTC m=+0.083436576 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 09:50:47 np0005538513.localdomain podman[286158]: 2025-11-28 09:50:47.862398513 +0000 UTC m=+0.098968898 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:50:47 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:50:47 np0005538513.localdomain sudo[286170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:47 np0005538513.localdomain sudo[286170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:47 np0005538513.localdomain sudo[286170]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.931+0000 7fc543976140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'volumes'
Nov 28 09:50:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:50:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:50:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:50:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:50:48 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:50:48 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'zabbix'
Nov 28 09:50:48 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:48.131+0000 7fc543976140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:50:48 np0005538513.localdomain sudo[286195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:50:48 np0005538513.localdomain sudo[286195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:48 np0005538513.localdomain sudo[286195]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:48 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:50:48 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:48.192+0000 7fc543976140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:50:48 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 09:50:48 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.103:6800/705940825
Nov 28 09:50:48 np0005538513.localdomain sudo[286213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:50:48 np0005538513.localdomain sudo[286213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:49 np0005538513.localdomain podman[286303]: 2025-11-28 09:50:49.063700954 +0000 UTC m=+0.095959803 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, release=553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12)
Nov 28 09:50:49 np0005538513.localdomain podman[286303]: 2025-11-28 09:50:49.233601948 +0000 UTC m=+0.265860787 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:50:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:49.448 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:49 np0005538513.localdomain sudo[286213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:49 np0005538513.localdomain sudo[286405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:50:49 np0005538513.localdomain sudo[286405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:49 np0005538513.localdomain sudo[286405]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:49 np0005538513.localdomain sudo[286423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:50:49 np0005538513.localdomain sudo[286423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:50 np0005538513.localdomain sudo[286423]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:50:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:50:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:50:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:50:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:50:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:50:51 np0005538513.localdomain sudo[286473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:51 np0005538513.localdomain sudo[286473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:51 np0005538513.localdomain sudo[286473]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:51 np0005538513.localdomain sudo[286491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:51 np0005538513.localdomain sudo[286491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:51 np0005538513.localdomain sudo[286491]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:52 np0005538513.localdomain sudo[286509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:52 np0005538513.localdomain sudo[286509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:52 np0005538513.localdomain sudo[286509]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:52.677 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:53 np0005538513.localdomain sudo[286527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:50:53 np0005538513.localdomain sudo[286527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286527]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:50:53 np0005538513.localdomain sudo[286545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286545]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:50:53 np0005538513.localdomain sudo[286563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286563]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:50:53 np0005538513.localdomain sudo[286581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286581]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:50:53 np0005538513.localdomain sudo[286599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286599]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:50:53 np0005538513.localdomain sudo[286633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286633]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:50:53 np0005538513.localdomain sudo[286651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286651]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:50:53 np0005538513.localdomain sudo[286669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286669]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:50:53 np0005538513.localdomain sudo[286687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286687]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:53 np0005538513.localdomain sudo[286705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:50:53 np0005538513.localdomain sudo[286705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:53 np0005538513.localdomain sudo[286705]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:50:54 np0005538513.localdomain sudo[286723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286723]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:50:54 np0005538513.localdomain sudo[286741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286741]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:50:54 np0005538513.localdomain sudo[286759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286759]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:50:54 np0005538513.localdomain sudo[286793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286793]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:50:54 np0005538513.localdomain sudo[286811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286811]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:50:54 np0005538513.localdomain sudo[286829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:54.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:54 np0005538513.localdomain sudo[286829]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:50:54 np0005538513.localdomain sudo[286847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286847]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:50:54 np0005538513.localdomain sudo[286865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286865]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:50:54 np0005538513.localdomain sudo[286883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286883]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:50:54 np0005538513.localdomain sudo[286901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286901]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:54 np0005538513.localdomain sudo[286919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:50:54 np0005538513.localdomain sudo[286919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:54 np0005538513.localdomain sudo[286919]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[286953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:50:55 np0005538513.localdomain sudo[286953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[286953]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[286971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:50:55 np0005538513.localdomain sudo[286971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[286971]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[286989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:50:55 np0005538513.localdomain sudo[286989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[286989]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:50:55 np0005538513.localdomain sudo[287007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287007]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:50:55 np0005538513.localdomain sudo[287025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287025]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:50:55 np0005538513.localdomain sudo[287043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287043]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:50:55 np0005538513.localdomain sudo[287061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287061]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:50:55 np0005538513.localdomain sudo[287079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287079]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:50:55 np0005538513.localdomain sudo[287113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287113]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:50:55 np0005538513.localdomain sudo[287131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287131]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:55 np0005538513.localdomain sudo[287149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:50:55 np0005538513.localdomain sudo[287149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:55 np0005538513.localdomain sudo[287149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:56 np0005538513.localdomain sudo[287167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:56 np0005538513.localdomain sudo[287167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:56 np0005538513.localdomain sudo[287167]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:57.677 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:59 np0005538513.localdomain sudo[287185]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:59 np0005538513.localdomain sudo[287185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:50:59 np0005538513.localdomain sudo[287185]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:50:59.488 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:50:59 np0005538513.localdomain podman[287203]: 2025-11-28 09:50:59.547287611 +0000 UTC m=+0.100791783 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 28 09:50:59 np0005538513.localdomain podman[287203]: 2025-11-28 09:50:59.592345627 +0000 UTC m=+0.145849789 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 28 09:50:59 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:51:01 np0005538513.localdomain sshd[287222]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:51:02 np0005538513.localdomain sshd[287222]: Received disconnect from 80.94.93.233 port 29814:11:  [preauth]
Nov 28 09:51:02 np0005538513.localdomain sshd[287222]: Disconnected from authenticating user root 80.94.93.233 port 29814 [preauth]
Nov 28 09:51:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:02.711 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:04.511 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:51:04 np0005538513.localdomain podman[287224]: 2025-11-28 09:51:04.842359993 +0000 UTC m=+0.080454003 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:51:04 np0005538513.localdomain podman[287224]: 2025-11-28 09:51:04.854396036 +0000 UTC m=+0.092490026 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:51:04 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:51:05 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 09:51:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:07.717 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:51:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:51:07 np0005538513.localdomain systemd[1]: tmp-crun.CwGzte.mount: Deactivated successfully.
Nov 28 09:51:07 np0005538513.localdomain podman[287248]: 2025-11-28 09:51:07.864877669 +0000 UTC m=+0.097606284 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 09:51:07 np0005538513.localdomain podman[287248]: 2025-11-28 09:51:07.897328124 +0000 UTC m=+0.130056799 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:51:07 np0005538513.localdomain systemd[1]: tmp-crun.wqgQVX.mount: Deactivated successfully.
Nov 28 09:51:07 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:51:07 np0005538513.localdomain podman[287247]: 2025-11-28 09:51:07.915355163 +0000 UTC m=+0.152819975 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:51:07 np0005538513.localdomain podman[287247]: 2025-11-28 09:51:07.952434561 +0000 UTC m=+0.189899373 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:51:07 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:51:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:09.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:51:09 np0005538513.localdomain podman[287290]: 2025-11-28 09:51:09.852680085 +0000 UTC m=+0.090420182 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:51:09 np0005538513.localdomain podman[287290]: 2025-11-28 09:51:09.867421792 +0000 UTC m=+0.105161939 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 09:51:09 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:51:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:51:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:51:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:51:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151667 "" "Go-http-client/1.1"
Nov 28 09:51:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:51:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18217 "" "Go-http-client/1.1"
Nov 28 09:51:11 np0005538513.localdomain sudo[287308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:11 np0005538513.localdomain sudo[287308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:11 np0005538513.localdomain sudo[287308]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:11 np0005538513.localdomain sudo[287326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:11 np0005538513.localdomain sudo[287326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:11 np0005538513.localdomain podman[287387]: 
Nov 28 09:51:11 np0005538513.localdomain podman[287387]: 2025-11-28 09:51:11.749063668 +0000 UTC m=+0.083889279 container create 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12)
Nov 28 09:51:11 np0005538513.localdomain systemd[1]: Started libpod-conmon-2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98.scope.
Nov 28 09:51:11 np0005538513.localdomain podman[287387]: 2025-11-28 09:51:11.71392014 +0000 UTC m=+0.048745811 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:11 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:11 np0005538513.localdomain podman[287387]: 2025-11-28 09:51:11.829998116 +0000 UTC m=+0.164823777 container init 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-type=git, ceph=True, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, architecture=x86_64)
Nov 28 09:51:11 np0005538513.localdomain systemd[1]: tmp-crun.l9r6y6.mount: Deactivated successfully.
Nov 28 09:51:11 np0005538513.localdomain podman[287387]: 2025-11-28 09:51:11.844051841 +0000 UTC m=+0.178877452 container start 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 28 09:51:11 np0005538513.localdomain podman[287387]: 2025-11-28 09:51:11.844404132 +0000 UTC m=+0.179229753 container attach 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph)
Nov 28 09:51:11 np0005538513.localdomain infallible_shannon[287402]: 167 167
Nov 28 09:51:11 np0005538513.localdomain systemd[1]: libpod-2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98.scope: Deactivated successfully.
Nov 28 09:51:11 np0005538513.localdomain podman[287387]: 2025-11-28 09:51:11.848673574 +0000 UTC m=+0.183499215 container died 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, release=553, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, ceph=True, GIT_CLEAN=True)
Nov 28 09:51:11 np0005538513.localdomain podman[287407]: 2025-11-28 09:51:11.962438648 +0000 UTC m=+0.099262925 container remove 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph)
Nov 28 09:51:11 np0005538513.localdomain systemd[1]: libpod-conmon-2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98.scope: Deactivated successfully.
Nov 28 09:51:12 np0005538513.localdomain podman[287426]: 
Nov 28 09:51:12 np0005538513.localdomain podman[287426]: 2025-11-28 09:51:12.076478141 +0000 UTC m=+0.082042742 container create d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=)
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: Started libpod-conmon-d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934.scope.
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:12 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:12 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:12 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:12 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:12 np0005538513.localdomain podman[287426]: 2025-11-28 09:51:12.041102086 +0000 UTC m=+0.046666727 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:12 np0005538513.localdomain podman[287426]: 2025-11-28 09:51:12.140089412 +0000 UTC m=+0.145654003 container init d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64)
Nov 28 09:51:12 np0005538513.localdomain podman[287426]: 2025-11-28 09:51:12.149548354 +0000 UTC m=+0.155112955 container start d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True)
Nov 28 09:51:12 np0005538513.localdomain podman[287426]: 2025-11-28 09:51:12.149867364 +0000 UTC m=+0.155431995 container attach d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=)
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: libpod-d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934.scope: Deactivated successfully.
Nov 28 09:51:12 np0005538513.localdomain podman[287426]: 2025-11-28 09:51:12.253835114 +0000 UTC m=+0.259399755 container died d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.buildah.version=1.33.12, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True)
Nov 28 09:51:12 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 09:51:12 np0005538513.localdomain podman[287467]: 2025-11-28 09:51:12.351095657 +0000 UTC m=+0.085006183 container remove d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vcs-type=git, CEPH_POINT_RELEASE=, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., version=7)
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: libpod-conmon-d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934.scope: Deactivated successfully.
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:51:12 np0005538513.localdomain systemd-rc-local-generator[287504]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:51:12 np0005538513.localdomain systemd-sysv-generator[287509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-06e86172f19db509740997f9d4331839abf2fe8a6e515afd8687873657279d09-merged.mount: Deactivated successfully.
Nov 28 09:51:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:12.746 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:51:12 np0005538513.localdomain systemd-sysv-generator[287553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:51:12 np0005538513.localdomain systemd-rc-local-generator[287548]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:12 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:13 np0005538513.localdomain systemd[1]: Starting Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:51:13 np0005538513.localdomain podman[287610]: 
Nov 28 09:51:13 np0005538513.localdomain podman[287610]: 2025-11-28 09:51:13.52043683 +0000 UTC m=+0.085051386 container create b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, vendor=Red Hat, Inc., name=rhceph, ceph=True)
Nov 28 09:51:13 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:13 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:13 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:13 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:13 np0005538513.localdomain podman[287610]: 2025-11-28 09:51:13.581509181 +0000 UTC m=+0.146123737 container init b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, vcs-type=git, io.buildah.version=1.33.12, release=553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=)
Nov 28 09:51:13 np0005538513.localdomain podman[287610]: 2025-11-28 09:51:13.487976744 +0000 UTC m=+0.052591340 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:13 np0005538513.localdomain podman[287610]: 2025-11-28 09:51:13.594101152 +0000 UTC m=+0.158715718 container start b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, io.buildah.version=1.33.12, release=553, distribution-scope=public, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:51:13 np0005538513.localdomain bash[287610]: b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613
Nov 28 09:51:13 np0005538513.localdomain systemd[1]: Started Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pidfile_write: ignore empty --pid-file
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: load: jerasure load: lrc 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: RocksDB version: 7.9.2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Git sha 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: DB SUMMARY
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: DB Session ID:  ND9860OIBZS6OJ35KIN7
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: CURRENT file:  CURRENT
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005538513/store.db dir, Total Num: 0, files: 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005538513/store.db: 000004.log size: 886 ; 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                         Options.error_if_exists: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                       Options.create_if_missing: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                                     Options.env: 0x556f56dc39e0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                                Options.info_log: 0x556f59286d20
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                              Options.statistics: (nil)
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                               Options.use_fsync: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                              Options.db_log_dir: 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                                 Options.wal_dir: 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                    Options.write_buffer_manager: 0x556f59297540
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.unordered_write: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                               Options.row_cache: None
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                              Options.wal_filter: None
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.two_write_queues: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.wal_compression: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.atomic_flush: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.max_background_jobs: 2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.max_background_compactions: -1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.max_subcompactions: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.max_total_wal_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                          Options.max_open_files: -1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:       Options.compaction_readahead_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Compression algorithms supported:
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kZSTD supported: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kXpressCompression supported: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kBZip2Compression supported: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kLZ4Compression supported: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kZlibCompression supported: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         kSnappyCompression supported: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:           Options.merge_operator: 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:        Options.compaction_filter: None
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f59286980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x556f59283350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:        Options.write_buffer_size: 33554432
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:  Options.max_write_buffer_number: 2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:          Options.compression: NoCompression
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.num_levels: 7
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                           Options.bloom_locality: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                               Options.ttl: 2592000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                       Options.enable_blob_files: false
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                           Options.min_blob_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 09:51:13 np0005538513.localdomain sudo[287326]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 53876326-3f1b-4342-b386-ddefe9bbd825
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323473641079, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323473643656, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323473643813, "job": 1, "event": "recovery_finished"}
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x556f592aae00
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: DB pointer 0x556f593a0000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513 does not exist in monmap, will attempt to join an existing cluster
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.96 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.96 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x556f59283350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.8e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: using public_addr v2:172.18.0.106:0/0 -> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0]
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: starting mon.np0005538513 rank -1 at public addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] at bind addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005538513 fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(???) e0 preinit fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing) e5 sync_obtain_latest_monmap
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing) e5 sync_obtain_latest_monmap obtained monmap e5
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing).mds e17 new map
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        15
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-11-28T08:07:30.958224+0000
                                                           modified        2025-11-28T09:49:53.259185+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        83
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26449}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26449 members: 26449
                                                           [mds.mds.np0005538514.umgtoy{0:26449} state up:active seq 12 addr [v2:172.18.0.107:6808/1969410151,v1:172.18.0.107:6809/1969410151] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005538513.yljthc{-1:16968} state up:standby seq 1 addr [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005538515.anvatb{-1:26446} state up:standby seq 1 addr [v2:172.18.0.108:6808/2640180,v1:172.18.0.108:6809/2640180] compat {c=[1],r=[1],i=[17ff]}]
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 3314933000852226048, adjusting msgr requires
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mgr to host np0005538513.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3816: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17100 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538514.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mgr to host np0005538514.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17106 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538515.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mgr to host np0005538515.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3817: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17112 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Saving service mgr spec with placement label:mgr
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Deploying daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3818: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17118 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Deploying daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3819: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17130 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538510.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mon to host np0005538510.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17136 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538510.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label _admin to host np0005538510.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Deploying daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3820: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17148 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538511.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mon to host np0005538511.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3821: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17160 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538511.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label _admin to host np0005538511.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Standby manager daemon np0005538513.dsfdlx started
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mgrmap e12: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17166 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538512.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mon to host np0005538512.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3822: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Standby manager daemon np0005538514.djozup started
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17172 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538512.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label _admin to host np0005538512.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mgrmap e13: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17178 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538513.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mon to host np0005538513.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3823: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Standby manager daemon np0005538515.yfkzhl started
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17184 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538513.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label _admin to host np0005538513.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3824: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17190 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538514.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mon to host np0005538514.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17196 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538514.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label _admin to host np0005538514.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3825: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17202 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538515.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label mon to host np0005538515.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3826: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17208 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538515.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Added label _admin to host np0005538515.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17214 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Saving service mon spec with placement label:mon
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3827: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.26635 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3828: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Deploying daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3829: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Deploying daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3830: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538510 calling monitor election
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538512 calling monitor election
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538511 calling monitor election
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3831: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538515 calling monitor election
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: pgmap v3832: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515 in quorum (ranks 0,1,2,3)
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: monmap epoch 4
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: last_changed 2025-11-28T09:51:05.886382+0000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: min_mon_release 18 (reef)
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: election_strategy: 1
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538510
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: osdmap e85: 6 total, 6 up, 6 in
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: overall HEALTH_OK
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: Deploying daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='client.17228 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Nov 28 09:51:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:14.539 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:51:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:17.752 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:17 np0005538513.localdomain sudo[287674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:51:17 np0005538513.localdomain sudo[287674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:17 np0005538513.localdomain podman[287668]: 2025-11-28 09:51:17.830242143 +0000 UTC m=+0.073469257 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:51:17 np0005538513.localdomain sudo[287674]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:17 np0005538513.localdomain podman[287668]: 2025-11-28 09:51:17.835100324 +0000 UTC m=+0.078327368 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:51:17 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:51:17 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 09:51:18 np0005538513.localdomain sudo[287708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:51:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:51:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:51:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:51:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:51:18 np0005538513.localdomain sudo[287708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:51:18 np0005538513.localdomain sudo[287708]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:18 np0005538513.localdomain sudo[287732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:51:18 np0005538513.localdomain sudo[287732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:18 np0005538513.localdomain podman[287726]: 2025-11-28 09:51:18.202222916 +0000 UTC m=+0.092139805 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:51:18 np0005538513.localdomain podman[287726]: 2025-11-28 09:51:18.23947576 +0000 UTC m=+0.129392599 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 28 09:51:18 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:51:19 np0005538513.localdomain systemd[1]: tmp-crun.G2Ewit.mount: Deactivated successfully.
Nov 28 09:51:19 np0005538513.localdomain podman[287832]: 2025-11-28 09:51:19.080617355 +0000 UTC m=+0.105449727 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-type=git, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:51:19 np0005538513.localdomain podman[287832]: 2025-11-28 09:51:19.190711156 +0000 UTC m=+0.215543548 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container)
Nov 28 09:51:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:19.542 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:19 np0005538513.localdomain sudo[287732]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:19 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@-1(probing) e6  my rank is now 5 (was -1)
Nov 28 09:51:19 np0005538513.localdomain ceph-mon[287629]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:51:19 np0005538513.localdomain ceph-mon[287629]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 28 09:51:19 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:22.792 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: pgmap v3833: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538511 calling monitor election
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538510 calling monitor election
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538515 calling monitor election
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538512 calling monitor election
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: pgmap v3834: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538514 calling monitor election
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: pgmap v3835: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3,4)
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: monmap epoch 5
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: last_changed 2025-11-28T09:51:12.314668+0000
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: min_mon_release 18 (reef)
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: election_strategy: 1
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538510
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: osdmap e85: 6 total, 6 up, 6 in
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: overall HEALTH_OK
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:22 np0005538513.localdomain ceph-mon[287629]: mgrc update_daemon_metadata mon.np0005538513 metadata {addrs=[v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005538513.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005538513.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='client.26646 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: pgmap v3836: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538511 calling monitor election
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538515 calling monitor election
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538510 calling monitor election
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538512 calling monitor election
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538514 calling monitor election
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: pgmap v3837: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513 calling monitor election
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: pgmap v3838: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4,5)
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: monmap epoch 6
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: last_changed 2025-11-28T09:51:17.896997+0000
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: min_mon_release 18 (reef)
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: election_strategy: 1
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538510
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005538513
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: osdmap e85: 6 total, 6 up, 6 in
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: overall HEALTH_OK
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:23 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:23 np0005538513.localdomain sudo[287951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:51:23 np0005538513.localdomain sudo[287951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[287951]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[287969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:51:23 np0005538513.localdomain sudo[287969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[287969]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[287987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538513.localdomain sudo[287987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[287987]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[288005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:23 np0005538513.localdomain sudo[288005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[288005]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[288023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538513.localdomain sudo[288023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[288023]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[288057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538513.localdomain sudo[288057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[288057]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[288075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538513.localdomain sudo[288075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[288075]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[288093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:51:23 np0005538513.localdomain sudo[288093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[288093]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538513.localdomain sudo[288111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:23 np0005538513.localdomain sudo[288111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538513.localdomain sudo[288111]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538513.localdomain sudo[288129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:24 np0005538513.localdomain sudo[288129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538513.localdomain sudo[288129]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: Updating np0005538510.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:24 np0005538513.localdomain sudo[288147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538513.localdomain sudo[288147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538513.localdomain sudo[288147]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538513.localdomain sudo[288165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:24 np0005538513.localdomain sudo[288165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538513.localdomain sudo[288165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538513.localdomain sudo[288183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538513.localdomain sudo[288183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538513.localdomain sudo[288183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538513.localdomain sudo[288217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538513.localdomain sudo[288217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538513.localdomain sudo[288217]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538513.localdomain sudo[288235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538513.localdomain sudo[288235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538513.localdomain sudo[288235]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:24.580 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:24 np0005538513.localdomain sudo[288253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:24 np0005538513.localdomain sudo[288253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538513.localdomain sudo[288253]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:25 np0005538513.localdomain sudo[288271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:51:25 np0005538513.localdomain sudo[288271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:25 np0005538513.localdomain sudo[288271]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: pgmap v3839: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='client.17256 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:25 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mon.np0005538510 (monmap changed)...
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538510 on np0005538510.localdomain
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: from='client.34103 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538514", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538510.nzitwz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:26 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:27 np0005538513.localdomain ceph-mon[287629]: pgmap v3840: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:27 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538510.nzitwz (monmap changed)...
Nov 28 09:51:27 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538510.nzitwz on np0005538510.localdomain
Nov 28 09:51:27 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:27 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:27 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538510.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:27 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:27.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:27.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:51:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:27.795 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: from='client.26677 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538510 (monmap changed)...
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538510 on np0005538510.localdomain
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:28 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: pgmap v3841: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.103:0/1822508892' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:29 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:29.583 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:51:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:29.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:29.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:29 np0005538513.localdomain podman[288289]: 2025-11-28 09:51:29.85954706 +0000 UTC m=+0.089711960 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 28 09:51:29 np0005538513.localdomain podman[288289]: 2025-11-28 09:51:29.903494022 +0000 UTC m=+0.133658932 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=)
Nov 28 09:51:29 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: pgmap v3842: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:51:30 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.103:0/748718462' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 09:51:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:30.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:30.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:31.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:31 np0005538513.localdomain ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='client.? 172.18.0.103:0/3703486687' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).osd e85 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).osd e85 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 e86: 6 total, 6 up, 6 in
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538510"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538511"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538512"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 0
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 0
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 0
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 1
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain sshd[26471]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain sshd[26435]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain sshd[26340]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain sshd[26282]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain sshd[26299]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 25 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-16.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 23 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 14 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 18 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 16 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain sshd[26397]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain sshd[26321]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain sshd[26490]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-26.scope: Consumed 3min 26.962s CPU time.
Nov 28 09:51:32 np0005538513.localdomain sshd[26378]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain sshd[26454]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain sshd[26416]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 21 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 17 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 26 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 20 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 22 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 24 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain sshd[26359]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 14.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 18.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Session 19 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 23.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 25.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 16.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 17.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 26.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 21.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 24.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 20.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 22.
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: Removed session 19.
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} v 0)
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain sshd[288311]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:51:32 np0005538513.localdomain sshd[288311]: Accepted publickey for ceph-admin from 192.168.122.105 port 48392 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:51:32 np0005538513.localdomain systemd-logind[764]: New session 64 of user ceph-admin.
Nov 28 09:51:32 np0005538513.localdomain systemd[1]: Started Session 64 of User ceph-admin.
Nov 28 09:51:32 np0005538513.localdomain sshd[288311]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:51:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:32.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:32 np0005538513.localdomain sudo[288315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:32 np0005538513.localdomain sudo[288315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:32 np0005538513.localdomain sudo[288315]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.103:0/3703486687' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: Activating manager daemon np0005538512.zyhkxs
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: mgrmap e15: np0005538512.zyhkxs(active, starting, since 0.0649225s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: Manager daemon np0005538512.zyhkxs is now available
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.108:0/1096407890' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:32 np0005538513.localdomain sudo[288333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:51:32 np0005538513.localdomain sudo[288333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:33 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 _set_new_cache_sizes cache_size:1019691232 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.791 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.792 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:33 np0005538513.localdomain podman[288426]: 2025-11-28 09:51:33.809823405 +0000 UTC m=+0.108971777 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.817 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.817 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.818 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.818 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:51:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:33.819 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:51:33 np0005538513.localdomain podman[288426]: 2025-11-28 09:51:33.913535488 +0000 UTC m=+0.212683800 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, release=553, name=rhceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mgrmap e16: np0005538512.zyhkxs(active, since 1.0929s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.107:0/1691971170' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.108:0/1013956769' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.107:0/1839703657' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.278 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain.devices.0}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain.devices.0}] v 0)
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.366 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.367 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.613 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.968 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.969 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11793MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.969 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:51:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:34.969 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:51:34 np0005538513.localdomain sudo[288333]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:34 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.091 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:51:35 np0005538513.localdomain sudo[288569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:51:35 np0005538513.localdomain sudo[288569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:35 np0005538513.localdomain sudo[288569]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:35 np0005538513.localdomain sudo[288589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:51:35 np0005538513.localdomain sudo[288589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:35 np0005538513.localdomain podman[288587]: 2025-11-28 09:51:35.208060828 +0000 UTC m=+0.063227719 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:51:35 np0005538513.localdomain podman[288587]: 2025-11-28 09:51:35.221251007 +0000 UTC m=+0.076417898 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:51:35 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Bus STARTING
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Serving on https://172.18.0.105:7150
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Client ('172.18.0.105', 40464) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Serving on http://172.18.0.105:8765
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Bus STARTED
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: mgrmap e17: np0005538512.zyhkxs(active, since 2s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.106:0/3152215171' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:51:35 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2421283346' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.591 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.601 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.623 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.627 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:51:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:35.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:51:35 np0005538513.localdomain sudo[288589]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:36 np0005538513.localdomain sudo[288680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:36 np0005538513.localdomain sudo[288680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:36 np0005538513.localdomain sudo[288680]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:36 np0005538513.localdomain sudo[288698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:51:36 np0005538513.localdomain sudo[288698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:36 np0005538513.localdomain sudo[288698]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:36 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:51:36 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:36 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:51:36 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:51:36 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain.devices.0}] v 0)
Nov 28 09:51:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:36.608 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:36.609 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:51:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:36.609 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:51:36 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.106:0/2421283346' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:36 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.324 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.324 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.325 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.325 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain.devices.0}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain}] v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain sudo[288734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:51:37 np0005538513.localdomain sudo[288734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538513.localdomain sudo[288734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538513.localdomain sudo[288752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:51:37 np0005538513.localdomain sudo[288752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538513.localdomain sudo[288752]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.691 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.710 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.710 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:51:37 np0005538513.localdomain sudo[288770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:37 np0005538513.localdomain sudo[288770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538513.localdomain sudo[288770]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: mgrmap e18: np0005538512.zyhkxs(active, since 4s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: Standby manager daemon np0005538510.nzitwz started
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:37.831 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:37 np0005538513.localdomain sudo[288788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:37 np0005538513.localdomain sudo[288788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538513.localdomain sudo[288788]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538513.localdomain sudo[288806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:51:37 np0005538513.localdomain sudo[288806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538513.localdomain sudo[288806]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:51:38 np0005538513.localdomain podman[288823]: 2025-11-28 09:51:38.048733282 +0000 UTC m=+0.101658260 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 09:51:38 np0005538513.localdomain sudo[288851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:38 np0005538513.localdomain sudo[288851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain podman[288823]: 2025-11-28 09:51:38.084318684 +0000 UTC m=+0.137243642 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:51:38 np0005538513.localdomain sudo[288851]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:51:38 np0005538513.localdomain systemd[1]: tmp-crun.E00jbi.mount: Deactivated successfully.
Nov 28 09:51:38 np0005538513.localdomain podman[288872]: 2025-11-28 09:51:38.164075436 +0000 UTC m=+0.106167060 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:51:38 np0005538513.localdomain sudo[288886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:38 np0005538513.localdomain sudo[288886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[288886]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} v 0)
Nov 28 09:51:38 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch
Nov 28 09:51:38 np0005538513.localdomain podman[288872]: 2025-11-28 09:51:38.241482133 +0000 UTC m=+0.183573767 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:51:38 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:51:38 np0005538513.localdomain sudo[288914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:51:38 np0005538513.localdomain sudo[288914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[288914]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain sudo[288934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:38 np0005538513.localdomain sudo[288934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[288934]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain sudo[288952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:38 np0005538513.localdomain sudo[288952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[288952]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain sudo[288970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538513.localdomain sudo[288970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[288970]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain sudo[288988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:38 np0005538513.localdomain sudo[288988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[288988]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 _set_new_cache_sizes cache_size:1020047072 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:38 np0005538513.localdomain sudo[289006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538513.localdomain sudo[289006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[289006]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain sudo[289040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538513.localdomain sudo[289040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[289040]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538513.localdomain sudo[289058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538513.localdomain sudo[289058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538513.localdomain sudo[289058]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain sudo[289076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:51:39 np0005538513.localdomain sudo[289094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289094]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:51:39 np0005538513.localdomain sudo[289112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289112]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538510.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: mgrmap e19: np0005538512.zyhkxs(active, since 6s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538513.localdomain sudo[289130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538513.localdomain sudo[289130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289130]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:39 np0005538513.localdomain sudo[289148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289148]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538513.localdomain sudo[289166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289166]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538513.localdomain sudo[289200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:39.615 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:39 np0005538513.localdomain sudo[289218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538513.localdomain sudo[289218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289218]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:39 np0005538513.localdomain sudo[289236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289236]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:39 np0005538513.localdomain sudo[289254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289254]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:39 np0005538513.localdomain sudo[289272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:51:39 np0005538513.localdomain sudo[289272]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538513.localdomain sudo[289296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538513.localdomain sudo[289296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538513.localdomain sudo[289296]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538513.localdomain podman[289290]: 2025-11-28 09:51:40.02046591 +0000 UTC m=+0.098815402 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:51:40 np0005538513.localdomain podman[289290]: 2025-11-28 09:51:40.041238504 +0000 UTC m=+0.119588036 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:51:40 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:51:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:51:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:51:40 np0005538513.localdomain sudo[289326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:40 np0005538513.localdomain sudo[289326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538513.localdomain sudo[289326]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:51:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:51:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:51:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18707 "" "Go-http-client/1.1"
Nov 28 09:51:40 np0005538513.localdomain sudo[289345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:40 np0005538513.localdomain sudo[289345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538513.localdomain sudo[289345]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: Updating np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain.devices.0}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:51:40 np0005538513.localdomain sudo[289379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:40 np0005538513.localdomain sudo[289379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538513.localdomain sudo[289379]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain.devices.0}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain}] v 0)
Nov 28 09:51:40 np0005538513.localdomain sudo[289397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:40 np0005538513.localdomain sudo[289397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538513.localdomain sudo[289397]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538513.localdomain sudo[289415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538513.localdomain sudo[289415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538513.localdomain sudo[289415]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:51:40 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:40 np0005538513.localdomain sudo[289433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:51:40 np0005538513.localdomain sudo[289433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538513.localdomain sudo[289433]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:41 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:42 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:42 np0005538513.localdomain sudo[289452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:42 np0005538513.localdomain sudo[289452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:42.871 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:42 np0005538513.localdomain sudo[289452]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:42 np0005538513.localdomain sudo[289470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:42 np0005538513.localdomain sudo[289470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:43 np0005538513.localdomain podman[289505]: 
Nov 28 09:51:43 np0005538513.localdomain podman[289505]: 2025-11-28 09:51:43.418428657 +0000 UTC m=+0.081970500 container create f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553)
Nov 28 09:51:43 np0005538513.localdomain systemd[1]: Started libpod-conmon-f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180.scope.
Nov 28 09:51:43 np0005538513.localdomain podman[289505]: 2025-11-28 09:51:43.385702773 +0000 UTC m=+0.049244646 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:43 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:43 np0005538513.localdomain podman[289505]: 2025-11-28 09:51:43.513334997 +0000 UTC m=+0.176876840 container init f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, ceph=True, release=553, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, distribution-scope=public, GIT_BRANCH=main)
Nov 28 09:51:43 np0005538513.localdomain podman[289505]: 2025-11-28 09:51:43.525566626 +0000 UTC m=+0.189108459 container start f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.expose-services=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12)
Nov 28 09:51:43 np0005538513.localdomain podman[289505]: 2025-11-28 09:51:43.525912167 +0000 UTC m=+0.189454000 container attach f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:51:43 np0005538513.localdomain heuristic_bassi[289520]: 167 167
Nov 28 09:51:43 np0005538513.localdomain systemd[1]: libpod-f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180.scope: Deactivated successfully.
Nov 28 09:51:43 np0005538513.localdomain podman[289505]: 2025-11-28 09:51:43.531136658 +0000 UTC m=+0.194678541 container died f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_CLEAN=True)
Nov 28 09:51:43 np0005538513.localdomain podman[289525]: 2025-11-28 09:51:43.630207978 +0000 UTC m=+0.089597117 container remove f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12)
Nov 28 09:51:43 np0005538513.localdomain systemd[1]: libpod-conmon-f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180.scope: Deactivated successfully.
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 _set_new_cache_sizes cache_size:1020054565 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:43 np0005538513.localdomain sudo[289470]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:43 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:43 np0005538513.localdomain sudo[289541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:43 np0005538513.localdomain sudo[289541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:43 np0005538513.localdomain sudo[289541]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:43 np0005538513.localdomain sudo[289559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:43 np0005538513.localdomain sudo[289559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:44 np0005538513.localdomain podman[289595]: 
Nov 28 09:51:44 np0005538513.localdomain podman[289595]: 2025-11-28 09:51:44.337987032 +0000 UTC m=+0.066896503 container create a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.tags=rhceph ceph, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_CLEAN=True)
Nov 28 09:51:44 np0005538513.localdomain systemd[1]: Started libpod-conmon-a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5.scope.
Nov 28 09:51:44 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:44 np0005538513.localdomain podman[289595]: 2025-11-28 09:51:44.306548108 +0000 UTC m=+0.035457599 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:44 np0005538513.localdomain podman[289595]: 2025-11-28 09:51:44.411093026 +0000 UTC m=+0.140002497 container init a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, name=rhceph)
Nov 28 09:51:44 np0005538513.localdomain podman[289595]: 2025-11-28 09:51:44.420851139 +0000 UTC m=+0.149760600 container start a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, ceph=True, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container)
Nov 28 09:51:44 np0005538513.localdomain amazing_agnesi[289610]: 167 167
Nov 28 09:51:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-454c6499e0a1ac672d2907fbb2c5eab0ffa25180482709da65a54ecb80996177-merged.mount: Deactivated successfully.
Nov 28 09:51:44 np0005538513.localdomain systemd[1]: libpod-a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5.scope: Deactivated successfully.
Nov 28 09:51:44 np0005538513.localdomain podman[289595]: 2025-11-28 09:51:44.421162438 +0000 UTC m=+0.150071899 container attach a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:51:44 np0005538513.localdomain podman[289595]: 2025-11-28 09:51:44.4305835 +0000 UTC m=+0.159493011 container died a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:51:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c461735e4425da9797d34137b527a69d41c73ac10c88cb0b457449cd5b86b220-merged.mount: Deactivated successfully.
Nov 28 09:51:44 np0005538513.localdomain podman[289615]: 2025-11-28 09:51:44.531864358 +0000 UTC m=+0.093116856 container remove a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:51:44 np0005538513.localdomain systemd[1]: libpod-conmon-a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5.scope: Deactivated successfully.
Nov 28 09:51:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:44.644 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:44 np0005538513.localdomain sudo[289559]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:44 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:44 np0005538513.localdomain sudo[289639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:44 np0005538513.localdomain sudo[289639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:44 np0005538513.localdomain sudo[289639]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:44 np0005538513.localdomain sudo[289657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:44 np0005538513.localdomain sudo[289657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: from='client.17376 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:45 np0005538513.localdomain podman[289692]: 
Nov 28 09:51:45 np0005538513.localdomain podman[289692]: 2025-11-28 09:51:45.47620354 +0000 UTC m=+0.086118379 container create 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64)
Nov 28 09:51:45 np0005538513.localdomain systemd[1]: Started libpod-conmon-16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9.scope.
Nov 28 09:51:45 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:45 np0005538513.localdomain podman[289692]: 2025-11-28 09:51:45.441183305 +0000 UTC m=+0.051098194 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:45 np0005538513.localdomain podman[289692]: 2025-11-28 09:51:45.54819692 +0000 UTC m=+0.158111759 container init 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:51:45 np0005538513.localdomain epic_snyder[289707]: 167 167
Nov 28 09:51:45 np0005538513.localdomain systemd[1]: libpod-16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9.scope: Deactivated successfully.
Nov 28 09:51:45 np0005538513.localdomain podman[289692]: 2025-11-28 09:51:45.557333383 +0000 UTC m=+0.167248222 container start 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph)
Nov 28 09:51:45 np0005538513.localdomain podman[289692]: 2025-11-28 09:51:45.562212244 +0000 UTC m=+0.172127083 container attach 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, io.buildah.version=1.33.12, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:51:45 np0005538513.localdomain podman[289692]: 2025-11-28 09:51:45.565463525 +0000 UTC m=+0.175378394 container died 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git)
Nov 28 09:51:45 np0005538513.localdomain podman[289713]: 2025-11-28 09:51:45.631991385 +0000 UTC m=+0.063856778 container remove 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True)
Nov 28 09:51:45 np0005538513.localdomain systemd[1]: libpod-conmon-16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9.scope: Deactivated successfully.
Nov 28 09:51:45 np0005538513.localdomain sudo[289657]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:45 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:45 np0005538513.localdomain sudo[289736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:45 np0005538513.localdomain sudo[289736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:45 np0005538513.localdomain sudo[289736]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:46 np0005538513.localdomain sudo[289754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:46 np0005538513.localdomain sudo[289754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:51:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-9ed6b433b11fa7279eda7c68f33a34cbfadf33be427c982552b151346a3d90d8-merged.mount: Deactivated successfully.
Nov 28 09:51:46 np0005538513.localdomain podman[289789]: 
Nov 28 09:51:46 np0005538513.localdomain podman[289789]: 2025-11-28 09:51:46.532195571 +0000 UTC m=+0.087212392 container create fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, RELEASE=main)
Nov 28 09:51:46 np0005538513.localdomain systemd[1]: Started libpod-conmon-fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495.scope.
Nov 28 09:51:46 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:46 np0005538513.localdomain podman[289789]: 2025-11-28 09:51:46.499853639 +0000 UTC m=+0.054870500 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:46 np0005538513.localdomain podman[289789]: 2025-11-28 09:51:46.601291511 +0000 UTC m=+0.156308332 container init fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=553, architecture=x86_64, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Nov 28 09:51:46 np0005538513.localdomain podman[289789]: 2025-11-28 09:51:46.613380396 +0000 UTC m=+0.168397217 container start fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, ceph=True)
Nov 28 09:51:46 np0005538513.localdomain podman[289789]: 2025-11-28 09:51:46.613651804 +0000 UTC m=+0.168668665 container attach fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main)
Nov 28 09:51:46 np0005538513.localdomain relaxed_brattain[289804]: 167 167
Nov 28 09:51:46 np0005538513.localdomain systemd[1]: libpod-fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495.scope: Deactivated successfully.
Nov 28 09:51:46 np0005538513.localdomain podman[289789]: 2025-11-28 09:51:46.618343929 +0000 UTC m=+0.173360800 container died fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, version=7, distribution-scope=public, release=553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:51:46 np0005538513.localdomain podman[289809]: 2025-11-28 09:51:46.72135629 +0000 UTC m=+0.092457975 container remove fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, release=553, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.buildah.version=1.33.12)
Nov 28 09:51:46 np0005538513.localdomain systemd[1]: libpod-conmon-fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495.scope: Deactivated successfully.
Nov 28 09:51:46 np0005538513.localdomain sudo[289754]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:46 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:46 np0005538513.localdomain sudo[289827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:46 np0005538513.localdomain sudo[289827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:46 np0005538513.localdomain sudo[289827]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:47 np0005538513.localdomain sudo[289845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:47 np0005538513.localdomain sudo[289845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: from='client.26761 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538510", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:47 np0005538513.localdomain podman[289880]: 
Nov 28 09:51:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-9480669b0e5659c76a11a25687955aa4eeba5348392bec876060a31ecfba255c-merged.mount: Deactivated successfully.
Nov 28 09:51:47 np0005538513.localdomain podman[289880]: 2025-11-28 09:51:47.482835329 +0000 UTC m=+0.083176708 container create ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, name=rhceph, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:51:47 np0005538513.localdomain systemd[1]: Started libpod-conmon-ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf.scope.
Nov 28 09:51:47 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:47 np0005538513.localdomain podman[289880]: 2025-11-28 09:51:47.450166946 +0000 UTC m=+0.050508355 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:47 np0005538513.localdomain podman[289880]: 2025-11-28 09:51:47.549605907 +0000 UTC m=+0.149947276 container init ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, com.redhat.component=rhceph-container, architecture=x86_64, version=7, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12)
Nov 28 09:51:47 np0005538513.localdomain podman[289880]: 2025-11-28 09:51:47.559952208 +0000 UTC m=+0.160293577 container start ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, RELEASE=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:51:47 np0005538513.localdomain podman[289880]: 2025-11-28 09:51:47.560620778 +0000 UTC m=+0.160962187 container attach ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, version=7, release=553, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:51:47 np0005538513.localdomain recursing_fermi[289896]: 167 167
Nov 28 09:51:47 np0005538513.localdomain systemd[1]: libpod-ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf.scope: Deactivated successfully.
Nov 28 09:51:47 np0005538513.localdomain podman[289880]: 2025-11-28 09:51:47.573118525 +0000 UTC m=+0.173459924 container died ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, build-date=2025-09-24T08:57:55, distribution-scope=public, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7)
Nov 28 09:51:47 np0005538513.localdomain podman[289901]: 2025-11-28 09:51:47.675172377 +0000 UTC m=+0.092902390 container remove ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=553, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, vcs-type=git)
Nov 28 09:51:47 np0005538513.localdomain systemd[1]: libpod-conmon-ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf.scope: Deactivated successfully.
Nov 28 09:51:47 np0005538513.localdomain sudo[289845]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:51:47 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:47.871 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:47 np0005538513.localdomain sudo[289917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:47 np0005538513.localdomain sudo[289917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:51:47 np0005538513.localdomain sudo[289917]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:47 np0005538513.localdomain sudo[289941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:47 np0005538513.localdomain podman[289935]: 2025-11-28 09:51:47.98778258 +0000 UTC m=+0.078448771 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:51:47 np0005538513.localdomain sudo[289941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:48 np0005538513.localdomain podman[289935]: 2025-11-28 09:51:48.023530937 +0000 UTC m=+0.114197108 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:51:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:51:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:51:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:51:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:51:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:51:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-769641ae2eb320e288f9e1b0a52e8e321e86a6ef10a15e10b362bf052d6fb7a0-merged.mount: Deactivated successfully.
Nov 28 09:51:48 np0005538513.localdomain podman[289993]: 2025-11-28 09:51:48.541864434 +0000 UTC m=+0.099951068 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:51:48 np0005538513.localdomain podman[290001]: 
Nov 28 09:51:48 np0005538513.localdomain podman[290001]: 2025-11-28 09:51:48.558959893 +0000 UTC m=+0.091052441 container create a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, CEPH_POINT_RELEASE=, ceph=True)
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "quorum_status"} v 0)
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon rm", "name": "np0005538510"} v 0)
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon rm", "name": "np0005538510"} : dispatch
Nov 28 09:51:48 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: Started libpod-conmon-a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35.scope.
Nov 28 09:51:48 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Nov 28 09:51:48 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@5(peon) e7  my rank is now 4 (was 5)
Nov 28 09:51:48 np0005538513.localdomain podman[289993]: 2025-11-28 09:51:48.610256992 +0000 UTC m=+0.168343626 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:51:48 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.4 v2:172.18.0.106:3300/0
Nov 28 09:51:48 np0005538513.localdomain podman[290001]: 2025-11-28 09:51:48.522241356 +0000 UTC m=+0.054333944 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: paxos.4).electionLogic(26) init, last seen epoch 26
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:48 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:51:48 np0005538513.localdomain podman[290001]: 2025-11-28 09:51:48.66764393 +0000 UTC m=+0.199736478 container init a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:51:48 np0005538513.localdomain podman[290001]: 2025-11-28 09:51:48.677571787 +0000 UTC m=+0.209664335 container start a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, version=7, GIT_CLEAN=True, vcs-type=git)
Nov 28 09:51:48 np0005538513.localdomain podman[290001]: 2025-11-28 09:51:48.677891878 +0000 UTC m=+0.209984476 container attach a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=553, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:51:48 np0005538513.localdomain condescending_tu[290030]: 167 167
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: libpod-a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35.scope: Deactivated successfully.
Nov 28 09:51:48 np0005538513.localdomain podman[290001]: 2025-11-28 09:51:48.685183053 +0000 UTC m=+0.217275611 container died a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, release=553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public)
Nov 28 09:51:48 np0005538513.localdomain podman[290035]: 2025-11-28 09:51:48.801465125 +0000 UTC m=+0.099324518 container remove a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, CEPH_POINT_RELEASE=, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64)
Nov 28 09:51:48 np0005538513.localdomain systemd[1]: libpod-conmon-a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35.scope: Deactivated successfully.
Nov 28 09:51:48 np0005538513.localdomain sudo[289941]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:49 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fd152b1b9fa663a649df1b21b39bf9efd4cd17e83bf10fec9fa10105450469b2-merged.mount: Deactivated successfully.
Nov 28 09:51:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:49.649 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:51:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:51:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:51:50.829 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:51:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:51:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:51:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:52.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:53 np0005538513.localdomain ceph-mds[282744]: mds.beacon.mds.np0005538513.yljthc missed beacon ack from the monitors
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054728 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon rm", "name": "np0005538510"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='client.34185 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538510"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: Remove daemons mon.np0005538510
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: Safe to remove mon.np0005538510: new quorum should be ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514', 'np0005538513'])
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: Removing monitor np0005538510 from monmap...
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: Removing daemon mon.np0005538510 from np0005538510.localdomain -- ports []
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mon.np0005538515 calling monitor election
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mon.np0005538511 calling monitor election
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mon.np0005538512 calling monitor election
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513 calling monitor election
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538513 in quorum (ranks 0,1,2,4)
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: monmap epoch 7
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: last_changed 2025-11-28T09:51:48.586207+0000
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: min_mon_release 18 (reef)
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: election_strategy: 1
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005538513
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: mgrmap e19: np0005538512.zyhkxs(active, since 21s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: Health check failed: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513 (MON_DOWN)
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]:     mon.np0005538514 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum)
Nov 28 09:51:53 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:54.675 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:51:54 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:55 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:55 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:55 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:55 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:55 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: mon.np0005538514 calling monitor election
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: from='client.34197 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: Removed label mon from host np0005538510.localdomain
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: mon.np0005538511 calling monitor election
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: mon.np0005538515 calling monitor election
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: mon.np0005538512 calling monitor election
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4)
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: monmap epoch 7
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: last_changed 2025-11-28T09:51:48.586207+0000
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: min_mon_release 18 (reef)
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: election_strategy: 1
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005538513
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: mgrmap e19: np0005538512.zyhkxs(active, since 23s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513)
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: Cluster is now healthy
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: overall HEALTH_OK
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:51:56 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: from='client.34169 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: Removed label mgr from host np0005538510.localdomain
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:57 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:57.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:51:58 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: Removed label _admin from host np0005538510.localdomain
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:59 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:51:59.678 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:00 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:52:00 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:52:00 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:00 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:00 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:52:00 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.673 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.710 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.711 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '214922ef-f20b-4c42-9623-3da0c7a2abca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.675427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2b6b7c4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'fd77d4f54934f83fa4cbac869fe601f11b796cab24d5a7c3b0a9cecf576636de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.675427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2b6cdfe-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '9bb4eeda6708b1df838b793ffe1407ac3755aa6f04b415dfde54f8f9bf0fa311'}]}, 'timestamp': '2025-11-28 09:52:00.712471', '_unique_id': '13462806b4f04329a9bbb67daa989585'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.716 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.720 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4b4be75-98c2-444f-9e4c-e546e0077d4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.716333', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2b810b0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '0a2928a63a5045fe8272ef405eb37906c7030e8c10005c5594fa63ac440922d3'}]}, 'timestamp': '2025-11-28 09:52:00.720748', '_unique_id': '37897a71ba97470ebfadc45cdda473f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.723 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '725a2f19-ff6d-4aaa-bbcd-99473b57317f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.723157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2b88144-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '8523af116c815982d5e0746c252ace8d6c8d142d166726e39fe3e3f4a737ba2b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.723157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2b89120-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'e83868b96542a5d47d06dc8103b473a1fd2a87cfde6248ceea15b5ba0c364400'}]}, 'timestamp': '2025-11-28 09:52:00.723996', '_unique_id': '5531b26bc3664ff28293053614052cc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b747b9e-7b19-4a85-82b1-48e55c69ce3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.726205', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2b8f836-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'a019dc53cda9572f036bfa636820e93626fef1a3fbcd8ff73def75c1eda745b3'}]}, 'timestamp': '2025-11-28 09:52:00.726664', '_unique_id': '191198ea9dc3403e9097644f87bb89e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.728 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 12760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7e8abe6-7a6f-48ce-9d2a-5df44ba0ba5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12760000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:52:00.728747', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e2bc2664-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.918732784, 'message_signature': 'a7d69dfbe71ffa4545c74fe434b25e411be7fb5b3bc78ccd3a61308bd6277488'}]}, 'timestamp': '2025-11-28 09:52:00.747515', '_unique_id': '125284382b7a45e9800ce448b534abaa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be635916-7a4f-4aef-a8c6-cf7de23c57d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.749710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2bc8e1a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '6ca2d7cf98ddba9afc1679d4fafc4debf10de10cc4f91d5d516996afc0f1b579'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.749710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2bc9f4a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '3c074aae2a8709a83fabb4654c20eef250cd91695bb7081bb2aa39b8ddc070fa'}]}, 'timestamp': '2025-11-28 09:52:00.750571', '_unique_id': 'b353bbef85f64753a596fabb6ee3b718'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.752 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2bddbae-8053-4884-86b0-142fb99727fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.752745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2bee9bc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '184e1374a048393c37f41d02e82685d6817cbc2dead9ebc7e28844c4aab760f9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.752745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2befa1a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '6aaf87739b1fb3c92607276f69869575072dedd5288a74a412d5f5e374f0a886'}]}, 'timestamp': '2025-11-28 09:52:00.766003', '_unique_id': 'd8c1ae5223994dceb4a2e1dd70b23767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.768 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c18eaa7b-6265-4d84-ab74-aaf0179af586', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:52:00.768589', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e2bf6fb8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.918732784, 'message_signature': '838a7799f1611d83e1c08d6c1486ed69562f8d09caa609bfb06372aa0f210f48'}]}, 'timestamp': '2025-11-28 09:52:00.769098', '_unique_id': '5f1c98a334314c3a98ef912b017dc251'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73b78d13-7923-4a5b-8905-82cf55dfe60c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.771174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2bfd480-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'fb003e18052d6fd5c72b9e3eeaf19686a391a92e3b583a234a72906190f0f463'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.771174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2bfe420-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '4002d00b6555963384870e1de63e3880e278073b841ff968d80423d6eb254a5a'}]}, 'timestamp': '2025-11-28 09:52:00.771994', '_unique_id': '1cfadfdde2f04f85bb1d7fd9f39b2e5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9852a5ed-c029-4a75-a2a6-0a546b801e5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.774161', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c04a6e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'f89e19b6bcbdefd7e85dccf837803a41a7278212908af74d69ddaa1291b82305'}]}, 'timestamp': '2025-11-28 09:52:00.774670', '_unique_id': '51b469ed3e464b7f842a5d490d037ed1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbeac082-aed8-4899-bd39-0be963da7908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.776940', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c0b74c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'ab0d3499cd0b96c48dca4640cc93f01a0a31633422cba847f6aca9baad688387'}]}, 'timestamp': '2025-11-28 09:52:00.777426', '_unique_id': '50c33d85b3554e26b3158d4c3e20acd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2cc4b7e-f7c5-4580-9b2c-433d56c974de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.779466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c11822-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'fb9a708f91ece67813e0026a4aafaf193a4c1e69b6efa7777df09662c6cab9d0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.779466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c12916-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '1a6628759ccbd1312b06ef3e13b26b09fc16a91ffd206cc77ab9804251a111ed'}]}, 'timestamp': '2025-11-28 09:52:00.780313', '_unique_id': '9779c8dad98d4cad969f90df6bb1c02f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.782 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07dbbb58-c2f2-41b6-9525-5862ee3b1725', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.782526', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c18fb4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '7e0f74f6a84d269ee146070b4a15abb271e8bd67951d17563012f5627404bbb5'}]}, 'timestamp': '2025-11-28 09:52:00.782967', '_unique_id': 'c83970b3eff54a5686439b2f5ddcaaa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd913e79-ae48-4aea-9044-e63de1b762dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.785033', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c20d7c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'fbc7b428c875cf4bcfb95b6b1d2ffcdb2d0114ad845854a33b23fa40795f3e8a'}]}, 'timestamp': '2025-11-28 09:52:00.786421', '_unique_id': '2a3fb661ebfd49598707f5b3deca2ec1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.789 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc72ff31-18ad-4a84-a9ce-8c6df7c63818', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.789838', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c2afde-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '560a97d85fb5576dc2a7544723dc01549dd47297b5c63f18d0c0fad33c31562a'}]}, 'timestamp': '2025-11-28 09:52:00.790358', '_unique_id': '47a396969b364b438892aac4f98830a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.792 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1738b2f-08e4-43b5-bcc5-5a143ad5d68d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.792716', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c31e38-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'c21ed207347dd29547c3d0ca48756bf4ed953ea56492113afa595a422f6ab424'}]}, 'timestamp': '2025-11-28 09:52:00.793206', '_unique_id': '932ef35aaa1f403e925549ba2efb45d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.795 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a541e8a-707a-4f5a-8bf0-1038e457a6a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.795299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c3830a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '369ae8958d0478269e4305ef5624dabc561c1388aabb2426da84a2e6174c7c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.795299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c392d2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '6fda23888d5c22c995657772342be087cec7712e9872fe4073b1d01b6377c248'}]}, 'timestamp': '2025-11-28 09:52:00.796164', '_unique_id': '9ee53aee7905415f87ac73990564e84d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d135c63-c491-43a3-84b2-5cac4cfaf8d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.798288', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c3f79a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '3d2c535159bc5f22fa74eab4e44cdab74c4b521ce82314c116e868ef845c0a97'}]}, 'timestamp': '2025-11-28 09:52:00.798735', '_unique_id': 'b30308ab24f74d20878de9149b064949'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5c4c128-3d0a-47ca-bc88-ef4a73e18f4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.801337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c46ce8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '4eb6dff05fc07fc583459ad0c0051cac14cac41b5962b8f8b672cc477d71513d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.801337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c476d4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': 'b2fe282a7478fd002ca9821155cf7f86a13da985096ad53ad3f05c7d836bb5b9'}]}, 'timestamp': '2025-11-28 09:52:00.801887', '_unique_id': 'bebce2593dd14e20adf6f15e7f6f4679'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f781ed-21e2-4e7f-afdb-8fa0925237c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.803206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c4b496-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'a5d6cfe97413103f2e5720eb08c59e01c2fe50a413ceec4200c1bd1ca0fa4089'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.803206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c4be0a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'a559b5edb2c56c45096792907f37a78c026ee6f126a2d0c326539b26512397f5'}]}, 'timestamp': '2025-11-28 09:52:00.803709', '_unique_id': 'f0403d47cb53412aaf264e41ab09af0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45adb67e-b8e1-4391-9229-eed29f91c743', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.805056', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c4fcf8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '53521d9d05f0913a7c0893a46bb900db231c01e3c0e165222496ec6c96f404f5'}]}, 'timestamp': '2025-11-28 09:52:00.805339', '_unique_id': 'cd77c5da1cd74ca9b18c0c122a14bbb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:52:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:52:00 np0005538513.localdomain podman[290053]: 2025-11-28 09:52:00.870610137 +0000 UTC m=+0.097344347 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Nov 28 09:52:00 np0005538513.localdomain podman[290053]: 2025-11-28 09:52:00.885609961 +0000 UTC m=+0.112344191 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Nov 28 09:52:00 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:52:01 np0005538513.localdomain ceph-mon[287629]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:52:01 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:52:01 np0005538513.localdomain ceph-mon[287629]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:01 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:01 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:01 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:52:01 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:02 np0005538513.localdomain ceph-mon[287629]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:52:02 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:52:02 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:02 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:02 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:02 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:02.944 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:03 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:04 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:52:04 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:52:04 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:04 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:04 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:04 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:04 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:04.713 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:05 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:52:05 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:52:05 np0005538513.localdomain ceph-mon[287629]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:05 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:05 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:52:05 np0005538513.localdomain systemd[1]: tmp-crun.iZFTbB.mount: Deactivated successfully.
Nov 28 09:52:05 np0005538513.localdomain podman[290076]: 2025-11-28 09:52:05.489601107 +0000 UTC m=+0.100464914 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:52:05 np0005538513.localdomain podman[290076]: 2025-11-28 09:52:05.526404777 +0000 UTC m=+0.137268554 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:52:05 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:52:06 np0005538513.localdomain sudo[290099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:06 np0005538513.localdomain sudo[290099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290099]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538513.localdomain sudo[290117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:06 np0005538513.localdomain sudo[290117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290117]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538513.localdomain sudo[290135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538513.localdomain sudo[290135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290135]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538513.localdomain sudo[290153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:06 np0005538513.localdomain sudo[290153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290153]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538513.localdomain sudo[290171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538513.localdomain sudo[290171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290171]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538513.localdomain sudo[290205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538513.localdomain sudo[290205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290205]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538513.localdomain sudo[290223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538513.localdomain sudo[290223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290223]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538513.localdomain sudo[290241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:06 np0005538513.localdomain sudo[290241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538513.localdomain sudo[290241]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain sudo[290259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:07 np0005538513.localdomain sudo[290259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290259]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain sudo[290277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:07 np0005538513.localdomain sudo[290277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290277]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain sudo[290295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538513.localdomain sudo[290295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290295]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538513.localdomain sudo[290313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:07 np0005538513.localdomain sudo[290313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290313]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain sudo[290331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538513.localdomain sudo[290331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290331]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain sudo[290365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538513.localdomain sudo[290365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain sudo[290383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538513.localdomain sudo[290383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290383]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain sudo[290401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:07 np0005538513.localdomain sudo[290401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538513.localdomain sudo[290401]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:07.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: Removing np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:52:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:52:08 np0005538513.localdomain systemd[1]: tmp-crun.eouQjm.mount: Deactivated successfully.
Nov 28 09:52:08 np0005538513.localdomain podman[290420]: 2025-11-28 09:52:08.873353293 +0000 UTC m=+0.103419494 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:52:08 np0005538513.localdomain podman[290420]: 2025-11-28 09:52:08.904765956 +0000 UTC m=+0.134832187 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:52:08 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:52:08 np0005538513.localdomain systemd[1]: tmp-crun.GGu8yV.mount: Deactivated successfully.
Nov 28 09:52:08 np0005538513.localdomain podman[290419]: 2025-11-28 09:52:08.973285718 +0000 UTC m=+0.202333908 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 09:52:09 np0005538513.localdomain podman[290419]: 2025-11-28 09:52:09.04984776 +0000 UTC m=+0.278895970 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:52:09 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:52:09 np0005538513.localdomain ceph-mon[287629]: Removing daemon mgr.np0005538510.nzitwz from np0005538510.localdomain -- ports [9283, 8765]
Nov 28 09:52:09 np0005538513.localdomain ceph-mon[287629]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:09.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:52:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:52:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:52:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:52:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:52:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18709 "" "Go-http-client/1.1"
Nov 28 09:52:10 np0005538513.localdomain sudo[290461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:10 np0005538513.localdomain sudo[290461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:52:10 np0005538513.localdomain sudo[290461]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='client.26795 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538510.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: Added label _no_schedule to host np0005538510.localdomain
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538510.localdomain
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: Removing key for mgr.np0005538510.nzitwz
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"} : dispatch
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"}]': finished
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:10 np0005538513.localdomain systemd[1]: tmp-crun.lQFQ6z.mount: Deactivated successfully.
Nov 28 09:52:10 np0005538513.localdomain podman[290478]: 2025-11-28 09:52:10.674203147 +0000 UTC m=+0.100722250 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:52:10 np0005538513.localdomain podman[290478]: 2025-11-28 09:52:10.687623183 +0000 UTC m=+0.114142326 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 09:52:10 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:52:12 np0005538513.localdomain ceph-mon[287629]: from='client.26805 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538510.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:52:12 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:12 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:12 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:12 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:12 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:12 np0005538513.localdomain sudo[290498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:12 np0005538513.localdomain sudo[290498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:12 np0005538513.localdomain sudo[290498]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:12.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:13 np0005538513.localdomain sudo[290516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:13 np0005538513.localdomain sudo[290516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:13 np0005538513.localdomain sudo[290516]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: Removing daemon crash.np0005538510 from np0005538510.localdomain -- ports []
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"} : dispatch
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"}]': finished
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"} : dispatch
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"}]': finished
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:13 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='client.34193 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538510.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: Removing key for client.crash.np0005538510.localdomain
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: Removed host np0005538510.localdomain
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.32:0/1979550132' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='client.? 172.18.0.32:0/1979550132' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.429212) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534429305, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 12076, "num_deletes": 528, "total_data_size": 19062293, "memory_usage": 19777560, "flush_reason": "Manual Compaction"}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534520476, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12283092, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 12081, "table_properties": {"data_size": 12227037, "index_size": 29773, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25221, "raw_key_size": 262952, "raw_average_key_size": 26, "raw_value_size": 12056539, "raw_average_value_size": 1196, "num_data_blocks": 1136, "num_entries": 10074, "num_filter_entries": 10074, "num_deletions": 527, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 1764323473, "file_creation_time": 1764323534, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 91337 microseconds, and 25448 cpu microseconds.
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.520551) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12283092 bytes OK
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.520581) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.522276) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.522306) EVENT_LOG_v1 {"time_micros": 1764323534522299, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.522329) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18983401, prev total WAL file size 18984150, number of live WAL files 2.
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.525713) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353039' seq:72057594037927935, type:22 .. '6D6772737461740033373631' seq:0, type:0; will stop at (end)
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(2012B)]
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534525813, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12285104, "oldest_snapshot_seqno": -1}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9550 keys, 12274999 bytes, temperature: kUnknown
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534620119, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12274999, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12220302, "index_size": 29700, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 254014, "raw_average_key_size": 26, "raw_value_size": 12056597, "raw_average_value_size": 1262, "num_data_blocks": 1134, "num_entries": 9550, "num_filter_entries": 9550, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 0, "file_creation_time": 1764323534, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.620306) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12274999 bytes
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.621709) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.2 rd, 130.1 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.7, 0.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10079, records dropped: 529 output_compression: NoCompression
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.621724) EVENT_LOG_v1 {"time_micros": 1764323534621717, "job": 4, "event": "compaction_finished", "compaction_time_micros": 94365, "compaction_time_cpu_micros": 35922, "output_level": 6, "num_output_files": 1, "total_output_size": 12274999, "num_input_records": 10079, "num_output_records": 9550, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534622605, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534622634, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 28 09:52:14 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.525551) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:14 np0005538513.localdomain sshd[290534]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:52:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:14.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:14 np0005538513.localdomain sshd[290534]: Accepted publickey for tripleo-admin from 192.168.122.11 port 50356 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:52:14 np0005538513.localdomain systemd-logind[764]: New session 65 of user tripleo-admin.
Nov 28 09:52:14 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 28 09:52:14 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 28 09:52:14 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 28 09:52:14 np0005538513.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 28 09:52:14 np0005538513.localdomain systemd[290538]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Queued start job for default target Main User Target.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Created slice User Application Slice.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Reached target Paths.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Reached target Timers.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Starting D-Bus User Message Bus Socket...
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Starting Create User's Volatile Files and Directories...
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Reached target Sockets.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Finished Create User's Volatile Files and Directories.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Reached target Basic System.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Reached target Main User Target.
Nov 28 09:52:15 np0005538513.localdomain systemd[290538]: Startup finished in 131ms.
Nov 28 09:52:15 np0005538513.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 28 09:52:15 np0005538513.localdomain systemd[1]: Started Session 65 of User tripleo-admin.
Nov 28 09:52:15 np0005538513.localdomain sshd[290534]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.205312) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535205391, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 292, "num_deletes": 251, "total_data_size": 91200, "memory_usage": 96920, "flush_reason": "Manual Compaction"}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535208605, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 59562, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12086, "largest_seqno": 12373, "table_properties": {"data_size": 57583, "index_size": 218, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5380, "raw_average_key_size": 19, "raw_value_size": 53620, "raw_average_value_size": 195, "num_data_blocks": 8, "num_entries": 274, "num_filter_entries": 274, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323534, "oldest_key_time": 1764323534, "file_creation_time": 1764323535, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 3333 microseconds, and 1071 cpu microseconds.
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.208655) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 59562 bytes OK
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.208677) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210271) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210293) EVENT_LOG_v1 {"time_micros": 1764323535210286, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210315) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 89037, prev total WAL file size 89037, number of live WAL files 2.
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210812) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(58KB)], [15(11MB)]
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535210844, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12334561, "oldest_snapshot_seqno": -1}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9307 keys, 11191532 bytes, temperature: kUnknown
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535289199, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11191532, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11140072, "index_size": 27103, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23301, "raw_key_size": 249390, "raw_average_key_size": 26, "raw_value_size": 10982149, "raw_average_value_size": 1179, "num_data_blocks": 1021, "num_entries": 9307, "num_filter_entries": 9307, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 0, "file_creation_time": 1764323535, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.289519) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11191532 bytes
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.291912) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.2 rd, 142.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 11.7 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(395.0) write-amplify(187.9) OK, records in: 9824, records dropped: 517 output_compression: NoCompression
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.291943) EVENT_LOG_v1 {"time_micros": 1764323535291929, "job": 6, "event": "compaction_finished", "compaction_time_micros": 78479, "compaction_time_cpu_micros": 32028, "output_level": 6, "num_output_files": 1, "total_output_size": 11191532, "num_input_records": 9824, "num_output_records": 9307, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535292113, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535293995, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294140) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:15 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:15 np0005538513.localdomain sudo[290679]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwesmpsutoiydcsoueufyckseljrtiyb ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323535.158339-62239-279432179788947/AnsiballZ_lineinfile.py
Nov 28 09:52:15 np0005538513.localdomain sudo[290679]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:52:15 np0005538513.localdomain python3[290681]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.103/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:52:15 np0005538513.localdomain sudo[290679]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:16 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:52:16 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:52:16 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:16 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:16 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:16 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:16 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:16 np0005538513.localdomain sudo[290825]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuowprtfrtulfrivgdpqstrxxrpvubjr ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323536.092501-62255-261415380333075/AnsiballZ_command.py
Nov 28 09:52:16 np0005538513.localdomain sudo[290825]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:52:16 np0005538513.localdomain python3[290827]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.103/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:52:16 np0005538513.localdomain sudo[290825]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:17 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:17 np0005538513.localdomain sudo[290970]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzffyqrjkoyahgmqfglgescbczkxfqts ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323536.9286172-62266-162166000819924/AnsiballZ_command.py
Nov 28 09:52:17 np0005538513.localdomain sudo[290970]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:52:17 np0005538513.localdomain python3[290972]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.103 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:52:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:17.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:52:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:52:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:52:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:52:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:18 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:52:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:52:18 np0005538513.localdomain sudo[290974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:18 np0005538513.localdomain sudo[290974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:18 np0005538513.localdomain sudo[290974]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:18 np0005538513.localdomain sudo[291012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:18 np0005538513.localdomain sudo[291012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:18 np0005538513.localdomain podman[290990]: 2025-11-28 09:52:18.869757298 +0000 UTC m=+0.097542303 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:52:18 np0005538513.localdomain podman[290990]: 2025-11-28 09:52:18.883527684 +0000 UTC m=+0.111312709 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:52:18 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:52:18 np0005538513.localdomain systemd[1]: tmp-crun.7Bs7yN.mount: Deactivated successfully.
Nov 28 09:52:18 np0005538513.localdomain podman[290991]: 2025-11-28 09:52:18.981296903 +0000 UTC m=+0.209184131 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 09:52:18 np0005538513.localdomain podman[290991]: 2025-11-28 09:52:18.992568432 +0000 UTC m=+0.220455670 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:52:19 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:52:19 np0005538513.localdomain podman[291070]: 
Nov 28 09:52:19 np0005538513.localdomain podman[291070]: 2025-11-28 09:52:19.399191058 +0000 UTC m=+0.085551922 container create 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, version=7)
Nov 28 09:52:19 np0005538513.localdomain systemd[1]: Started libpod-conmon-9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3.scope.
Nov 28 09:52:19 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:19 np0005538513.localdomain podman[291070]: 2025-11-28 09:52:19.362229723 +0000 UTC m=+0.048590617 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:19 np0005538513.localdomain podman[291070]: 2025-11-28 09:52:19.479052902 +0000 UTC m=+0.165413776 container init 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph)
Nov 28 09:52:19 np0005538513.localdomain podman[291070]: 2025-11-28 09:52:19.487877095 +0000 UTC m=+0.174237959 container start 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, GIT_BRANCH=main, release=553, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:19 np0005538513.localdomain podman[291070]: 2025-11-28 09:52:19.488120443 +0000 UTC m=+0.174481337 container attach 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553)
Nov 28 09:52:19 np0005538513.localdomain hungry_neumann[291085]: 167 167
Nov 28 09:52:19 np0005538513.localdomain systemd[1]: libpod-9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3.scope: Deactivated successfully.
Nov 28 09:52:19 np0005538513.localdomain podman[291070]: 2025-11-28 09:52:19.495269734 +0000 UTC m=+0.181630628 container died 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=553, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:52:19 np0005538513.localdomain podman[291090]: 2025-11-28 09:52:19.591933248 +0000 UTC m=+0.089258836 container remove 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:19 np0005538513.localdomain systemd[1]: libpod-conmon-9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3.scope: Deactivated successfully.
Nov 28 09:52:19 np0005538513.localdomain sudo[290970]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:19 np0005538513.localdomain sudo[291012]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:19 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:19 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:19 np0005538513.localdomain ceph-mon[287629]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:52:19 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:19 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:19 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:52:19 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:19.758 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:19 np0005538513.localdomain sudo[291123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:19 np0005538513.localdomain sudo[291123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:19 np0005538513.localdomain sudo[291123]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c7196ca7a0596408ad9b5a2c485d84bcc458f121f7d7d314aa3a45288d25201f-merged.mount: Deactivated successfully.
Nov 28 09:52:19 np0005538513.localdomain sudo[291142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:19 np0005538513.localdomain sudo[291142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:20 np0005538513.localdomain podman[291177]: 
Nov 28 09:52:20 np0005538513.localdomain podman[291177]: 2025-11-28 09:52:20.338002729 +0000 UTC m=+0.079793193 container create be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True)
Nov 28 09:52:20 np0005538513.localdomain systemd[1]: Started libpod-conmon-be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f.scope.
Nov 28 09:52:20 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:20 np0005538513.localdomain podman[291177]: 2025-11-28 09:52:20.304561113 +0000 UTC m=+0.046351617 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:20 np0005538513.localdomain podman[291177]: 2025-11-28 09:52:20.408901675 +0000 UTC m=+0.150692139 container init be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, architecture=x86_64, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container)
Nov 28 09:52:20 np0005538513.localdomain podman[291177]: 2025-11-28 09:52:20.419123282 +0000 UTC m=+0.160913746 container start be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, version=7, io.openshift.expose-services=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:20 np0005538513.localdomain podman[291177]: 2025-11-28 09:52:20.41938842 +0000 UTC m=+0.161178904 container attach be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, ceph=True, io.buildah.version=1.33.12)
Nov 28 09:52:20 np0005538513.localdomain dazzling_turing[291193]: 167 167
Nov 28 09:52:20 np0005538513.localdomain systemd[1]: libpod-be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f.scope: Deactivated successfully.
Nov 28 09:52:20 np0005538513.localdomain podman[291177]: 2025-11-28 09:52:20.422695212 +0000 UTC m=+0.164485706 container died be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, architecture=x86_64, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True)
Nov 28 09:52:20 np0005538513.localdomain podman[291198]: 2025-11-28 09:52:20.508799949 +0000 UTC m=+0.073932040 container remove be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.component=rhceph-container, release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True)
Nov 28 09:52:20 np0005538513.localdomain systemd[1]: libpod-conmon-be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f.scope: Deactivated successfully.
Nov 28 09:52:20 np0005538513.localdomain sudo[291142]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:20 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:20 np0005538513.localdomain ceph-mon[287629]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:52:20 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:52:20 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:20 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:52:20 np0005538513.localdomain ceph-mon[287629]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:20 np0005538513.localdomain sudo[291221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:20 np0005538513.localdomain sudo[291221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:20 np0005538513.localdomain sudo[291221]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4b391dbd2ed654773b7277cd1822234aea0448970ba6b0d093433c6ac0de246b-merged.mount: Deactivated successfully.
Nov 28 09:52:20 np0005538513.localdomain sudo[291239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:20 np0005538513.localdomain sudo[291239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:21 np0005538513.localdomain podman[291273]: 
Nov 28 09:52:21 np0005538513.localdomain podman[291273]: 2025-11-28 09:52:21.381670668 +0000 UTC m=+0.079365230 container create a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, RELEASE=main, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True)
Nov 28 09:52:21 np0005538513.localdomain systemd[1]: Started libpod-conmon-a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19.scope.
Nov 28 09:52:21 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:21 np0005538513.localdomain podman[291273]: 2025-11-28 09:52:21.446806696 +0000 UTC m=+0.144501258 container init a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container)
Nov 28 09:52:21 np0005538513.localdomain podman[291273]: 2025-11-28 09:52:21.348508161 +0000 UTC m=+0.046202783 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:21 np0005538513.localdomain podman[291273]: 2025-11-28 09:52:21.456146585 +0000 UTC m=+0.153841197 container start a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:52:21 np0005538513.localdomain podman[291273]: 2025-11-28 09:52:21.456426653 +0000 UTC m=+0.154121215 container attach a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=553, architecture=x86_64, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:21 np0005538513.localdomain trusting_nightingale[291288]: 167 167
Nov 28 09:52:21 np0005538513.localdomain systemd[1]: libpod-a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19.scope: Deactivated successfully.
Nov 28 09:52:21 np0005538513.localdomain podman[291273]: 2025-11-28 09:52:21.459282652 +0000 UTC m=+0.156977244 container died a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc.)
Nov 28 09:52:21 np0005538513.localdomain podman[291293]: 2025-11-28 09:52:21.551937892 +0000 UTC m=+0.085015374 container remove a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=)
Nov 28 09:52:21 np0005538513.localdomain systemd[1]: libpod-conmon-a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19.scope: Deactivated successfully.
Nov 28 09:52:21 np0005538513.localdomain sudo[291239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: Saving service mon spec with placement label:mon
Nov 28 09:52:21 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:21 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14a912ffd230ebe86e4792c06d4f8070f8a1280237a16266ce23f83fba99b912-merged.mount: Deactivated successfully.
Nov 28 09:52:21 np0005538513.localdomain sudo[291318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:21 np0005538513.localdomain sudo[291318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:21 np0005538513.localdomain sudo[291318]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:21 np0005538513.localdomain sudo[291336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:21 np0005538513.localdomain sudo[291336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:22 np0005538513.localdomain podman[291370]: 
Nov 28 09:52:22 np0005538513.localdomain podman[291370]: 2025-11-28 09:52:22.381343895 +0000 UTC m=+0.072210208 container create fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, version=7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:52:22 np0005538513.localdomain systemd[1]: Started libpod-conmon-fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669.scope.
Nov 28 09:52:22 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:22 np0005538513.localdomain podman[291370]: 2025-11-28 09:52:22.45062081 +0000 UTC m=+0.141487113 container init fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:22 np0005538513.localdomain podman[291370]: 2025-11-28 09:52:22.352306325 +0000 UTC m=+0.043172658 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:22 np0005538513.localdomain podman[291370]: 2025-11-28 09:52:22.458863046 +0000 UTC m=+0.149729349 container start fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, release=553, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:22 np0005538513.localdomain podman[291370]: 2025-11-28 09:52:22.459120553 +0000 UTC m=+0.149986906 container attach fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph)
Nov 28 09:52:22 np0005538513.localdomain sweet_galileo[291385]: 167 167
Nov 28 09:52:22 np0005538513.localdomain systemd[1]: libpod-fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669.scope: Deactivated successfully.
Nov 28 09:52:22 np0005538513.localdomain podman[291370]: 2025-11-28 09:52:22.461820997 +0000 UTC m=+0.152687340 container died fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, release=553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Nov 28 09:52:22 np0005538513.localdomain podman[291390]: 2025-11-28 09:52:22.559320648 +0000 UTC m=+0.085211651 container remove fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, release=553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:22 np0005538513.localdomain systemd[1]: libpod-conmon-fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669.scope: Deactivated successfully.
Nov 28 09:52:22 np0005538513.localdomain sudo[291336]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:22 np0005538513.localdomain sudo[291406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:22 np0005538513.localdomain sudo[291406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:22 np0005538513.localdomain sudo[291406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='client.34199 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:22 np0005538513.localdomain ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:22 np0005538513.localdomain sudo[291424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:22 np0005538513.localdomain sudo[291424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-76516cfe65d994939cb819dd7c2e351040b3c2cc9d88659c45a8e00945726c60-merged.mount: Deactivated successfully.
Nov 28 09:52:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:23.017 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:23 np0005538513.localdomain podman[291459]: 
Nov 28 09:52:23 np0005538513.localdomain podman[291459]: 2025-11-28 09:52:23.264877103 +0000 UTC m=+0.077629495 container create cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, ceph=True, distribution-scope=public)
Nov 28 09:52:23 np0005538513.localdomain systemd[1]: Started libpod-conmon-cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe.scope.
Nov 28 09:52:23 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:23 np0005538513.localdomain podman[291459]: 2025-11-28 09:52:23.329157415 +0000 UTC m=+0.141909817 container init cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:52:23 np0005538513.localdomain podman[291459]: 2025-11-28 09:52:23.233334647 +0000 UTC m=+0.046087069 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:23 np0005538513.localdomain podman[291459]: 2025-11-28 09:52:23.338273757 +0000 UTC m=+0.151026149 container start cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True)
Nov 28 09:52:23 np0005538513.localdomain podman[291459]: 2025-11-28 09:52:23.338603767 +0000 UTC m=+0.151356219 container attach cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:23 np0005538513.localdomain zen_hofstadter[291474]: 167 167
Nov 28 09:52:23 np0005538513.localdomain systemd[1]: libpod-cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe.scope: Deactivated successfully.
Nov 28 09:52:23 np0005538513.localdomain podman[291459]: 2025-11-28 09:52:23.343700535 +0000 UTC m=+0.156452987 container died cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public)
Nov 28 09:52:23 np0005538513.localdomain podman[291479]: 2025-11-28 09:52:23.4361982 +0000 UTC m=+0.083740465 container remove cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:52:23 np0005538513.localdomain systemd[1]: libpod-conmon-cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe.scope: Deactivated successfully.
Nov 28 09:52:23 np0005538513.localdomain sudo[291424]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:23 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.4 v2:172.18.0.106:3300/0
Nov 28 09:52:23 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Nov 28 09:52:23 np0005538513.localdomain ceph-mon[287629]: mon.np0005538513@4(peon) e8  removed from monmap, suicide.
Nov 28 09:52:23 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Nov 28 09:52:23 np0005538513.localdomain sudo[291496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:23 np0005538513.localdomain sudo[291495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:23 np0005538513.localdomain sudo[291496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:23 np0005538513.localdomain sudo[291495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:23 np0005538513.localdomain sudo[291496]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:23 np0005538513.localdomain sudo[291495]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:23 np0005538513.localdomain podman[291506]: 2025-11-28 09:52:23.666997169 +0000 UTC m=+0.063885439 container died b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, GIT_CLEAN=True)
Nov 28 09:52:23 np0005538513.localdomain podman[291506]: 2025-11-28 09:52:23.700679743 +0000 UTC m=+0.097567973 container remove b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, version=7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, release=553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Nov 28 09:52:23 np0005538513.localdomain sudo[291543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 --name mon.np0005538513 --force
Nov 28 09:52:23 np0005538513.localdomain sudo[291544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:52:23 np0005538513.localdomain sudo[291543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:23 np0005538513.localdomain sudo[291544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fa146ef048fc0f615b2043c07baf9f6ab786ee7824e42405da8d80c5752a9ca4-merged.mount: Deactivated successfully.
Nov 28 09:52:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438-merged.mount: Deactivated successfully.
Nov 28 09:52:24 np0005538513.localdomain sudo[291544]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538513.localdomain sudo[291683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:24 np0005538513.localdomain sudo[291683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538513.localdomain sudo[291683]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538513.localdomain sudo[291703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:24 np0005538513.localdomain sudo[291703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538513.localdomain sudo[291703]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538513.localdomain sudo[291727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:24 np0005538513.localdomain sudo[291727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538513.localdomain sudo[291727]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538513.localdomain sudo[291752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:24 np0005538513.localdomain sudo[291752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538513.localdomain sudo[291752]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1@mon.np0005538513.service: Deactivated successfully.
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: Stopped Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1@mon.np0005538513.service: Consumed 4.361s CPU time.
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:52:24 np0005538513.localdomain sudo[291771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:24 np0005538513.localdomain systemd-rc-local-generator[291811]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:52:24 np0005538513.localdomain systemd-sysv-generator[291818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:52:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:24.787 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:24 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:25 np0005538513.localdomain sudo[291771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291771]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291543]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:25 np0005538513.localdomain sudo[291843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291843]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:25 np0005538513.localdomain sudo[291861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291861]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:25 np0005538513.localdomain sudo[291879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291879]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:25 np0005538513.localdomain sudo[291897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291897]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:25 np0005538513.localdomain sudo[291915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291915]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538513.localdomain sudo[291933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291933]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:25 np0005538513.localdomain sudo[291951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291951]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[291969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538513.localdomain sudo[291969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[291969]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[292003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538513.localdomain sudo[292003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[292003]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[292021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538513.localdomain sudo[292021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[292021]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538513.localdomain sudo[292039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:25 np0005538513.localdomain sudo[292039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538513.localdomain sudo[292039]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:26 np0005538513.localdomain sudo[292057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:26 np0005538513.localdomain sudo[292057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:26 np0005538513.localdomain sudo[292057]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:28.018 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:29.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:29.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:29.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:52:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:29.790 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:30 np0005538513.localdomain sudo[292075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:30 np0005538513.localdomain sudo[292075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:30 np0005538513.localdomain sudo[292075]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:30 np0005538513.localdomain sudo[292093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:30 np0005538513.localdomain sudo[292093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:30 np0005538513.localdomain podman[292127]: 
Nov 28 09:52:30 np0005538513.localdomain podman[292127]: 2025-11-28 09:52:30.805723523 +0000 UTC m=+0.080115613 container create 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:30 np0005538513.localdomain systemd[1]: Started libpod-conmon-4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa.scope.
Nov 28 09:52:30 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:30 np0005538513.localdomain podman[292127]: 2025-11-28 09:52:30.771622687 +0000 UTC m=+0.046014817 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:30 np0005538513.localdomain podman[292127]: 2025-11-28 09:52:30.880500639 +0000 UTC m=+0.154892729 container init 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:52:30 np0005538513.localdomain podman[292127]: 2025-11-28 09:52:30.892576034 +0000 UTC m=+0.166968124 container start 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, GIT_CLEAN=True, release=553, distribution-scope=public, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12)
Nov 28 09:52:30 np0005538513.localdomain podman[292127]: 2025-11-28 09:52:30.892843262 +0000 UTC m=+0.167235402 container attach 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=553, ceph=True, name=rhceph, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:52:30 np0005538513.localdomain frosty_tharp[292142]: 167 167
Nov 28 09:52:30 np0005538513.localdomain systemd[1]: libpod-4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa.scope: Deactivated successfully.
Nov 28 09:52:30 np0005538513.localdomain podman[292127]: 2025-11-28 09:52:30.897099863 +0000 UTC m=+0.171492003 container died 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, version=7, io.buildah.version=1.33.12, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Nov 28 09:52:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:52:31 np0005538513.localdomain podman[292147]: 2025-11-28 09:52:31.013486589 +0000 UTC m=+0.108150261 container remove 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, name=rhceph, version=7, release=553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7)
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: libpod-conmon-4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa.scope: Deactivated successfully.
Nov 28 09:52:31 np0005538513.localdomain sudo[292093]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:31 np0005538513.localdomain podman[292161]: 2025-11-28 09:52:31.083438156 +0000 UTC m=+0.081665281 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public)
Nov 28 09:52:31 np0005538513.localdomain podman[292161]: 2025-11-28 09:52:31.093200628 +0000 UTC m=+0.091427793 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41)
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:52:31 np0005538513.localdomain sudo[292182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:31 np0005538513.localdomain sudo[292182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:31 np0005538513.localdomain sudo[292182]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:31 np0005538513.localdomain sudo[292200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:31 np0005538513.localdomain sudo[292200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:31 np0005538513.localdomain podman[292234]: 
Nov 28 09:52:31 np0005538513.localdomain podman[292234]: 2025-11-28 09:52:31.715223996 +0000 UTC m=+0.077574573 container create 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=553, GIT_CLEAN=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, ceph=True)
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: Started libpod-conmon-32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6.scope.
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:31.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:31.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:31 np0005538513.localdomain podman[292234]: 2025-11-28 09:52:31.684718702 +0000 UTC m=+0.047069299 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:31 np0005538513.localdomain podman[292234]: 2025-11-28 09:52:31.787069672 +0000 UTC m=+0.149420259 container init 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, vendor=Red Hat, Inc., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Nov 28 09:52:31 np0005538513.localdomain podman[292234]: 2025-11-28 09:52:31.798124634 +0000 UTC m=+0.160475211 container start 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, name=rhceph, vendor=Red Hat, Inc., ceph=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=)
Nov 28 09:52:31 np0005538513.localdomain podman[292234]: 2025-11-28 09:52:31.798384412 +0000 UTC m=+0.160735029 container attach 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph)
Nov 28 09:52:31 np0005538513.localdomain stoic_brattain[292250]: 167 167
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: libpod-32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6.scope: Deactivated successfully.
Nov 28 09:52:31 np0005538513.localdomain podman[292234]: 2025-11-28 09:52:31.802101027 +0000 UTC m=+0.164451644 container died 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., release=553, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d2472805b6e65faea0a922838297e0a878fcac27f317ddbce17813762cb01898-merged.mount: Deactivated successfully.
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-8ca598cba0f76d13ff7187c0eeb32ff3a2065c42911d66d61cb0ab9a61e3a769-merged.mount: Deactivated successfully.
Nov 28 09:52:31 np0005538513.localdomain podman[292255]: 2025-11-28 09:52:31.915559952 +0000 UTC m=+0.097761469 container remove 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12)
Nov 28 09:52:31 np0005538513.localdomain systemd[1]: libpod-conmon-32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6.scope: Deactivated successfully.
Nov 28 09:52:32 np0005538513.localdomain sudo[292200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:32 np0005538513.localdomain sudo[292279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:32 np0005538513.localdomain sudo[292279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:32 np0005538513.localdomain sudo[292279]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:32 np0005538513.localdomain sudo[292297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:32 np0005538513.localdomain sudo[292297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:32.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:32.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:32 np0005538513.localdomain podman[292332]: 
Nov 28 09:52:32 np0005538513.localdomain podman[292332]: 2025-11-28 09:52:32.794161659 +0000 UTC m=+0.082732285 container create b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12)
Nov 28 09:52:32 np0005538513.localdomain systemd[1]: Started libpod-conmon-b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820.scope.
Nov 28 09:52:32 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:32 np0005538513.localdomain podman[292332]: 2025-11-28 09:52:32.857352465 +0000 UTC m=+0.145923081 container init b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:52:32 np0005538513.localdomain podman[292332]: 2025-11-28 09:52:32.75936062 +0000 UTC m=+0.047931246 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:32 np0005538513.localdomain podman[292332]: 2025-11-28 09:52:32.870100981 +0000 UTC m=+0.158671607 container start b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7)
Nov 28 09:52:32 np0005538513.localdomain compassionate_darwin[292347]: 167 167
Nov 28 09:52:32 np0005538513.localdomain podman[292332]: 2025-11-28 09:52:32.87040683 +0000 UTC m=+0.158977486 container attach b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:32 np0005538513.localdomain systemd[1]: libpod-b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820.scope: Deactivated successfully.
Nov 28 09:52:32 np0005538513.localdomain podman[292332]: 2025-11-28 09:52:32.876286642 +0000 UTC m=+0.164857258 container died b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-type=git, ceph=True)
Nov 28 09:52:32 np0005538513.localdomain podman[292352]: 2025-11-28 09:52:32.972255265 +0000 UTC m=+0.086439409 container remove b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph)
Nov 28 09:52:32 np0005538513.localdomain systemd[1]: libpod-conmon-b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820.scope: Deactivated successfully.
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.058 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:33 np0005538513.localdomain sudo[292297]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:33 np0005538513.localdomain sudo[292375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:33 np0005538513.localdomain sudo[292375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:33 np0005538513.localdomain sudo[292375]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:33 np0005538513.localdomain sudo[292393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:33 np0005538513.localdomain sudo[292393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.798 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:52:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:33.799 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:52:33 np0005538513.localdomain podman[292427]: 
Nov 28 09:52:33 np0005538513.localdomain podman[292427]: 2025-11-28 09:52:33.836122014 +0000 UTC m=+0.080546485 container create f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=)
Nov 28 09:52:33 np0005538513.localdomain systemd[1]: tmp-crun.FbRJ4q.mount: Deactivated successfully.
Nov 28 09:52:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-881907754b830c474470259834474c34dad40024dd2547f3ed3cbcf3fffcda2f-merged.mount: Deactivated successfully.
Nov 28 09:52:33 np0005538513.localdomain systemd[1]: Started libpod-conmon-f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8.scope.
Nov 28 09:52:33 np0005538513.localdomain podman[292427]: 2025-11-28 09:52:33.804121543 +0000 UTC m=+0.048546044 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:33 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:33 np0005538513.localdomain podman[292427]: 2025-11-28 09:52:33.929333662 +0000 UTC m=+0.173758153 container init f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12)
Nov 28 09:52:33 np0005538513.localdomain podman[292427]: 2025-11-28 09:52:33.941825958 +0000 UTC m=+0.186250429 container start f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container)
Nov 28 09:52:33 np0005538513.localdomain podman[292427]: 2025-11-28 09:52:33.942169059 +0000 UTC m=+0.186593530 container attach f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-09-24T08:57:55, name=rhceph, GIT_BRANCH=main, version=7, vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:52:33 np0005538513.localdomain angry_mccarthy[292443]: 167 167
Nov 28 09:52:33 np0005538513.localdomain systemd[1]: libpod-f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8.scope: Deactivated successfully.
Nov 28 09:52:33 np0005538513.localdomain podman[292427]: 2025-11-28 09:52:33.944968376 +0000 UTC m=+0.189392837 container died f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, release=553, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:52:34 np0005538513.localdomain podman[292460]: 2025-11-28 09:52:34.029483385 +0000 UTC m=+0.074690265 container remove f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, release=553, GIT_CLEAN=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True)
Nov 28 09:52:34 np0005538513.localdomain systemd[1]: libpod-conmon-f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8.scope: Deactivated successfully.
Nov 28 09:52:34 np0005538513.localdomain sudo[292393]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:34 np0005538513.localdomain sudo[292485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:34 np0005538513.localdomain sudo[292485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:34 np0005538513.localdomain sudo[292485]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:34 np0005538513.localdomain sudo[292503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:34 np0005538513.localdomain sudo[292503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.325 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.391 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.392 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.703 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.706 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11838MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.707 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.707 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.799 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.823 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:34 np0005538513.localdomain systemd[1]: tmp-crun.SiymzB.mount: Deactivated successfully.
Nov 28 09:52:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-820f235033de985fc02ee247ea603f4d95f34deb878d756e05cb2bed4a5ee24e-merged.mount: Deactivated successfully.
Nov 28 09:52:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:34.855 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:52:34 np0005538513.localdomain podman[292540]: 
Nov 28 09:52:34 np0005538513.localdomain podman[292540]: 2025-11-28 09:52:34.875576214 +0000 UTC m=+0.101336711 container create 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-type=git)
Nov 28 09:52:34 np0005538513.localdomain systemd[1]: Started libpod-conmon-1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1.scope.
Nov 28 09:52:34 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:34 np0005538513.localdomain podman[292540]: 2025-11-28 09:52:34.841573399 +0000 UTC m=+0.067333936 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:34 np0005538513.localdomain podman[292540]: 2025-11-28 09:52:34.943552699 +0000 UTC m=+0.169313156 container init 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:52:34 np0005538513.localdomain podman[292540]: 2025-11-28 09:52:34.951941109 +0000 UTC m=+0.177701566 container start 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, distribution-scope=public, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True)
Nov 28 09:52:34 np0005538513.localdomain podman[292540]: 2025-11-28 09:52:34.952221397 +0000 UTC m=+0.177981854 container attach 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7)
Nov 28 09:52:34 np0005538513.localdomain reverent_greider[292553]: 167 167
Nov 28 09:52:34 np0005538513.localdomain systemd[1]: libpod-1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1.scope: Deactivated successfully.
Nov 28 09:52:34 np0005538513.localdomain podman[292540]: 2025-11-28 09:52:34.958575874 +0000 UTC m=+0.184336421 container died 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_BRANCH=main, version=7, name=rhceph)
Nov 28 09:52:35 np0005538513.localdomain podman[292559]: 2025-11-28 09:52:35.06882209 +0000 UTC m=+0.094039775 container remove 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, release=553, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Nov 28 09:52:35 np0005538513.localdomain systemd[1]: libpod-conmon-1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1.scope: Deactivated successfully.
Nov 28 09:52:35 np0005538513.localdomain sudo[292503]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:35.298 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:52:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:35.306 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:52:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:35.323 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:52:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:35.326 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:52:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:35.326 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:52:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:52:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cb887550d0d9e4663a530abb81fb3d5212ae9ad3989098cf7d46352fcd6c38a2-merged.mount: Deactivated successfully.
Nov 28 09:52:35 np0005538513.localdomain podman[292594]: 2025-11-28 09:52:35.858841372 +0000 UTC m=+0.087728250 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:52:35 np0005538513.localdomain podman[292594]: 2025-11-28 09:52:35.867382186 +0000 UTC m=+0.096269114 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:52:35 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:52:36 np0005538513.localdomain sudo[292616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:36 np0005538513.localdomain sudo[292616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:36 np0005538513.localdomain sudo[292616]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:36 np0005538513.localdomain sudo[292634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:36 np0005538513.localdomain sudo[292634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:37 np0005538513.localdomain podman[292693]: 
Nov 28 09:52:37 np0005538513.localdomain podman[292693]: 2025-11-28 09:52:37.538823021 +0000 UTC m=+0.079598806 container create 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, version=7, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 28 09:52:37 np0005538513.localdomain systemd[1]: Started libpod-conmon-6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd.scope.
Nov 28 09:52:37 np0005538513.localdomain podman[292693]: 2025-11-28 09:52:37.503976102 +0000 UTC m=+0.044751917 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:37 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:37 np0005538513.localdomain podman[292693]: 2025-11-28 09:52:37.61917897 +0000 UTC m=+0.159954755 container init 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:37 np0005538513.localdomain podman[292693]: 2025-11-28 09:52:37.629144469 +0000 UTC m=+0.169920254 container start 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, release=553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, RELEASE=main, ceph=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=)
Nov 28 09:52:37 np0005538513.localdomain podman[292693]: 2025-11-28 09:52:37.629436049 +0000 UTC m=+0.170211884 container attach 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhceph, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container)
Nov 28 09:52:37 np0005538513.localdomain keen_blackburn[292708]: 167 167
Nov 28 09:52:37 np0005538513.localdomain systemd[1]: libpod-6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd.scope: Deactivated successfully.
Nov 28 09:52:37 np0005538513.localdomain podman[292693]: 2025-11-28 09:52:37.633504084 +0000 UTC m=+0.174279919 container died 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, architecture=x86_64, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:37 np0005538513.localdomain podman[292713]: 2025-11-28 09:52:37.73151354 +0000 UTC m=+0.086262463 container remove 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:52:37 np0005538513.localdomain systemd[1]: libpod-conmon-6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd.scope: Deactivated successfully.
Nov 28 09:52:37 np0005538513.localdomain podman[292730]: 
Nov 28 09:52:37 np0005538513.localdomain podman[292730]: 2025-11-28 09:52:37.842127296 +0000 UTC m=+0.070244467 container create bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:37 np0005538513.localdomain systemd[1]: Started libpod-conmon-bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c.scope.
Nov 28 09:52:37 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:37 np0005538513.localdomain podman[292730]: 2025-11-28 09:52:37.894654263 +0000 UTC m=+0.122771474 container init bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, architecture=x86_64, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:52:37 np0005538513.localdomain podman[292730]: 2025-11-28 09:52:37.901101804 +0000 UTC m=+0.129219005 container start bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:52:37 np0005538513.localdomain podman[292730]: 2025-11-28 09:52:37.90130569 +0000 UTC m=+0.129422891 container attach bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:37 np0005538513.localdomain podman[292730]: 2025-11-28 09:52:37.822050824 +0000 UTC m=+0.050168025 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: libpod-bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c.scope: Deactivated successfully.
Nov 28 09:52:38 np0005538513.localdomain podman[292730]: 2025-11-28 09:52:38.002608378 +0000 UTC m=+0.230725619 container died bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, architecture=x86_64, release=553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7)
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.057 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:38 np0005538513.localdomain podman[292771]: 2025-11-28 09:52:38.083253526 +0000 UTC m=+0.072285270 container remove bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, ceph=True, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55)
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: libpod-conmon-bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c.scope: Deactivated successfully.
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:52:38 np0005538513.localdomain systemd-sysv-generator[292817]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:52:38 np0005538513.localdomain systemd-rc-local-generator[292811]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.327 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.328 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.328 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.402 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.402 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.402 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.403 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-36ebc93b29125860e1b39929ac9957a9d2872b5ba394384215c9eabc3709c0ef-merged.mount: Deactivated successfully.
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: Reloading.
Nov 28 09:52:38 np0005538513.localdomain systemd-sysv-generator[292857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:52:38 np0005538513.localdomain systemd-rc-local-generator[292849]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.784 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.798 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:52:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:38.799 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:52:38 np0005538513.localdomain systemd[1]: Starting Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:52:39 np0005538513.localdomain podman[292915]: 
Nov 28 09:52:39 np0005538513.localdomain podman[292915]: 2025-11-28 09:52:39.203394354 +0000 UTC m=+0.076246653 container create ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Nov 28 09:52:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:52:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:52:39 np0005538513.localdomain systemd[1]: tmp-crun.4AzS7j.mount: Deactivated successfully.
Nov 28 09:52:39 np0005538513.localdomain podman[292915]: 2025-11-28 09:52:39.17262642 +0000 UTC m=+0.045478809 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:39 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:39 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:39 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:39 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:52:39 np0005538513.localdomain podman[292915]: 2025-11-28 09:52:39.290569845 +0000 UTC m=+0.163422254 container init ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph)
Nov 28 09:52:39 np0005538513.localdomain podman[292929]: 2025-11-28 09:52:39.334378311 +0000 UTC m=+0.090140533 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pidfile_write: ignore empty --pid-file
Nov 28 09:52:39 np0005538513.localdomain podman[292929]: 2025-11-28 09:52:39.340303055 +0000 UTC m=+0.096065287 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: load: jerasure load: lrc 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: RocksDB version: 7.9.2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Git sha 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: DB SUMMARY
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: DB Session ID:  MM4LCQC4OTZXQR5A0TS6
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: CURRENT file:  CURRENT
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005538513/store.db dir, Total Num: 0, files: 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005538513/store.db: 000004.log size: 886 ; 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                         Options.error_if_exists: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                       Options.create_if_missing: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                                     Options.env: 0x55b5b91039e0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                                Options.info_log: 0x55b5bb67cd20
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                              Options.statistics: (nil)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                               Options.use_fsync: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                              Options.db_log_dir: 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                                 Options.wal_dir: 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                    Options.write_buffer_manager: 0x55b5bb68d540
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.unordered_write: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                               Options.row_cache: None
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                              Options.wal_filter: None
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.two_write_queues: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.wal_compression: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.atomic_flush: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.max_background_jobs: 2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.max_background_compactions: -1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.max_subcompactions: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.max_total_wal_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                          Options.max_open_files: -1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:       Options.compaction_readahead_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Compression algorithms supported:
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kZSTD supported: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kXpressCompression supported: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kBZip2Compression supported: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kLZ4Compression supported: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kZlibCompression supported: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         kSnappyCompression supported: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:           Options.merge_operator: 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:        Options.compaction_filter: None
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b5bb67c980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x55b5bb679350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:        Options.write_buffer_size: 33554432
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:  Options.max_write_buffer_number: 2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:          Options.compression: NoCompression
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.num_levels: 7
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                           Options.bloom_locality: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                               Options.ttl: 2592000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                       Options.enable_blob_files: false
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                           Options.min_blob_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 49d3ae8b-2ff6-4713-88ed-5986b1f8221e
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323559348949, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323559353286, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323559353426, "job": 1, "event": "recovery_finished"}
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 28 09:52:39 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55b5bb6a0e00
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: DB pointer 0x55b5bb796000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 does not exist in monmap, will attempt to join an existing cluster
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.96 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                            Sum      1/0    1.96 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55b5bb679350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.4e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1,1.08 KB,0.000205636%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: using public_addr v2:172.18.0.103:0/0 -> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: starting mon.np0005538513 rank -1 at public addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] at bind addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005538513 fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(???) e0 preinit fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing) e8 sync_obtain_latest_monmap
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing) e8 sync_obtain_latest_monmap obtained monmap e8
Nov 28 09:52:39 np0005538513.localdomain podman[292927]: 2025-11-28 09:52:39.393873084 +0000 UTC m=+0.150744210 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:52:39 np0005538513.localdomain podman[292915]: 2025-11-28 09:52:39.402049817 +0000 UTC m=+0.274902156 container start ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, com.redhat.component=rhceph-container, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.33.12, release=553)
Nov 28 09:52:39 np0005538513.localdomain bash[292915]: ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50
Nov 28 09:52:39 np0005538513.localdomain systemd[1]: Started Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:52:39 np0005538513.localdomain podman[292927]: 2025-11-28 09:52:39.458661402 +0000 UTC m=+0.215532528 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:52:39 np0005538513.localdomain sudo[292634]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:39 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing).mds e17 new map
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        15
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-11-28T08:07:30.958224+0000
                                                           modified        2025-11-28T09:49:53.259185+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        83
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26449}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26449 members: 26449
                                                           [mds.mds.np0005538514.umgtoy{0:26449} state up:active seq 12 addr [v2:172.18.0.107:6808/1969410151,v1:172.18.0.107:6809/1969410151] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005538513.yljthc{-1:16968} state up:standby seq 1 addr [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005538515.anvatb{-1:26446} state up:standby seq 1 addr [v2:172.18.0.108:6808/2640180,v1:172.18.0.108:6809/2640180] compat {c=[1],r=[1],i=[17ff]}]
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 3314933000852226048, adjusting msgr requires
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.34197 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removed label mon from host np0005538510.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: monmap epoch 7
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:51:48.586207+0000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005538513
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mgrmap e19: np0005538512.zyhkxs(active, since 23s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Cluster is now healthy
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.34169 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removed label mgr from host np0005538510.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removed label _admin from host np0005538510.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing daemon mgr.np0005538510.nzitwz from np0005538510.localdomain -- ports [9283, 8765]
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.26795 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538510.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Added label _no_schedule to host np0005538510.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538510.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing key for mgr.np0005538510.nzitwz
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"}]': finished
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.26805 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538510.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing daemon crash.np0005538510 from np0005538510.localdomain -- ports []
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"}]': finished
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"}]': finished
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.34193 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538510.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing key for client.crash.np0005538510.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removed host np0005538510.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1979550132' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1979550132' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.34199 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.26581 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538513"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Remove daemons mon.np0005538513
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Safe to remove mon.np0005538513: new quorum should be ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514'] (from ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514'])
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing monitor np0005538513 from monmap...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Removing daemon mon.np0005538513 from np0005538513.localdomain -- ports []
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: monmap epoch 8
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:52:23.566128+0000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mgrmap e19: np0005538512.zyhkxs(active, since 51s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1522556299' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3135871719' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2185645533' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1971507369' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2894340973' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2401285035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='client.34266 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538513.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Deploying daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(synchronizing).paxosservice(auth 1..37) refresh upgraded, format 0 -> 3
Nov 28 09:52:39 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Nov 28 09:52:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:39.825 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:52:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:52:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:52:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:52:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:52:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18696 "" "Go-http-client/1.1"
Nov 28 09:52:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:52:40 np0005538513.localdomain systemd[1]: tmp-crun.eahCiq.mount: Deactivated successfully.
Nov 28 09:52:40 np0005538513.localdomain podman[293015]: 2025-11-28 09:52:40.859577658 +0000 UTC m=+0.097061448 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:52:40 np0005538513.localdomain podman[293015]: 2025-11-28 09:52:40.871415894 +0000 UTC m=+0.108899684 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 28 09:52:40 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:52:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@-1(probing) e9  my rank is now 4 (was -1)
Nov 28 09:52:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:52:41 np0005538513.localdomain ceph-mon[292954]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 28 09:52:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:43.093 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:44.870 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: monmap epoch 9
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:52:39.794263+0000
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mgrmap e19: np0005538512.zyhkxs(active, since 72s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: Health check failed: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514 (MON_DOWN)
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]:     mon.np0005538513 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:46 np0005538513.localdomain ceph-mon[292954]: mgrc update_daemon_metadata mon.np0005538513 metadata {addrs=[v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005538513.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005538513.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4)
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: monmap epoch 9
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:52:39.794263+0000
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: mgrmap e19: np0005538512.zyhkxs(active, since 74s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514)
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: Cluster is now healthy
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:52:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:52:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:52:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:52:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:52:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:52:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:52:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:48.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:48 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:52:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:48 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:52:48 np0005538513.localdomain ceph-mon[292954]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:52:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:52:49 np0005538513.localdomain podman[293035]: 2025-11-28 09:52:49.854197348 +0000 UTC m=+0.086272982 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:52:49 np0005538513.localdomain podman[293035]: 2025-11-28 09:52:49.866054746 +0000 UTC m=+0.098130370 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:52:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:49.872 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:49 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:52:49 np0005538513.localdomain systemd[1]: tmp-crun.KIUrbA.mount: Deactivated successfully.
Nov 28 09:52:49 np0005538513.localdomain podman[293034]: 2025-11-28 09:52:49.963775523 +0000 UTC m=+0.198868071 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:52:49 np0005538513.localdomain podman[293034]: 2025-11-28 09:52:49.973967589 +0000 UTC m=+0.209060137 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:52:49 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:50 np0005538513.localdomain sudo[293063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:50 np0005538513.localdomain sudo[293063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:50 np0005538513.localdomain sudo[293063]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:50 np0005538513.localdomain sudo[293093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:52:50 np0005538513.localdomain sudo[293093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:52:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:52:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:52:50.831 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:52:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:52:50.833 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:52:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon) e9 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Nov 28 09:52:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/438518273' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:52:51 np0005538513.localdomain ceph-mon[292954]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/438518273' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:52:51 np0005538513.localdomain podman[293181]: 2025-11-28 09:52:51.035313366 +0000 UTC m=+0.105369266 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, io.buildah.version=1.33.12, version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Nov 28 09:52:51 np0005538513.localdomain podman[293181]: 2025-11-28 09:52:51.137963776 +0000 UTC m=+0.208019636 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, vendor=Red Hat, Inc., release=553, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, ceph=True)
Nov 28 09:52:51 np0005538513.localdomain sudo[293093]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:51 np0005538513.localdomain sudo[293302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:51 np0005538513.localdomain sudo[293302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:51 np0005538513.localdomain sudo[293302]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:51 np0005538513.localdomain sudo[293320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:52:51 np0005538513.localdomain sudo[293320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:52 np0005538513.localdomain sudo[293320]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:52 np0005538513.localdomain sudo[293370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:52 np0005538513.localdomain sudo[293370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:52 np0005538513.localdomain sudo[293370]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:52 np0005538513.localdomain sudo[293388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:52 np0005538513.localdomain sudo[293388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:52 np0005538513.localdomain sudo[293388]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:52 np0005538513.localdomain sudo[293406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:52 np0005538513.localdomain sudo[293406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:52 np0005538513.localdomain sudo[293406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:53 np0005538513.localdomain sudo[293424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293424]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:53 np0005538513.localdomain sudo[293442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293442]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:53.136 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:53 np0005538513.localdomain sudo[293476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:53 np0005538513.localdomain sudo[293476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293476]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:53 np0005538513.localdomain sudo[293494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293494]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538513.localdomain sudo[293512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293512]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:53 np0005538513.localdomain sudo[293530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293530]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:53 np0005538513.localdomain sudo[293548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:53 np0005538513.localdomain sudo[293566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293566]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain sudo[293584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:53 np0005538513.localdomain sudo[293584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293584]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538513.localdomain ceph-mon[292954]: from='client.26865 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:53 np0005538513.localdomain ceph-mon[292954]: Reconfig service osd.default_drive_group
Nov 28 09:52:53 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538513.localdomain sudo[293602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:53 np0005538513.localdomain sudo[293602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538513.localdomain sudo[293602]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538513.localdomain sudo[293636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:54 np0005538513.localdomain sudo[293636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538513.localdomain sudo[293636]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538513.localdomain sudo[293654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:54 np0005538513.localdomain sudo[293654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538513.localdomain sudo[293654]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e86 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e86 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 e87: 6 total, 6 up, 6 in
Nov 28 09:52:54 np0005538513.localdomain sshd[288311]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:52:54 np0005538513.localdomain systemd-logind[764]: Session 64 logged out. Waiting for processes to exit.
Nov 28 09:52:54 np0005538513.localdomain sudo[293672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:54 np0005538513.localdomain sudo[293672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538513.localdomain sudo[293672]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538513.localdomain systemd[1]: session-64.scope: Deactivated successfully.
Nov 28 09:52:54 np0005538513.localdomain systemd[1]: session-64.scope: Consumed 29.579s CPU time.
Nov 28 09:52:54 np0005538513.localdomain systemd-logind[764]: Removed session 64.
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1019519450 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:54 np0005538513.localdomain sshd[293690]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:52:54 np0005538513.localdomain sshd[293690]: Accepted publickey for ceph-admin from 192.168.122.107 port 51506 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:52:54 np0005538513.localdomain systemd-logind[764]: New session 67 of user ceph-admin.
Nov 28 09:52:54 np0005538513.localdomain systemd[1]: Started Session 67 of User ceph-admin.
Nov 28 09:52:54 np0005538513.localdomain sshd[293690]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:52:54 np0005538513.localdomain sudo[293694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:54 np0005538513.localdomain sudo[293694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538513.localdomain sudo[293694]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538513.localdomain sudo[293712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:52:54 np0005538513.localdomain sudo[293712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:54.901 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1216930330' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: Activating manager daemon np0005538514.djozup
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: osdmap e87: 6 total, 6 up, 6 in
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1216930330' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: mgrmap e20: np0005538514.djozup(active, starting, since 0.0638751s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: Manager daemon np0005538514.djozup is now available
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: removing stray HostCache host record np0005538510.localdomain.devices.0
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"}]': finished
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"}]': finished
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:52:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:52:55 np0005538513.localdomain podman[293799]: 2025-11-28 09:52:55.71800585 +0000 UTC m=+0.077776641 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, release=553, RELEASE=main, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git)
Nov 28 09:52:55 np0005538513.localdomain podman[293799]: 2025-11-28 09:52:55.828120671 +0000 UTC m=+0.187891482 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=)
Nov 28 09:52:56 np0005538513.localdomain ceph-mon[292954]: mgrmap e21: np0005538514.djozup(active, since 1.14261s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:56 np0005538513.localdomain ceph-mon[292954]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:56 np0005538513.localdomain sudo[293712]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:56 np0005538513.localdomain sudo[293918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:56 np0005538513.localdomain sudo[293918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:56 np0005538513.localdomain sudo[293918]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:56 np0005538513.localdomain sudo[293936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:52:56 np0005538513.localdomain sudo[293936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:52:55] ENGINE Bus STARTING
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Serving on http://172.18.0.107:8765
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Serving on https://172.18.0.107:7150
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Bus STARTED
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Client ('172.18.0.107', 40776) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538513.localdomain sudo[293936]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:57 np0005538513.localdomain sudo[293986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:57 np0005538513.localdomain sudo[293986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:57 np0005538513.localdomain sudo[293986]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:57 np0005538513.localdomain sudo[294004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:52:57 np0005538513.localdomain sudo[294004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:57 np0005538513.localdomain sudo[294004]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:58 np0005538513.localdomain sudo[294040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294040]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:58.141 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:58 np0005538513.localdomain sudo[294058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:58 np0005538513.localdomain sudo[294058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294058]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538513.localdomain sudo[294076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: mgrmap e22: np0005538514.djozup(active, since 3s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:58 np0005538513.localdomain sudo[294094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:58 np0005538513.localdomain sudo[294094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294094]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538513.localdomain sudo[294112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294112]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538513.localdomain sudo[294146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294146]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538513.localdomain sudo[294164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294164]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:58 np0005538513.localdomain sudo[294182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294182]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:58 np0005538513.localdomain sudo[294200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538513.localdomain sudo[294218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:58 np0005538513.localdomain sudo[294218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538513.localdomain sudo[294218]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:59 np0005538513.localdomain sudo[294236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294236]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:59 np0005538513.localdomain sudo[294254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294254]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:59 np0005538513.localdomain sudo[294272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294272]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: mgrmap e23: np0005538514.djozup(active, since 4s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: Standby manager daemon np0005538512.zyhkxs started
Nov 28 09:52:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020040748 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:59 np0005538513.localdomain sudo[294306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:59 np0005538513.localdomain sudo[294306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294306]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:59 np0005538513.localdomain sudo[294324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294324]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:59 np0005538513.localdomain sudo[294342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294342]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:59 np0005538513.localdomain sudo[294360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294360]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:59 np0005538513.localdomain sudo[294378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294378]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:52:59 np0005538513.localdomain sudo[294396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294396]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:52:59.904 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:52:59 np0005538513.localdomain sudo[294414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294414]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538513.localdomain sudo[294432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:52:59 np0005538513.localdomain sudo[294432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538513.localdomain sudo[294432]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538513.localdomain sudo[294466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294466]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538513.localdomain sudo[294484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294484]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538513.localdomain sudo[294502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294502]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: mgrmap e24: np0005538514.djozup(active, since 6s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:53:00 np0005538513.localdomain sudo[294520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:00 np0005538513.localdomain sudo[294520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294520]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:00 np0005538513.localdomain sudo[294538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294538]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538513.localdomain sudo[294556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294556]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:00 np0005538513.localdomain sudo[294574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294574]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538513.localdomain sudo[294592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294592]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538513.localdomain sudo[294626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294626]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538513.localdomain sudo[294644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538513.localdomain sudo[294644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538513.localdomain sudo[294644]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:01 np0005538513.localdomain sudo[294662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538513.localdomain sudo[294662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:01 np0005538513.localdomain sudo[294662]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 28 09:53:01 np0005538513.localdomain ceph-mon[292954]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 28 09:53:01 np0005538513.localdomain sudo[294680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:01 np0005538513.localdomain sudo[294680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:53:01 np0005538513.localdomain sudo[294680]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:01 np0005538513.localdomain podman[294698]: 2025-11-28 09:53:01.574330248 +0000 UTC m=+0.093222378 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Nov 28 09:53:01 np0005538513.localdomain podman[294698]: 2025-11-28 09:53:01.620418226 +0000 UTC m=+0.139310386 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350)
Nov 28 09:53:01 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:53:02 np0005538513.localdomain ceph-mon[292954]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 0 B/s wr, 19 op/s
Nov 28 09:53:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:03.143 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054355 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:04.928 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:05 np0005538513.localdomain sudo[294720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:05 np0005538513.localdomain sudo[294720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:05 np0005538513.localdomain sudo[294720]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:05 np0005538513.localdomain sudo[294738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:05 np0005538513.localdomain sudo[294738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:05 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:53:05 np0005538513.localdomain podman[294773]: 
Nov 28 09:53:05 np0005538513.localdomain podman[294773]: 2025-11-28 09:53:05.860239403 +0000 UTC m=+0.095351775 container create db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, GIT_CLEAN=True, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, release=553, io.buildah.version=1.33.12)
Nov 28 09:53:05 np0005538513.localdomain systemd[1]: Started libpod-conmon-db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285.scope.
Nov 28 09:53:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:53:05 np0005538513.localdomain podman[294773]: 2025-11-28 09:53:05.818128648 +0000 UTC m=+0.053240990 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:05 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:05 np0005538513.localdomain podman[294773]: 2025-11-28 09:53:05.945370689 +0000 UTC m=+0.180483011 container init db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, distribution-scope=public, vcs-type=git, release=553)
Nov 28 09:53:05 np0005538513.localdomain podman[294773]: 2025-11-28 09:53:05.957130164 +0000 UTC m=+0.192242476 container start db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, release=553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:53:05 np0005538513.localdomain podman[294773]: 2025-11-28 09:53:05.957394412 +0000 UTC m=+0.192506724 container attach db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Nov 28 09:53:05 np0005538513.localdomain naughty_chaplygin[294788]: 167 167
Nov 28 09:53:05 np0005538513.localdomain systemd[1]: libpod-db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285.scope: Deactivated successfully.
Nov 28 09:53:05 np0005538513.localdomain podman[294773]: 2025-11-28 09:53:05.964827362 +0000 UTC m=+0.199939674 container died db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:53:06 np0005538513.localdomain podman[294789]: 2025-11-28 09:53:06.034146429 +0000 UTC m=+0.109850153 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:53:06 np0005538513.localdomain podman[294804]: 2025-11-28 09:53:06.082455146 +0000 UTC m=+0.106364266 container remove db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, CEPH_POINT_RELEASE=, architecture=x86_64, release=553, ceph=True, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:06 np0005538513.localdomain systemd[1]: libpod-conmon-db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285.scope: Deactivated successfully.
Nov 28 09:53:06 np0005538513.localdomain podman[294789]: 2025-11-28 09:53:06.124497188 +0000 UTC m=+0.200200942 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:53:06 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:53:06 np0005538513.localdomain sudo[294738]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:06 np0005538513.localdomain sudo[294832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:06 np0005538513.localdomain sudo[294832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:06 np0005538513.localdomain sudo[294832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:06 np0005538513.localdomain sudo[294850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:06 np0005538513.localdomain sudo[294850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-68ef4e00ce326bb2ea1f7906d2df4a8e9fc1c97e404c1c9736f5f7a0bf556f1b-merged.mount: Deactivated successfully.
Nov 28 09:53:06 np0005538513.localdomain podman[294884]: 
Nov 28 09:53:06 np0005538513.localdomain podman[294884]: 2025-11-28 09:53:06.927534984 +0000 UTC m=+0.089061260 container create 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, vcs-type=git, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, release=553, com.redhat.component=rhceph-container, version=7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True)
Nov 28 09:53:06 np0005538513.localdomain systemd[1]: Started libpod-conmon-48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd.scope.
Nov 28 09:53:06 np0005538513.localdomain podman[294884]: 2025-11-28 09:53:06.892460087 +0000 UTC m=+0.053986383 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:07 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:07 np0005538513.localdomain podman[294884]: 2025-11-28 09:53:07.027462149 +0000 UTC m=+0.188988425 container init 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_CLEAN=True, release=553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:53:07 np0005538513.localdomain podman[294884]: 2025-11-28 09:53:07.038174941 +0000 UTC m=+0.199701217 container start 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=553, ceph=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12)
Nov 28 09:53:07 np0005538513.localdomain podman[294884]: 2025-11-28 09:53:07.038546833 +0000 UTC m=+0.200073169 container attach 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_CLEAN=True, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, distribution-scope=public, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:53:07 np0005538513.localdomain blissful_clarke[294899]: 167 167
Nov 28 09:53:07 np0005538513.localdomain systemd[1]: libpod-48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd.scope: Deactivated successfully.
Nov 28 09:53:07 np0005538513.localdomain podman[294884]: 2025-11-28 09:53:07.044172877 +0000 UTC m=+0.205699173 container died 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_BRANCH=main)
Nov 28 09:53:07 np0005538513.localdomain podman[294904]: 2025-11-28 09:53:07.153946527 +0000 UTC m=+0.091245798 container remove 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:53:07 np0005538513.localdomain systemd[1]: libpod-conmon-48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd.scope: Deactivated successfully.
Nov 28 09:53:07 np0005538513.localdomain ceph-mon[292954]: from='client.44134 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:07 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:53:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:53:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:07 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:53:07 np0005538513.localdomain sudo[294850]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:07 np0005538513.localdomain sudo[294928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:07 np0005538513.localdomain sudo[294928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:07 np0005538513.localdomain sudo[294928]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:07 np0005538513.localdomain sudo[294946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:07 np0005538513.localdomain sudo[294946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:07 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5034f1fed43bbc6712c686731f1f23a696002f82c4d98fba781293102e0388ff-merged.mount: Deactivated successfully.
Nov 28 09:53:08 np0005538513.localdomain podman[294981]: 
Nov 28 09:53:08 np0005538513.localdomain podman[294981]: 2025-11-28 09:53:08.093972726 +0000 UTC m=+0.091467694 container create f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True)
Nov 28 09:53:08 np0005538513.localdomain systemd[1]: Started libpod-conmon-f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb.scope.
Nov 28 09:53:08 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:08.150 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:08 np0005538513.localdomain podman[294981]: 2025-11-28 09:53:08.053302056 +0000 UTC m=+0.050797094 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:08 np0005538513.localdomain podman[294981]: 2025-11-28 09:53:08.157005789 +0000 UTC m=+0.154500787 container init f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:53:08 np0005538513.localdomain podman[294981]: 2025-11-28 09:53:08.167131772 +0000 UTC m=+0.164626800 container start f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=553, version=7, io.buildah.version=1.33.12)
Nov 28 09:53:08 np0005538513.localdomain podman[294981]: 2025-11-28 09:53:08.167496474 +0000 UTC m=+0.164991512 container attach f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, vcs-type=git, version=7, distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Nov 28 09:53:08 np0005538513.localdomain confident_edison[294996]: 167 167
Nov 28 09:53:08 np0005538513.localdomain systemd[1]: libpod-f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb.scope: Deactivated successfully.
Nov 28 09:53:08 np0005538513.localdomain podman[294981]: 2025-11-28 09:53:08.171420245 +0000 UTC m=+0.168915233 container died f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, io.buildah.version=1.33.12, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=)
Nov 28 09:53:08 np0005538513.localdomain podman[295001]: 2025-11-28 09:53:08.269120892 +0000 UTC m=+0.087589604 container remove f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc.)
Nov 28 09:53:08 np0005538513.localdomain systemd[1]: libpod-conmon-f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb.scope: Deactivated successfully.
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:53:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538513.localdomain sudo[294946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:08 np0005538513.localdomain sudo[295025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:08 np0005538513.localdomain sudo[295025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:08 np0005538513.localdomain sudo[295025]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:08 np0005538513.localdomain sudo[295043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:08 np0005538513.localdomain sudo[295043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:08 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d4182eb10206f7f6ec0d226e6b68a747a63dd49045a7946fb9aeffeaa9affe85-merged.mount: Deactivated successfully.
Nov 28 09:53:09 np0005538513.localdomain podman[295077]: 
Nov 28 09:53:09 np0005538513.localdomain podman[295077]: 2025-11-28 09:53:09.24915852 +0000 UTC m=+0.084162338 container create c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: Started libpod-conmon-c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af.scope.
Nov 28 09:53:09 np0005538513.localdomain podman[295077]: 2025-11-28 09:53:09.21334576 +0000 UTC m=+0.048349628 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:09 np0005538513.localdomain podman[295077]: 2025-11-28 09:53:09.328884499 +0000 UTC m=+0.163888327 container init c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, ceph=True, com.redhat.component=rhceph-container)
Nov 28 09:53:09 np0005538513.localdomain podman[295077]: 2025-11-28 09:53:09.340060075 +0000 UTC m=+0.175063943 container start c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, name=rhceph)
Nov 28 09:53:09 np0005538513.localdomain podman[295077]: 2025-11-28 09:53:09.341186051 +0000 UTC m=+0.176189929 container attach c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:53:09 np0005538513.localdomain distracted_clarke[295092]: 167 167
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: libpod-c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af.scope: Deactivated successfully.
Nov 28 09:53:09 np0005538513.localdomain podman[295077]: 2025-11-28 09:53:09.344589216 +0000 UTC m=+0.179593124 container died c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, version=7, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph)
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054721 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='client.44144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:53:09 np0005538513.localdomain podman[295097]: 2025-11-28 09:53:09.446691648 +0000 UTC m=+0.093759034 container remove c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, release=553, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: libpod-conmon-c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af.scope: Deactivated successfully.
Nov 28 09:53:09 np0005538513.localdomain sudo[295043]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:09 np0005538513.localdomain podman[295112]: 2025-11-28 09:53:09.563640311 +0000 UTC m=+0.103874898 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:53:09 np0005538513.localdomain podman[295112]: 2025-11-28 09:53:09.598505241 +0000 UTC m=+0.138739828 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:53:09 np0005538513.localdomain sudo[295138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:09 np0005538513.localdomain sudo[295138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:09 np0005538513.localdomain sudo[295138]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:09 np0005538513.localdomain podman[295132]: 2025-11-28 09:53:09.696522967 +0000 UTC m=+0.095720356 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:53:09 np0005538513.localdomain podman[295132]: 2025-11-28 09:53:09.770799419 +0000 UTC m=+0.169996798 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:53:09 np0005538513.localdomain sudo[295169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:09 np0005538513.localdomain sudo[295169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:53:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea726546635a277b6fee427c926bf381607320f22e8d5a1b8aa56e1cdc6a33d-merged.mount: Deactivated successfully.
Nov 28 09:53:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:09.929 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:53:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:53:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:53:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:53:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:53:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18716 "" "Go-http-client/1.1"
Nov 28 09:53:10 np0005538513.localdomain podman[295211]: 
Nov 28 09:53:10 np0005538513.localdomain podman[295211]: 2025-11-28 09:53:10.274417869 +0000 UTC m=+0.093618891 container create fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, distribution-scope=public, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12)
Nov 28 09:53:10 np0005538513.localdomain systemd[1]: Started libpod-conmon-fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6.scope.
Nov 28 09:53:10 np0005538513.localdomain podman[295211]: 2025-11-28 09:53:10.235906646 +0000 UTC m=+0.055107698 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:10 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:10 np0005538513.localdomain podman[295211]: 2025-11-28 09:53:10.362003742 +0000 UTC m=+0.181204764 container init fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, name=rhceph, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:53:10 np0005538513.localdomain podman[295211]: 2025-11-28 09:53:10.371256298 +0000 UTC m=+0.190457350 container start fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git)
Nov 28 09:53:10 np0005538513.localdomain podman[295211]: 2025-11-28 09:53:10.371484175 +0000 UTC m=+0.190685197 container attach fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, ceph=True, CEPH_POINT_RELEASE=)
Nov 28 09:53:10 np0005538513.localdomain sweet_ride[295228]: 167 167
Nov 28 09:53:10 np0005538513.localdomain systemd[1]: libpod-fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6.scope: Deactivated successfully.
Nov 28 09:53:10 np0005538513.localdomain podman[295211]: 2025-11-28 09:53:10.378242105 +0000 UTC m=+0.197443157 container died fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=553, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True)
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:10 np0005538513.localdomain podman[295233]: 2025-11-28 09:53:10.482715621 +0000 UTC m=+0.092620820 container remove fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:10 np0005538513.localdomain systemd[1]: libpod-conmon-fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6.scope: Deactivated successfully.
Nov 28 09:53:10 np0005538513.localdomain sudo[295169]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:10 np0005538513.localdomain sudo[295249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:10 np0005538513.localdomain sudo[295249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:10 np0005538513.localdomain sudo[295249]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:10 np0005538513.localdomain sudo[295267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:10 np0005538513.localdomain sudo[295267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:10 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3d0caea60d6719bb0da89d6e5067c0e3539c804930c77fcb1727715237e8de84-merged.mount: Deactivated successfully.
Nov 28 09:53:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:53:11 np0005538513.localdomain podman[295300]: 2025-11-28 09:53:11.324047552 +0000 UTC m=+0.105841259 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute)
Nov 28 09:53:11 np0005538513.localdomain podman[295308]: 
Nov 28 09:53:11 np0005538513.localdomain podman[295308]: 2025-11-28 09:53:11.339296105 +0000 UTC m=+0.095036085 container create 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, build-date=2025-09-24T08:57:55, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12)
Nov 28 09:53:11 np0005538513.localdomain podman[295300]: 2025-11-28 09:53:11.364064603 +0000 UTC m=+0.145858290 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:53:11 np0005538513.localdomain systemd[1]: Started libpod-conmon-256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0.scope.
Nov 28 09:53:11 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:53:11 np0005538513.localdomain podman[295308]: 2025-11-28 09:53:11.29879709 +0000 UTC m=+0.054537120 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:11 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:11 np0005538513.localdomain podman[295308]: 2025-11-28 09:53:11.451637135 +0000 UTC m=+0.207377115 container init 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, GIT_CLEAN=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, version=7, com.redhat.component=rhceph-container, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64)
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: from='client.26905 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:11 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/430380774' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:53:11 np0005538513.localdomain systemd[1]: libpod-256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0.scope: Deactivated successfully.
Nov 28 09:53:11 np0005538513.localdomain hopeful_lewin[295337]: 167 167
Nov 28 09:53:11 np0005538513.localdomain podman[295308]: 2025-11-28 09:53:11.483316156 +0000 UTC m=+0.239056146 container start 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7)
Nov 28 09:53:11 np0005538513.localdomain podman[295308]: 2025-11-28 09:53:11.483963286 +0000 UTC m=+0.239703286 container attach 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, distribution-scope=public, vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:11 np0005538513.localdomain podman[295308]: 2025-11-28 09:53:11.490454888 +0000 UTC m=+0.246194898 container died 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git)
Nov 28 09:53:11 np0005538513.localdomain podman[295342]: 2025-11-28 09:53:11.583610443 +0000 UTC m=+0.098356148 container remove 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12)
Nov 28 09:53:11 np0005538513.localdomain systemd[1]: libpod-conmon-256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0.scope: Deactivated successfully.
Nov 28 09:53:11 np0005538513.localdomain sudo[295267]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f4cfdb30d543e6f8d1b36776da3435febeefc1b8f2a8f2002a4ea3d127bd1268-merged.mount: Deactivated successfully.
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:13.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3388432170' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3388432170' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.483383) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593483550, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11496, "num_deletes": 257, "total_data_size": 19523211, "memory_usage": 20414488, "flush_reason": "Manual Compaction"}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593580596, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 14818754, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11501, "table_properties": {"data_size": 14760009, "index_size": 31695, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25989, "raw_key_size": 274264, "raw_average_key_size": 26, "raw_value_size": 14583379, "raw_average_value_size": 1404, "num_data_blocks": 1216, "num_entries": 10383, "num_filter_entries": 10383, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 1764323559, "file_creation_time": 1764323593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 97302 microseconds, and 31725 cpu microseconds.
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.580693) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 14818754 bytes OK
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.580736) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.582572) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.582597) EVENT_LOG_v1 {"time_micros": 1764323593582590, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.582626) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 19446154, prev total WAL file size 19446154, number of live WAL files 2.
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.586633) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(14MB) 8(2012B)]
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593586787, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 14820766, "oldest_snapshot_seqno": -1}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10132 keys, 14815360 bytes, temperature: kUnknown
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593701307, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 14815360, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14757320, "index_size": 31635, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 269421, "raw_average_key_size": 26, "raw_value_size": 14584034, "raw_average_value_size": 1439, "num_data_blocks": 1215, "num_entries": 10132, "num_filter_entries": 10132, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.701957) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 14815360 bytes
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.703976) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.0 rd, 129.0 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(14.1, 0.0 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10388, records dropped: 256 output_compression: NoCompression
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.704010) EVENT_LOG_v1 {"time_micros": 1764323593703994, "job": 4, "event": "compaction_finished", "compaction_time_micros": 114858, "compaction_time_cpu_micros": 42264, "output_level": 6, "num_output_files": 1, "total_output_size": 14815360, "num_input_records": 10388, "num_output_records": 10132, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593707689, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593707968, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 28 09:53:13 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.586432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:53:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:14.966 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/3919583814' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 e88: 6 total, 6 up, 6 in
Nov 28 09:53:18 np0005538513.localdomain sshd[293690]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:53:18 np0005538513.localdomain systemd[1]: session-67.scope: Deactivated successfully.
Nov 28 09:53:18 np0005538513.localdomain systemd[1]: session-67.scope: Consumed 11.737s CPU time.
Nov 28 09:53:18 np0005538513.localdomain systemd-logind[764]: Session 67 logged out. Waiting for processes to exit.
Nov 28 09:53:18 np0005538513.localdomain systemd-logind[764]: Removed session 67.
Nov 28 09:53:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:53:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:53:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:53:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:53:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:53:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:18.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:18 np0005538513.localdomain sshd[295359]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:53:18 np0005538513.localdomain sshd[295359]: Accepted publickey for ceph-admin from 192.168.122.108 port 46390 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:53:18 np0005538513.localdomain systemd-logind[764]: New session 68 of user ceph-admin.
Nov 28 09:53:18 np0005538513.localdomain systemd[1]: Started Session 68 of User ceph-admin.
Nov 28 09:53:18 np0005538513.localdomain sshd[295359]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:53:18 np0005538513.localdomain sudo[295363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:18 np0005538513.localdomain sudo[295363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:18 np0005538513.localdomain sudo[295363]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:18 np0005538513.localdomain sudo[295381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:53:18 np0005538513.localdomain sudo[295381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1038640921' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: Activating manager daemon np0005538515.yfkzhl
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1038640921' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: mgrmap e25: np0005538515.yfkzhl(active, starting, since 0.0424963s), standbys: np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: Manager daemon np0005538515.yfkzhl is now available
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:53:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:53:19 np0005538513.localdomain sshd[290554]: Received disconnect from 192.168.122.11 port 50356:11: disconnected by user
Nov 28 09:53:19 np0005538513.localdomain sshd[290554]: Disconnected from user tripleo-admin 192.168.122.11 port 50356
Nov 28 09:53:19 np0005538513.localdomain sshd[290534]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 28 09:53:19 np0005538513.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Nov 28 09:53:19 np0005538513.localdomain systemd[1]: session-65.scope: Consumed 1.776s CPU time.
Nov 28 09:53:19 np0005538513.localdomain systemd-logind[764]: Session 65 logged out. Waiting for processes to exit.
Nov 28 09:53:19 np0005538513.localdomain systemd-logind[764]: Removed session 65.
Nov 28 09:53:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:19 np0005538513.localdomain podman[295472]: 2025-11-28 09:53:19.474549777 +0000 UTC m=+0.097408259 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55)
Nov 28 09:53:19 np0005538513.localdomain podman[295472]: 2025-11-28 09:53:19.609374493 +0000 UTC m=+0.232232955 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Nov 28 09:53:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:53:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:19.969 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:20 np0005538513.localdomain ceph-mon[292954]: mgrmap e26: np0005538515.yfkzhl(active, since 1.07766s), standbys: np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:20 np0005538513.localdomain ceph-mon[292954]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:20 np0005538513.localdomain ceph-mon[292954]: mgrmap e27: np0005538515.yfkzhl(active, since 1.58689s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:53:20 np0005538513.localdomain podman[295548]: 2025-11-28 09:53:20.027608628 +0000 UTC m=+0.117193542 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 09:53:20 np0005538513.localdomain podman[295548]: 2025-11-28 09:53:20.067106451 +0000 UTC m=+0.156691385 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:53:20 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:53:20 np0005538513.localdomain podman[295576]: 2025-11-28 09:53:20.119880816 +0000 UTC m=+0.091126113 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:53:20 np0005538513.localdomain podman[295576]: 2025-11-28 09:53:20.161464305 +0000 UTC m=+0.132709582 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:53:20 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:53:20 np0005538513.localdomain sudo[295381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:20 np0005538513.localdomain sudo[295637]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:20 np0005538513.localdomain sudo[295637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:20 np0005538513.localdomain sudo[295637]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:20 np0005538513.localdomain sudo[295655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:53:20 np0005538513.localdomain sudo[295655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538513.localdomain sudo[295655]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Bus STARTING
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Client ('172.18.0.108', 34804) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Bus STARTED
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: Cluster is now healthy
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538513.localdomain sudo[295705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:21 np0005538513.localdomain sudo[295705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538513.localdomain sudo[295705]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538513.localdomain sudo[295723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:53:21 np0005538513.localdomain sudo[295723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538513.localdomain sudo[295723]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538513.localdomain sudo[295760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:53:21 np0005538513.localdomain sudo[295760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538513.localdomain sudo[295760]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538513.localdomain sudo[295778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:53:21 np0005538513.localdomain sudo[295778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538513.localdomain sudo[295778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[295796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295796]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:22 np0005538513.localdomain sudo[295814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295814]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[295832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[295866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295866]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: mgrmap e28: np0005538515.yfkzhl(active, since 3s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:22 np0005538513.localdomain sudo[295884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[295884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295884]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:53:22 np0005538513.localdomain sudo[295902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295902]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:22 np0005538513.localdomain sudo[295920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295920]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:22 np0005538513.localdomain sudo[295938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295938]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[295956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295956]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:22 np0005538513.localdomain sudo[295974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295974]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[295992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[295992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[295992]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[296026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[296026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[296026]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538513.localdomain sudo[296044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:22 np0005538513.localdomain sudo[296044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538513.localdomain sudo[296044]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain sudo[296062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296062]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:53:23 np0005538513.localdomain sudo[296080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296080]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:53:23 np0005538513.localdomain sudo[296098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296098]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:23.209 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:23 np0005538513.localdomain sudo[296116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538513.localdomain sudo[296116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296116]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:23 np0005538513.localdomain sudo[296134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296134]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538513.localdomain ceph-mon[292954]: Standby manager daemon np0005538514.djozup started
Nov 28 09:53:23 np0005538513.localdomain sudo[296152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538513.localdomain sudo[296152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296152]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538513.localdomain sudo[296186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296186]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538513.localdomain sudo[296204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296204]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538513.localdomain sudo[296222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296222]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:23 np0005538513.localdomain sudo[296240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296240]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:23 np0005538513.localdomain sudo[296258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296258]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538513.localdomain sudo[296276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538513.localdomain sudo[296276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538513.localdomain sudo[296276]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538513.localdomain sudo[296294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:24 np0005538513.localdomain sudo[296294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538513.localdomain sudo[296294]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538513.localdomain sudo[296312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:24 np0005538513.localdomain sudo[296312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538513.localdomain sudo[296312]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538513.localdomain sudo[296346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:24 np0005538513.localdomain sudo[296346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538513.localdomain sudo[296346]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538513.localdomain sudo[296364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:24 np0005538513.localdomain sudo[296364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538513.localdomain sudo[296364]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: mgrmap e29: np0005538515.yfkzhl(active, since 5s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538513.localdomain sudo[296382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538513.localdomain sudo[296382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538513.localdomain sudo[296382]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.522738) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604522868, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 973, "num_deletes": 276, "total_data_size": 5014026, "memory_usage": 5158592, "flush_reason": "Manual Compaction"}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604548296, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3201627, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11506, "largest_seqno": 12474, "table_properties": {"data_size": 3196696, "index_size": 2334, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11990, "raw_average_key_size": 20, "raw_value_size": 3186115, "raw_average_value_size": 5372, "num_data_blocks": 96, "num_entries": 593, "num_filter_entries": 593, "num_deletions": 275, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323593, "oldest_key_time": 1764323593, "file_creation_time": 1764323604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 25626 microseconds, and 8212 cpu microseconds.
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.548367) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3201627 bytes OK
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.548404) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550912) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550944) EVENT_LOG_v1 {"time_micros": 1764323604550935, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550972) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5008595, prev total WAL file size 5008595, number of live WAL files 2.
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.552551) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303231' seq:72057594037927935, type:22 .. '6B760031323936' seq:0, type:0; will stop at (end)
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3126KB)], [15(14MB)]
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604552639, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 18016987, "oldest_snapshot_seqno": -1}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10142 keys, 16854943 bytes, temperature: kUnknown
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604669911, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16854943, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16797320, "index_size": 31154, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25413, "raw_key_size": 271820, "raw_average_key_size": 26, "raw_value_size": 16624127, "raw_average_value_size": 1639, "num_data_blocks": 1176, "num_entries": 10142, "num_filter_entries": 10142, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.670482) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16854943 bytes
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.673425) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.2 rd, 143.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 14.1 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(10.9) write-amplify(5.3) OK, records in: 10725, records dropped: 583 output_compression: NoCompression
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.673464) EVENT_LOG_v1 {"time_micros": 1764323604673448, "job": 6, "event": "compaction_finished", "compaction_time_micros": 117591, "compaction_time_cpu_micros": 46771, "output_level": 6, "num_output_files": 1, "total_output_size": 16854943, "num_input_records": 10725, "num_output_records": 10142, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604674326, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604677170, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.552416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677303) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677312) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538513.localdomain sudo[296400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:24 np0005538513.localdomain sudo[296400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538513.localdomain sudo[296400]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:25.002 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:26 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:53:26 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:53:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:27 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:27 np0005538513.localdomain ceph-mon[292954]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:53:27 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:53:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:28.210 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Activating special unit Exit the Session...
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped target Main User Target.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped target Basic System.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped target Paths.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped target Sockets.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped target Timers.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Closed D-Bus User Message Bus Socket.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Removed slice User Application Slice.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Reached target Shutdown.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Finished Exit the Session.
Nov 28 09:53:29 np0005538513.localdomain systemd[290538]: Reached target Exit the Session.
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 28 09:53:29 np0005538513.localdomain systemd[1]: user-1003.slice: Consumed 2.408s CPU time.
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:29 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:30.004 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:30 np0005538513.localdomain sudo[296419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:30 np0005538513.localdomain sudo[296419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:30 np0005538513.localdomain sudo[296419]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:30.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:30.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: from='client.26686 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:53:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:31.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:31.769 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:31 np0005538513.localdomain systemd[1]: tmp-crun.TyCxMF.mount: Deactivated successfully.
Nov 28 09:53:31 np0005538513.localdomain podman[296437]: 2025-11-28 09:53:31.860039345 +0000 UTC m=+0.095045075 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:53:31 np0005538513.localdomain podman[296437]: 2025-11-28 09:53:31.876533126 +0000 UTC m=+0.111538816 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 09:53:31 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:53:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:32.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:33 np0005538513.localdomain ceph-mon[292954]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:33 np0005538513.localdomain ceph-mon[292954]: from='client.44214 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538511", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:33.247 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:33.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:33 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Nov 28 09:53:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@4(peon) e10  my rank is now 3 (was 4)
Nov 28 09:53:33 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:53:33 np0005538513.localdomain ceph-mon[292954]: paxos.3).electionLogic(38) init, last seen epoch 38
Nov 28 09:53:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:53:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:53:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id
Nov 28 09:53:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id
Nov 28 09:53:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id
Nov 28 09:53:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id
Nov 28 09:53:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:35.038 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id
Nov 28 09:53:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:35.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:53:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:35.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:53:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:35.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:53:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:35.794 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:53:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:35.795 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/479351308' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: Remove daemons mon.np0005538511
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: Safe to remove mon.np0005538511: new quorum should be ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'])
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: Removing monitor np0005538511 from monmap...
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: Removing daemon mon.np0005538511 from np0005538511.localdomain -- ports []
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3)
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: monmap epoch 10
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:53:33.884066+0000
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: mgrmap e29: np0005538515.yfkzhl(active, since 17s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:53:35 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:53:36 np0005538513.localdomain sudo[296477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:53:36 np0005538513.localdomain sudo[296477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296477]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain sudo[296495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:53:36 np0005538513.localdomain sudo[296495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296495]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain sudo[296513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538513.localdomain sudo[296513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:53:36 np0005538513.localdomain sudo[296513]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain sudo[296537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:36 np0005538513.localdomain sudo[296537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296537]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain podman[296531]: 2025-11-28 09:53:36.279385309 +0000 UTC m=+0.094519441 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.307 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:53:36 np0005538513.localdomain podman[296531]: 2025-11-28 09:53:36.316565203 +0000 UTC m=+0.131699325 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:53:36 np0005538513.localdomain sudo[296564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538513.localdomain sudo[296564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296564]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.384 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.385 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:53:36 np0005538513.localdomain sudo[296609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538513.localdomain sudo[296609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296609]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain sudo[296627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538513.localdomain sudo[296627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296627]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:53:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2640763287' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.634 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.636 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11810MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.636 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.636 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:53:36 np0005538513.localdomain sudo[296645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:53:36 np0005538513.localdomain sudo[296645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296645]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.746 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.746 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.747 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:53:36 np0005538513.localdomain sudo[296663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:36 np0005538513.localdomain sudo[296663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296663]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:36.818 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:53:36 np0005538513.localdomain sudo[296681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:36 np0005538513.localdomain sudo[296681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296681]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538513.localdomain sudo[296700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:36 np0005538513.localdomain sudo[296700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538513.localdomain sudo[296700]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538513.localdomain sudo[296720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:37 np0005538513.localdomain sudo[296720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538513.localdomain sudo[296720]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2870239069' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2640763287' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:37 np0005538513.localdomain sudo[296755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:37 np0005538513.localdomain sudo[296755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538513.localdomain sudo[296755]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538513.localdomain sudo[296789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:37 np0005538513.localdomain sudo[296789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538513.localdomain sudo[296789]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:53:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/953514327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:37.268 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:53:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:37.276 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:53:37 np0005538513.localdomain sudo[296807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:37 np0005538513.localdomain sudo[296807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538513.localdomain sudo[296807]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:37.291 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:53:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:37.294 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:53:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:37.295 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:53:37 np0005538513.localdomain sudo[296827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:37 np0005538513.localdomain sudo[296827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538513.localdomain sudo[296827]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538513.localdomain sudo[296845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:37 np0005538513.localdomain sudo[296845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538513.localdomain sudo[296845]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:38.250 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='client.26988 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/953514327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: Removed label mon from host np0005538511.localdomain
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3573920266' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3363422915' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:38.291 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:38.318 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:38.318 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:53:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:38.318 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:53:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:39.018 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:53:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:39.018 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:53:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:39.019 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:53:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:39.019 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:53:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:39.417 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:53:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:39.443 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:53:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:39.443 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:53:39 np0005538513.localdomain systemd[1]: tmp-crun.cMPeBd.mount: Deactivated successfully.
Nov 28 09:53:39 np0005538513.localdomain podman[296863]: 2025-11-28 09:53:39.851287086 +0000 UTC m=+0.085202584 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 09:53:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:53:39 np0005538513.localdomain podman[296863]: 2025-11-28 09:53:39.893544496 +0000 UTC m=+0.127459954 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:53:39 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:53:39 np0005538513.localdomain podman[296881]: 2025-11-28 09:53:39.954791172 +0000 UTC m=+0.082500741 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Nov 28 09:53:39 np0005538513.localdomain podman[296881]: 2025-11-28 09:53:39.997861777 +0000 UTC m=+0.125571356 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:53:40 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:53:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:40.040 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:53:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:53:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:53:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:53:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:53:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18720 "" "Go-http-client/1.1"
Nov 28 09:53:40 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:53:40 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:53:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:53:41 np0005538513.localdomain podman[296906]: 2025-11-28 09:53:41.81727773 +0000 UTC m=+0.061221555 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:53:41 np0005538513.localdomain podman[296906]: 2025-11-28 09:53:41.828632109 +0000 UTC m=+0.072575924 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:53:41 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:53:42 np0005538513.localdomain sudo[296925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:42 np0005538513.localdomain sudo[296925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:42 np0005538513.localdomain sudo[296925]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: from='client.27004 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: Removed label mgr from host np0005538511.localdomain
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:53:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:42 np0005538513.localdomain sudo[296943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:42 np0005538513.localdomain sudo[296943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:43 np0005538513.localdomain podman[296978]: 
Nov 28 09:53:43 np0005538513.localdomain podman[296978]: 2025-11-28 09:53:43.090064578 +0000 UTC m=+0.067184659 container create d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, version=7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True)
Nov 28 09:53:43 np0005538513.localdomain systemd[1]: Started libpod-conmon-d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c.scope.
Nov 28 09:53:43 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:43 np0005538513.localdomain podman[296978]: 2025-11-28 09:53:43.064827031 +0000 UTC m=+0.041947122 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:43 np0005538513.localdomain podman[296978]: 2025-11-28 09:53:43.178485299 +0000 UTC m=+0.155605380 container init d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, name=rhceph, version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 09:53:43 np0005538513.localdomain podman[296978]: 2025-11-28 09:53:43.190920692 +0000 UTC m=+0.168040773 container start d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, build-date=2025-09-24T08:57:55, release=553, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True)
Nov 28 09:53:43 np0005538513.localdomain podman[296978]: 2025-11-28 09:53:43.191197651 +0000 UTC m=+0.168317772 container attach d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, io.openshift.tags=rhceph ceph, version=7, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 09:53:43 np0005538513.localdomain condescending_curie[296993]: 167 167
Nov 28 09:53:43 np0005538513.localdomain systemd[1]: libpod-d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c.scope: Deactivated successfully.
Nov 28 09:53:43 np0005538513.localdomain podman[296978]: 2025-11-28 09:53:43.195497202 +0000 UTC m=+0.172617313 container died d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, ceph=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, build-date=2025-09-24T08:57:55)
Nov 28 09:53:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:43.285 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:43 np0005538513.localdomain podman[296998]: 2025-11-28 09:53:43.327457464 +0000 UTC m=+0.120873111 container remove d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, build-date=2025-09-24T08:57:55, version=7, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=)
Nov 28 09:53:43 np0005538513.localdomain systemd[1]: libpod-conmon-d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c.scope: Deactivated successfully.
Nov 28 09:53:43 np0005538513.localdomain sudo[296943]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:43 np0005538513.localdomain sudo[297015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:43 np0005538513.localdomain sudo[297015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:43 np0005538513.localdomain sudo[297015]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:43 np0005538513.localdomain sudo[297033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:43 np0005538513.localdomain sudo[297033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: from='client.44263 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: Removed label _admin from host np0005538511.localdomain
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:43 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:53:44 np0005538513.localdomain podman[297067]: 
Nov 28 09:53:44 np0005538513.localdomain podman[297067]: 2025-11-28 09:53:44.038078268 +0000 UTC m=+0.079610341 container create ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, RELEASE=main, version=7, distribution-scope=public, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:53:44 np0005538513.localdomain systemd[1]: Started libpod-conmon-ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a.scope.
Nov 28 09:53:44 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-563384d2cbbb14240e8678f8ea08988afbfcedf1cbab013f9542294f82232c37-merged.mount: Deactivated successfully.
Nov 28 09:53:44 np0005538513.localdomain podman[297067]: 2025-11-28 09:53:44.097543458 +0000 UTC m=+0.139075531 container init ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, version=7, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64)
Nov 28 09:53:44 np0005538513.localdomain podman[297067]: 2025-11-28 09:53:44.006043142 +0000 UTC m=+0.047575255 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:44 np0005538513.localdomain podman[297067]: 2025-11-28 09:53:44.111533829 +0000 UTC m=+0.153065902 container start ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph)
Nov 28 09:53:44 np0005538513.localdomain podman[297067]: 2025-11-28 09:53:44.111804207 +0000 UTC m=+0.153336330 container attach ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-type=git, RELEASE=main)
Nov 28 09:53:44 np0005538513.localdomain affectionate_kare[297082]: 167 167
Nov 28 09:53:44 np0005538513.localdomain systemd[1]: libpod-ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a.scope: Deactivated successfully.
Nov 28 09:53:44 np0005538513.localdomain podman[297067]: 2025-11-28 09:53:44.114627654 +0000 UTC m=+0.156159767 container died ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:53:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2fcb21b62f658603a8af6e6a4cdac3e228de3cab9d5d300153769dc45ec13c68-merged.mount: Deactivated successfully.
Nov 28 09:53:44 np0005538513.localdomain podman[297087]: 2025-11-28 09:53:44.207287017 +0000 UTC m=+0.082425679 container remove ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, distribution-scope=public, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:44 np0005538513.localdomain systemd[1]: libpod-conmon-ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a.scope: Deactivated successfully.
Nov 28 09:53:44 np0005538513.localdomain sudo[297033]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:44 np0005538513.localdomain sudo[297111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:44 np0005538513.localdomain sudo[297111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:44 np0005538513.localdomain sudo[297111]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.522258) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624522345, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1080, "num_deletes": 258, "total_data_size": 1715108, "memory_usage": 1736928, "flush_reason": "Manual Compaction"}
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624531662, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 995869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12479, "largest_seqno": 13554, "table_properties": {"data_size": 990623, "index_size": 2589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13555, "raw_average_key_size": 21, "raw_value_size": 979270, "raw_average_value_size": 1559, "num_data_blocks": 108, "num_entries": 628, "num_filter_entries": 628, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323604, "oldest_key_time": 1764323604, "file_creation_time": 1764323624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 9444 microseconds, and 3944 cpu microseconds.
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.531710) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 995869 bytes OK
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.531736) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.534509) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.534531) EVENT_LOG_v1 {"time_micros": 1764323624534525, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.534553) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1709330, prev total WAL file size 1709654, number of live WAL files 2.
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.535174) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353138' seq:72057594037927935, type:22 .. '6C6F676D0033373731' seq:0, type:0; will stop at (end)
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(972KB)], [18(16MB)]
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624535219, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17850812, "oldest_snapshot_seqno": -1}
Nov 28 09:53:44 np0005538513.localdomain sudo[297129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:44 np0005538513.localdomain sudo[297129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10221 keys, 17704717 bytes, temperature: kUnknown
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624643761, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17704717, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17644988, "index_size": 33068, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 275182, "raw_average_key_size": 26, "raw_value_size": 17468822, "raw_average_value_size": 1709, "num_data_blocks": 1256, "num_entries": 10221, "num_filter_entries": 10221, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.644132) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17704717 bytes
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.658237) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.3 rd, 163.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 16.1 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(35.7) write-amplify(17.8) OK, records in: 10770, records dropped: 549 output_compression: NoCompression
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.658295) EVENT_LOG_v1 {"time_micros": 1764323624658273, "job": 8, "event": "compaction_finished", "compaction_time_micros": 108638, "compaction_time_cpu_micros": 43984, "output_level": 6, "num_output_files": 1, "total_output_size": 17704717, "num_input_records": 10770, "num_output_records": 10221, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624658716, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624661522, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.535100) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661589) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661592) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:45 np0005538513.localdomain podman[297163]: 
Nov 28 09:53:45 np0005538513.localdomain podman[297163]: 2025-11-28 09:53:45.039955607 +0000 UTC m=+0.074262837 container create 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Nov 28 09:53:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:45.082 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:45 np0005538513.localdomain systemd[1]: Started libpod-conmon-37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632.scope.
Nov 28 09:53:45 np0005538513.localdomain podman[297163]: 2025-11-28 09:53:45.009447838 +0000 UTC m=+0.043755128 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:45 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:45 np0005538513.localdomain podman[297163]: 2025-11-28 09:53:45.125588262 +0000 UTC m=+0.159895492 container init 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, name=rhceph, architecture=x86_64, release=553, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True)
Nov 28 09:53:45 np0005538513.localdomain reverent_herschel[297178]: 167 167
Nov 28 09:53:45 np0005538513.localdomain podman[297163]: 2025-11-28 09:53:45.135752676 +0000 UTC m=+0.170059886 container start 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Nov 28 09:53:45 np0005538513.localdomain systemd[1]: libpod-37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632.scope: Deactivated successfully.
Nov 28 09:53:45 np0005538513.localdomain podman[297163]: 2025-11-28 09:53:45.136474678 +0000 UTC m=+0.170781948 container attach 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=553, architecture=x86_64)
Nov 28 09:53:45 np0005538513.localdomain podman[297163]: 2025-11-28 09:53:45.13915773 +0000 UTC m=+0.173464960 container died 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=)
Nov 28 09:53:45 np0005538513.localdomain podman[297184]: 2025-11-28 09:53:45.232614078 +0000 UTC m=+0.082814431 container remove 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:53:45 np0005538513.localdomain systemd[1]: libpod-conmon-37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632.scope: Deactivated successfully.
Nov 28 09:53:45 np0005538513.localdomain sudo[297129]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:45 np0005538513.localdomain ceph-mon[292954]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:45 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:53:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:53:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:45 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:53:45 np0005538513.localdomain sudo[297208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:45 np0005538513.localdomain sudo[297208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:45 np0005538513.localdomain sudo[297208]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:45 np0005538513.localdomain sudo[297226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:45 np0005538513.localdomain sudo[297226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:46 np0005538513.localdomain podman[297262]: 2025-11-28 09:53:46.075389268 +0000 UTC m=+0.080487528 container create fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64)
Nov 28 09:53:46 np0005538513.localdomain systemd[1]: Started libpod-conmon-fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572.scope.
Nov 28 09:53:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0e9d2d9919f59f56c334681e7bbcd44f2d9e5beb67b9e273697c599e26f341df-merged.mount: Deactivated successfully.
Nov 28 09:53:46 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:46 np0005538513.localdomain podman[297262]: 2025-11-28 09:53:46.044285011 +0000 UTC m=+0.049383301 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:46 np0005538513.localdomain podman[297262]: 2025-11-28 09:53:46.147013743 +0000 UTC m=+0.152115883 container init fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, version=7, release=553, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Nov 28 09:53:46 np0005538513.localdomain podman[297262]: 2025-11-28 09:53:46.156102483 +0000 UTC m=+0.161200733 container start fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.33.12, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:53:46 np0005538513.localdomain nice_kowalevski[297277]: 167 167
Nov 28 09:53:46 np0005538513.localdomain podman[297262]: 2025-11-28 09:53:46.156317789 +0000 UTC m=+0.161416049 container attach fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12)
Nov 28 09:53:46 np0005538513.localdomain systemd[1]: libpod-fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572.scope: Deactivated successfully.
Nov 28 09:53:46 np0005538513.localdomain podman[297262]: 2025-11-28 09:53:46.162660645 +0000 UTC m=+0.167758945 container died fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, ceph=True, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:53:46 np0005538513.localdomain podman[297282]: 2025-11-28 09:53:46.255974397 +0000 UTC m=+0.086506354 container remove fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, distribution-scope=public, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.expose-services=, version=7, RELEASE=main)
Nov 28 09:53:46 np0005538513.localdomain systemd[1]: libpod-conmon-fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572.scope: Deactivated successfully.
Nov 28 09:53:46 np0005538513.localdomain sudo[297226]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:46 np0005538513.localdomain sudo[297298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:46 np0005538513.localdomain sudo[297298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:46 np0005538513.localdomain sudo[297298]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:46 np0005538513.localdomain sudo[297316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:46 np0005538513.localdomain sudo[297316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:46 np0005538513.localdomain podman[297351]: 2025-11-28 09:53:46.990772505 +0000 UTC m=+0.078939281 container create 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, release=553, GIT_BRANCH=main, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_CLEAN=True, name=rhceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: Started libpod-conmon-9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53.scope.
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:47 np0005538513.localdomain podman[297351]: 2025-11-28 09:53:47.051853925 +0000 UTC m=+0.140020691 container init 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, name=rhceph)
Nov 28 09:53:47 np0005538513.localdomain podman[297351]: 2025-11-28 09:53:46.957325005 +0000 UTC m=+0.045491801 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:47 np0005538513.localdomain podman[297351]: 2025-11-28 09:53:47.061420569 +0000 UTC m=+0.149587325 container start 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, release=553, version=7, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True)
Nov 28 09:53:47 np0005538513.localdomain podman[297351]: 2025-11-28 09:53:47.061744949 +0000 UTC m=+0.149911745 container attach 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:53:47 np0005538513.localdomain modest_shirley[297366]: 167 167
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: libpod-9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53.scope: Deactivated successfully.
Nov 28 09:53:47 np0005538513.localdomain podman[297351]: 2025-11-28 09:53:47.063461902 +0000 UTC m=+0.151628678 container died 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, version=7, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ec8e9de951f0194103ce90d35f4b8d9b04aab629902595c7e7cb42fab324f707-merged.mount: Deactivated successfully.
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4fa418497f1a463fa17622249fe7f5c576572657dcbbf7a30f8bb09bc5099ce8-merged.mount: Deactivated successfully.
Nov 28 09:53:47 np0005538513.localdomain podman[297371]: 2025-11-28 09:53:47.163057198 +0000 UTC m=+0.086949797 container remove 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, name=rhceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12)
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: libpod-conmon-9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53.scope: Deactivated successfully.
Nov 28 09:53:47 np0005538513.localdomain sudo[297316]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:47 np0005538513.localdomain sudo[297390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:47 np0005538513.localdomain sudo[297390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:47 np0005538513.localdomain sudo[297390]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:47 np0005538513.localdomain sudo[297408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:47 np0005538513.localdomain sudo[297408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:47 np0005538513.localdomain podman[297444]: 2025-11-28 09:53:47.874694433 +0000 UTC m=+0.071397809 container create c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, release=553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55)
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: Started libpod-conmon-c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813.scope.
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:47 np0005538513.localdomain podman[297444]: 2025-11-28 09:53:47.94249297 +0000 UTC m=+0.139196306 container init c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, distribution-scope=public, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.33.12, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Nov 28 09:53:47 np0005538513.localdomain podman[297444]: 2025-11-28 09:53:47.848867588 +0000 UTC m=+0.045570924 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:47 np0005538513.localdomain podman[297444]: 2025-11-28 09:53:47.951961011 +0000 UTC m=+0.148664337 container start c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Nov 28 09:53:47 np0005538513.localdomain podman[297444]: 2025-11-28 09:53:47.952315282 +0000 UTC m=+0.149018648 container attach c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:47 np0005538513.localdomain gifted_euclid[297459]: 167 167
Nov 28 09:53:47 np0005538513.localdomain systemd[1]: libpod-c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813.scope: Deactivated successfully.
Nov 28 09:53:47 np0005538513.localdomain podman[297444]: 2025-11-28 09:53:47.954212311 +0000 UTC m=+0.150915707 container died c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64)
Nov 28 09:53:48 np0005538513.localdomain podman[297464]: 2025-11-28 09:53:48.054194838 +0000 UTC m=+0.087212765 container remove c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12)
Nov 28 09:53:48 np0005538513.localdomain systemd[1]: libpod-conmon-c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813.scope: Deactivated successfully.
Nov 28 09:53:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:53:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:53:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:53:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:53:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-5b6cb8489e37fd70ce4a79a2d82b39e56b5bd34a7ba4abca4f9811f53eeb6b5b-merged.mount: Deactivated successfully.
Nov 28 09:53:48 np0005538513.localdomain sudo[297408]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:48.285 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:48 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:53:48 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:53:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:53:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:50.084 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:50 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:53:50 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:53:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:53:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:53:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:53:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:53:50.832 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:53:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:53:50.833 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:53:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:53:50.834 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:53:50 np0005538513.localdomain podman[297482]: 2025-11-28 09:53:50.86928395 +0000 UTC m=+0.095552802 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:53:50 np0005538513.localdomain podman[297482]: 2025-11-28 09:53:50.916285346 +0000 UTC m=+0.142554248 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, config_id=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:53:50 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:53:50 np0005538513.localdomain podman[297481]: 2025-11-28 09:53:50.926114139 +0000 UTC m=+0.154614150 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:53:51 np0005538513.localdomain podman[297481]: 2025-11-28 09:53:51.010550928 +0000 UTC m=+0.239050979 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:53:51 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:53:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:53.313 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:55.114 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: from='client.34406 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538511.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: Added label _no_schedule to host np0005538511.localdomain
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538511.localdomain
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:53:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:56 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:53:56 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:53:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='client.44275 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538511.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"} : dispatch
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"} : dispatch
Nov 28 09:53:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"}]': finished
Nov 28 09:53:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:53:58.314 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: from='client.27016 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538511.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: Removed host np0005538511.localdomain
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:58 np0005538513.localdomain ceph-mon[292954]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:59 np0005538513.localdomain sudo[297523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:59 np0005538513.localdomain sudo[297523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:59 np0005538513.localdomain sudo[297523]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:00.118 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:00 np0005538513.localdomain ceph-mon[292954]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.672 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6def1e59-3a34-44ec-aac5-1856d93a8c0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.673833', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a385e7c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '5fa516a8c81d06b4c7f003ec72fea157e0f15a0995c73b6d76e812effc189ccd'}]}, 'timestamp': '2025-11-28 09:54:00.679900', '_unique_id': 'a81431ef2933408e9a2248a14a2a0bed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.683 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.700 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6da40f3-0776-404b-aee9-65f158de2b27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:54:00.683359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2a3b8db8-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.871748518, 'message_signature': 'f9426b9bc63574f8a02a2c01f48e9a4c2f27e3c7e42314ad2dbf2014186b333d'}]}, 'timestamp': '2025-11-28 09:54:00.700750', '_unique_id': '04a4d8c030d648868a261cffc210ad71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.713 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.714 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b33d980-2096-413e-8fff-0d6d09b806b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.703503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a3d9e5a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '24bada68cc646eea5c0dab2f7ea611608ec4ffe47a45eca90712c4f2a0439b02'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.703503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a3db048-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '1f7312a36a9209a7a0a55b4e3454a3e3db76593f291076da4d37d43345b48fab'}]}, 'timestamp': '2025-11-28 09:54:00.714686', '_unique_id': 'e8ad6f44d99e4316ad06aeafd42bcd2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.716 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.716 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 13370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d56f9cb-f81e-4d96-baeb-8a20fe1e09e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13370000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:54:00.716861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2a3e1812-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.871748518, 'message_signature': 'ea397230ae68ef1201b92a4e4f5570c2262fd52c698c1b7946afcde1b092e5dc'}]}, 'timestamp': '2025-11-28 09:54:00.717327', '_unique_id': '76ee69ea9afa4ebeb51d6c56b2b7e87b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.719 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fbf9e25-8e16-4c6e-80a0-da2125801ac9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.719405', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a3e7a64-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '77a88447ec7a172a2e7e935d908b966270600c0e87e485f3de6ea62f65597e4f'}]}, 'timestamp': '2025-11-28 09:54:00.719858', '_unique_id': '9ef658763f3746119aad286a2c137f52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2b1be69-9afb-4cf5-94cd-b4b98d9431c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.721903', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a431f9c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'c85ad98e9758bf3cd9c2a87d079c7048cb2be27b2925a6364243f86bf80e4537'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.721903', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a43305e-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '42965516291f61da68e78a22c1057456f616e40437d42de1f16e0bd1a5c59591'}]}, 'timestamp': '2025-11-28 09:54:00.750732', '_unique_id': 'cc35234855d74c3e877c847a7ec0da0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd08350dc-b26f-4454-bf61-7ed1241e92ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.753092', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a439e72-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '4d693f28718bb6a9f916e8b3f20c213816549db7fb7ff6ab92fe8aa4453db120'}]}, 'timestamp': '2025-11-28 09:54:00.753551', '_unique_id': '0c7778efa88a4558964c4e2ddd5d0271'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.755 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b5d3d5a-6477-45c9-81a4-8bf5b589b5ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.755616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a440092-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'c01503078a0d33d96273cd993101aeb71ca63e90f5a37ef8799852316cd2314a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.755616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a44119a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '6d487218e602f2e31a066247aa3ee7eccaa270cd1115c965134ca1d6d3ce94ef'}]}, 'timestamp': '2025-11-28 09:54:00.756465', '_unique_id': '04798778d83e4d4c84f9525a7661225d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.758 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '539562b4-0c9f-432f-b2bf-09a1690f846b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.758555', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a447482-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'b99883d6e1c45dfef80e25552b700e108e02773cb360fc48dad672d59f22fa08'}]}, 'timestamp': '2025-11-28 09:54:00.759054', '_unique_id': 'd14ccbd0bc694ad6b72413db6d20a4f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8f648ed-6fe3-41bc-a57a-0274d6887094', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.761078', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a44d602-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '7d77ef53e4129a9fded07c0f485f86e516f6248908b686e3c0dd6fad720dfea7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.761078', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a44e5b6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'b7090e0f5f5450f6c21fcbdf75461b1e9e18b6f90a5d4c8d585451b2f8be595f'}]}, 'timestamp': '2025-11-28 09:54:00.761894', '_unique_id': '7ed528d6103847fa9285714ad044ef35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.763 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76f621c2-7c13-4bce-a7b4-a0c8995e074a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.764141', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a454e02-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '940d261ed1d8350d02636e9564151dc2ffb5d6477eaa01d4ca8a668cdee9e51f'}]}, 'timestamp': '2025-11-28 09:54:00.764597', '_unique_id': '6bf611ae31f34232b9433dc0d08c4c46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7f1737a-dd16-450f-b207-567a25aca1cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.766600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a45ae92-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '0e659e44b1cc470b06bf58d0f5e3425c75ddc97e6146804d7a6714946aba9d23'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.766600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a45bfd6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'd9e8c21766815e45a6ef9ac287d1a7e6e00003fd3c0de0614a4fa6ac236473de'}]}, 'timestamp': '2025-11-28 09:54:00.767480', '_unique_id': 'ef156c5bad714e5d9f759af4c06bbf07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.769 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44263ae9-fcb6-4e46-b0e7-b1bdd72265c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.769530', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a46202a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'aa4ed5f673b90b7bbb1c746fdef619af0c11d12f0ced422590734e1743f804f9'}]}, 'timestamp': '2025-11-28 09:54:00.769976', '_unique_id': '635f90760c924b3a86f65384f1ccb5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d4a3b29-24d0-47e6-adb6-82dd20ec9dac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.772048', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a4682ea-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'a8cdf5a827b8a827eea00c9c761b8d9472faadad0b04ec58426f9b61896af75e'}]}, 'timestamp': '2025-11-28 09:54:00.772504', '_unique_id': '7c38f6b18a0e4af1907ad537d9373d84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.774 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '630715b6-4e45-434c-b674-224868582e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.774650', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a46e992-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'd7c4ad51bf9f96291c6313bdfc033fbcbbf1192a1c3dc43843f72a22d1373c98'}]}, 'timestamp': '2025-11-28 09:54:00.775159', '_unique_id': '2bd130fd9b6e4e10a0b4ba2e651c8514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '983b3d14-770a-454e-8832-adca62eab4ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.777299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a474f9a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '896b82cfd4fff681169b400d37c5c4c212e97897f7a2bfd270ea0fff5d74fded'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.777299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a475f62-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '7c68b4699c7024591a098138d70f02bfadaffd8166b91bc0f268ac4e199ae012'}]}, 'timestamp': '2025-11-28 09:54:00.778148', '_unique_id': 'ca7912c191d042b6b539e548a16e6ee6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f0743cb-3ff5-4c97-9efe-0423b60c33ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.780212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a47c146-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '70c15efe6ae0d8fb7d5aee7465278f2d3f9d7a35ead7730bc311aa0b4d921f8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.780212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a47d0d2-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'bca0cb3be00b108a4a37b92d84090e0c614f3dc4cf4c4f2b72c594c975f6720f'}]}, 'timestamp': '2025-11-28 09:54:00.781054', '_unique_id': 'e24ca6d9dc7c40f58f2125d997fed7df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd11d099a-586d-4d5b-a368-f615fc6856d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.783115', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a483306-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '86aecc758aa28ad709561a3b1fcf38d993ad88f5dc677fe8e9544f33a9d3bcf1'}]}, 'timestamp': '2025-11-28 09:54:00.783562', '_unique_id': 'dfce17da8df94e67b0462253256cd888'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5870e7ec-11c2-4da9-aafc-3d5c474275d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.786137', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a48a8fe-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '32d152f812d55a75a85642ae4f249d40ecc99cbbb118a71278001238c6159dbe'}]}, 'timestamp': '2025-11-28 09:54:00.786586', '_unique_id': 'e4198e37130c4afb8d38daba16160ef6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c68ae9f0-f2ca-42b4-8b2c-8182abdd8aab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.788617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a49097a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '6b0d412fd1953bedc1ccf21a4b6976db5856f5d651ce68d5669c08a583896ac3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.788617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a491d52-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '410c8314c289eccb7c98b836f66fb3aef50d9e659f435d00cde5505bf59f17e1'}]}, 'timestamp': '2025-11-28 09:54:00.789540', '_unique_id': 'c7c7687057d84e1ea7e1ca8023634e0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b3ac194-cad1-4273-bcb2-82edde00aa56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.791643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a497d74-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '858e46028ce69efaa65a44cea5406d1824b2917678deedbb7c28fd102f181c46'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.791643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a4987f6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '6c298a1438cbd043551ea0b84cf8a9a5e2ed7e46915162c0f2313ea659967fa4'}]}, 'timestamp': '2025-11-28 09:54:00.792188', '_unique_id': '9893a01b73264b1a8a359c82959a9870'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:54:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:54:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:54:02 np0005538513.localdomain podman[297541]: 2025-11-28 09:54:02.848342299 +0000 UTC m=+0.086601396 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:54:02 np0005538513.localdomain podman[297541]: 2025-11-28 09:54:02.864569869 +0000 UTC m=+0.102828966 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, name=ubi9-minimal)
Nov 28 09:54:02 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:54:03 np0005538513.localdomain ceph-mon[292954]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:03.343 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:05.120 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:05 np0005538513.localdomain ceph-mon[292954]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:06 np0005538513.localdomain sudo[297561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:54:06 np0005538513.localdomain sudo[297561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:06 np0005538513.localdomain sudo[297561]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:54:06 np0005538513.localdomain podman[297579]: 2025-11-28 09:54:06.841627875 +0000 UTC m=+0.078951801 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:54:06 np0005538513.localdomain podman[297579]: 2025-11-28 09:54:06.880590765 +0000 UTC m=+0.117914691 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:54:06 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: from='client.27028 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:08.347 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:09 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Nov 28 09:54:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@3(peon) e11  my rank is now 2 (was 3)
Nov 28 09:54:09 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:54:09 np0005538513.localdomain ceph-mon[292954]: paxos.2).electionLogic(40) init, last seen epoch 40
Nov 28 09:54:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:09 np0005538513.localdomain sudo[297602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:54:09 np0005538513.localdomain sudo[297602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297602]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:54:09 np0005538513.localdomain sudo[297620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297620]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538513.localdomain sudo[297638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297638]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:09 np0005538513.localdomain sudo[297656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297656]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538513.localdomain sudo[297674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297674]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:09 np0005538513.localdomain sudo[297708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538513.localdomain sudo[297708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297708]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538513.localdomain sudo[297726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297726]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538513.localdomain sudo[297744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297744]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:54:09 np0005538513.localdomain sudo[297762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297762]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:54:09 np0005538513.localdomain sudo[297780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain sudo[297780]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538513.localdomain sudo[297798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:09 np0005538513.localdomain sudo[297798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:54:09 np0005538513.localdomain sudo[297798]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538513.localdomain sudo[297822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:10 np0005538513.localdomain sudo[297822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:54:10 np0005538513.localdomain sudo[297822]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538513.localdomain podman[297815]: 2025-11-28 09:54:10.036990893 +0000 UTC m=+0.081769528 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='client.44295 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538514"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Remove daemons mon.np0005538514
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Safe to remove mon.np0005538514: new quorum should be ['np0005538512', 'np0005538515', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538513'])
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Removing monitor np0005538514 from monmap...
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Removing daemon mon.np0005538514 from np0005538514.localdomain -- ports []
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538513 in quorum (ranks 0,1,2)
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: monmap epoch 11
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:54:09.028617+0000
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: mgrmap e29: np0005538515.yfkzhl(active, since 51s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:54:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:10 np0005538513.localdomain podman[297815]: 2025-11-28 09:54:10.069274516 +0000 UTC m=+0.114053191 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 09:54:10 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:54:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:54:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:54:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:54:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:54:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:10.123 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:10 np0005538513.localdomain sudo[297856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:10 np0005538513.localdomain sudo[297856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538513.localdomain sudo[297856]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538513.localdomain systemd[1]: tmp-crun.1HZGQi.mount: Deactivated successfully.
Nov 28 09:54:10 np0005538513.localdomain podman[297847]: 2025-11-28 09:54:10.190299672 +0000 UTC m=+0.146666516 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 09:54:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:54:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18727 "" "Go-http-client/1.1"
Nov 28 09:54:10 np0005538513.localdomain podman[297847]: 2025-11-28 09:54:10.25260906 +0000 UTC m=+0.208975924 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:54:10 np0005538513.localdomain sudo[297904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:10 np0005538513.localdomain sudo[297904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:54:10 np0005538513.localdomain sudo[297904]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538513.localdomain sudo[297928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:10 np0005538513.localdomain sudo[297928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538513.localdomain sudo[297928]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538513.localdomain sudo[297946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:10 np0005538513.localdomain sudo[297946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538513.localdomain sudo[297946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:11 np0005538513.localdomain sudo[297964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:54:11 np0005538513.localdomain sudo[297964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:11 np0005538513.localdomain sudo[297964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:12 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:54:12 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:54:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:54:12 np0005538513.localdomain podman[297982]: 2025-11-28 09:54:12.85325537 +0000 UTC m=+0.087166194 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm)
Nov 28 09:54:12 np0005538513.localdomain podman[297982]: 2025-11-28 09:54:12.86756351 +0000 UTC m=+0.101474294 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:54:12 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:54:13 np0005538513.localdomain sudo[298001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:13 np0005538513.localdomain sudo[298001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:13 np0005538513.localdomain sudo[298001]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:13 np0005538513.localdomain sudo[298019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:13 np0005538513.localdomain sudo[298019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:13.382 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/921452423' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:54:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/921452423' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:54:13 np0005538513.localdomain podman[298053]: 
Nov 28 09:54:13 np0005538513.localdomain podman[298053]: 2025-11-28 09:54:13.767164501 +0000 UTC m=+0.079167078 container create e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, build-date=2025-09-24T08:57:55, architecture=x86_64, distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:54:13 np0005538513.localdomain systemd[1]: Started libpod-conmon-e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73.scope.
Nov 28 09:54:13 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:13 np0005538513.localdomain podman[298053]: 2025-11-28 09:54:13.733775523 +0000 UTC m=+0.045778130 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:13 np0005538513.localdomain podman[298053]: 2025-11-28 09:54:13.836512286 +0000 UTC m=+0.148514863 container init e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Nov 28 09:54:13 np0005538513.localdomain podman[298053]: 2025-11-28 09:54:13.846429931 +0000 UTC m=+0.158432518 container start e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph)
Nov 28 09:54:13 np0005538513.localdomain podman[298053]: 2025-11-28 09:54:13.846745551 +0000 UTC m=+0.158748128 container attach e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, name=rhceph)
Nov 28 09:54:13 np0005538513.localdomain competent_williams[298068]: 167 167
Nov 28 09:54:13 np0005538513.localdomain systemd[1]: libpod-e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73.scope: Deactivated successfully.
Nov 28 09:54:13 np0005538513.localdomain podman[298053]: 2025-11-28 09:54:13.853205119 +0000 UTC m=+0.165207706 container died e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 28 09:54:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-237c3de389c3fb5b1f0547dcdf14e999c29be0c7e8234a28c5b45ddf7b885af5-merged.mount: Deactivated successfully.
Nov 28 09:54:13 np0005538513.localdomain podman[298074]: 2025-11-28 09:54:13.951781244 +0000 UTC m=+0.088419343 container remove e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, distribution-scope=public, name=rhceph, version=7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.)
Nov 28 09:54:13 np0005538513.localdomain systemd[1]: libpod-conmon-e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73.scope: Deactivated successfully.
Nov 28 09:54:14 np0005538513.localdomain sudo[298019]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:14 np0005538513.localdomain sudo[298090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:14 np0005538513.localdomain sudo[298090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:14 np0005538513.localdomain sudo[298090]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:14 np0005538513.localdomain sudo[298108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:14 np0005538513.localdomain sudo[298108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.564720) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654564771, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1326, "num_deletes": 252, "total_data_size": 2258697, "memory_usage": 2295448, "flush_reason": "Manual Compaction"}
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654575769, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1301650, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13559, "largest_seqno": 14880, "table_properties": {"data_size": 1295703, "index_size": 3097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15968, "raw_average_key_size": 22, "raw_value_size": 1282660, "raw_average_value_size": 1798, "num_data_blocks": 132, "num_entries": 713, "num_filter_entries": 713, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323624, "oldest_key_time": 1764323624, "file_creation_time": 1764323654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 11094 microseconds, and 4712 cpu microseconds.
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.575813) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1301650 bytes OK
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.575837) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.577977) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.577998) EVENT_LOG_v1 {"time_micros": 1764323654577992, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.578039) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2251827, prev total WAL file size 2251827, number of live WAL files 2.
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.578719) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1271KB)], [21(16MB)]
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654578787, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19006367, "oldest_snapshot_seqno": -1}
Nov 28 09:54:14 np0005538513.localdomain podman[298144]: 
Nov 28 09:54:14 np0005538513.localdomain podman[298144]: 2025-11-28 09:54:14.702831051 +0000 UTC m=+0.103210707 container create 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, io.buildah.version=1.33.12)
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10398 keys, 15806830 bytes, temperature: kUnknown
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654703561, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15806830, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15747383, "index_size": 32338, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 280230, "raw_average_key_size": 26, "raw_value_size": 15569547, "raw_average_value_size": 1497, "num_data_blocks": 1224, "num_entries": 10398, "num_filter_entries": 10398, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.703906) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15806830 bytes
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.705872) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.2 rd, 126.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.9 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(26.7) write-amplify(12.1) OK, records in: 10934, records dropped: 536 output_compression: NoCompression
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.705916) EVENT_LOG_v1 {"time_micros": 1764323654705902, "job": 10, "event": "compaction_finished", "compaction_time_micros": 124888, "compaction_time_cpu_micros": 41269, "output_level": 6, "num_output_files": 1, "total_output_size": 15806830, "num_input_records": 10934, "num_output_records": 10398, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654706291, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654708808, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.578658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538513.localdomain systemd[1]: Started libpod-conmon-5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c.scope.
Nov 28 09:54:14 np0005538513.localdomain podman[298144]: 2025-11-28 09:54:14.646826348 +0000 UTC m=+0.047206034 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:14 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:14 np0005538513.localdomain podman[298144]: 2025-11-28 09:54:14.762755337 +0000 UTC m=+0.163134993 container init 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:54:14 np0005538513.localdomain podman[298144]: 2025-11-28 09:54:14.773277891 +0000 UTC m=+0.173657537 container start 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Nov 28 09:54:14 np0005538513.localdomain podman[298144]: 2025-11-28 09:54:14.773538309 +0000 UTC m=+0.173917995 container attach 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=)
Nov 28 09:54:14 np0005538513.localdomain strange_antonelli[298159]: 167 167
Nov 28 09:54:14 np0005538513.localdomain systemd[1]: libpod-5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c.scope: Deactivated successfully.
Nov 28 09:54:14 np0005538513.localdomain podman[298144]: 2025-11-28 09:54:14.776255842 +0000 UTC m=+0.176635548 container died 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, name=rhceph, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:54:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c4acb01f3f7a20eabb74f1ed2296f5b8bb3690823138adaa5f56b9ba4d6a34d0-merged.mount: Deactivated successfully.
Nov 28 09:54:14 np0005538513.localdomain podman[298164]: 2025-11-28 09:54:14.875139276 +0000 UTC m=+0.089461875 container remove 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:54:14 np0005538513.localdomain systemd[1]: libpod-conmon-5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c.scope: Deactivated successfully.
Nov 28 09:54:15 np0005538513.localdomain sudo[298108]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:15.157 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:15 np0005538513.localdomain sudo[298187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:15 np0005538513.localdomain sudo[298187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:15 np0005538513.localdomain sudo[298187]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:15 np0005538513.localdomain sudo[298205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:15 np0005538513.localdomain sudo[298205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:15 np0005538513.localdomain ceph-mon[292954]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:15 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:54:15 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:54:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:54:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:15 np0005538513.localdomain podman[298241]: 
Nov 28 09:54:15 np0005538513.localdomain podman[298241]: 2025-11-28 09:54:15.724628574 +0000 UTC m=+0.077570069 container create 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, release=553, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=)
Nov 28 09:54:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713.scope.
Nov 28 09:54:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:15 np0005538513.localdomain podman[298241]: 2025-11-28 09:54:15.692384792 +0000 UTC m=+0.045326317 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:15 np0005538513.localdomain podman[298241]: 2025-11-28 09:54:15.793419272 +0000 UTC m=+0.146360767 container init 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, ceph=True, vendor=Red Hat, Inc.)
Nov 28 09:54:15 np0005538513.localdomain podman[298241]: 2025-11-28 09:54:15.802939295 +0000 UTC m=+0.155880790 container start 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, ceph=True, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:54:15 np0005538513.localdomain podman[298241]: 2025-11-28 09:54:15.803341137 +0000 UTC m=+0.156282632 container attach 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main)
Nov 28 09:54:15 np0005538513.localdomain happy_liskov[298256]: 167 167
Nov 28 09:54:15 np0005538513.localdomain systemd[1]: libpod-60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713.scope: Deactivated successfully.
Nov 28 09:54:15 np0005538513.localdomain podman[298241]: 2025-11-28 09:54:15.805870565 +0000 UTC m=+0.158812100 container died 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, release=553, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Nov 28 09:54:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-50ce6c0051a6e4efdccd421a571d7e5ee3041d4ffb448cc5e6171193994b5c51-merged.mount: Deactivated successfully.
Nov 28 09:54:15 np0005538513.localdomain podman[298261]: 2025-11-28 09:54:15.903515631 +0000 UTC m=+0.089097814 container remove 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.expose-services=, release=553, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Nov 28 09:54:15 np0005538513.localdomain systemd[1]: libpod-conmon-60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713.scope: Deactivated successfully.
Nov 28 09:54:16 np0005538513.localdomain sudo[298205]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:16 np0005538513.localdomain sudo[298284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:16 np0005538513.localdomain sudo[298284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:16 np0005538513.localdomain sudo[298284]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:16 np0005538513.localdomain sudo[298302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:16 np0005538513.localdomain sudo[298302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:16 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:54:16 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:54:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:16 np0005538513.localdomain podman[298336]: 
Nov 28 09:54:16 np0005538513.localdomain podman[298336]: 2025-11-28 09:54:16.763513062 +0000 UTC m=+0.078518698 container create 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, vcs-type=git, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=)
Nov 28 09:54:16 np0005538513.localdomain systemd[1]: Started libpod-conmon-254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d.scope.
Nov 28 09:54:16 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:16 np0005538513.localdomain podman[298336]: 2025-11-28 09:54:16.822311532 +0000 UTC m=+0.137317188 container init 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main)
Nov 28 09:54:16 np0005538513.localdomain podman[298336]: 2025-11-28 09:54:16.732122456 +0000 UTC m=+0.047128162 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:16 np0005538513.localdomain podman[298336]: 2025-11-28 09:54:16.831901637 +0000 UTC m=+0.146907283 container start 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Nov 28 09:54:16 np0005538513.localdomain podman[298336]: 2025-11-28 09:54:16.832272419 +0000 UTC m=+0.147278065 container attach 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, vcs-type=git, io.buildah.version=1.33.12)
Nov 28 09:54:16 np0005538513.localdomain mystifying_jang[298351]: 167 167
Nov 28 09:54:16 np0005538513.localdomain systemd[1]: libpod-254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d.scope: Deactivated successfully.
Nov 28 09:54:16 np0005538513.localdomain podman[298336]: 2025-11-28 09:54:16.834644761 +0000 UTC m=+0.149650437 container died 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, architecture=x86_64, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:54:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-9f58a47facb36c21493d1508eb5bb6f7be5ec7366f6ff3fc1623af885ed7ae8e-merged.mount: Deactivated successfully.
Nov 28 09:54:16 np0005538513.localdomain podman[298356]: 2025-11-28 09:54:16.9353128 +0000 UTC m=+0.088132433 container remove 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, RELEASE=main, ceph=True, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, version=7, vcs-type=git, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container)
Nov 28 09:54:16 np0005538513.localdomain systemd[1]: libpod-conmon-254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d.scope: Deactivated successfully.
Nov 28 09:54:16 np0005538513.localdomain sudo[298302]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:17 np0005538513.localdomain sudo[298372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:17 np0005538513.localdomain sudo[298372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:17 np0005538513.localdomain sudo[298372]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:17 np0005538513.localdomain sudo[298390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:17 np0005538513.localdomain sudo[298390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:17 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:54:17 np0005538513.localdomain podman[298426]: 
Nov 28 09:54:17 np0005538513.localdomain podman[298426]: 2025-11-28 09:54:17.642367084 +0000 UTC m=+0.079895640 container create 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:54:17 np0005538513.localdomain systemd[1]: Started libpod-conmon-5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162.scope.
Nov 28 09:54:17 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:17 np0005538513.localdomain podman[298426]: 2025-11-28 09:54:17.699615016 +0000 UTC m=+0.137143572 container init 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:54:17 np0005538513.localdomain podman[298426]: 2025-11-28 09:54:17.708096147 +0000 UTC m=+0.145624703 container start 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:54:17 np0005538513.localdomain podman[298426]: 2025-11-28 09:54:17.708378256 +0000 UTC m=+0.145906842 container attach 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True)
Nov 28 09:54:17 np0005538513.localdomain practical_mcclintock[298441]: 167 167
Nov 28 09:54:17 np0005538513.localdomain systemd[1]: libpod-5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162.scope: Deactivated successfully.
Nov 28 09:54:17 np0005538513.localdomain podman[298426]: 2025-11-28 09:54:17.710704498 +0000 UTC m=+0.148233084 container died 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:54:17 np0005538513.localdomain podman[298426]: 2025-11-28 09:54:17.612582607 +0000 UTC m=+0.050111223 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:17 np0005538513.localdomain podman[298446]: 2025-11-28 09:54:17.800398498 +0000 UTC m=+0.081923942 container remove 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:54:17 np0005538513.localdomain systemd[1]: libpod-conmon-5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162.scope: Deactivated successfully.
Nov 28 09:54:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b0a1bf3430418ba81dfed06a8dab4ddfa964857e39d5d37e3d8d14731d168ac0-merged.mount: Deactivated successfully.
Nov 28 09:54:17 np0005538513.localdomain sudo[298390]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:54:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:54:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:54:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:54:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:54:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:18.388 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:54:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:20.160 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:20 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:54:20 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:54:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:54:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:54:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:54:21 np0005538513.localdomain podman[298463]: 2025-11-28 09:54:21.860247992 +0000 UTC m=+0.093591622 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:54:21 np0005538513.localdomain podman[298464]: 2025-11-28 09:54:21.907967041 +0000 UTC m=+0.139377991 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 09:54:21 np0005538513.localdomain podman[298464]: 2025-11-28 09:54:21.917623408 +0000 UTC m=+0.149034288 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 09:54:21 np0005538513.localdomain podman[298463]: 2025-11-28 09:54:21.924986794 +0000 UTC m=+0.158330444 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:54:21 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:54:21 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='client.44312 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538514.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:22 np0005538513.localdomain ceph-mon[292954]: Deploying daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:54:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:23.430 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:25.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538513.localdomain sudo[298506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:27 np0005538513.localdomain sudo[298506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:27 np0005538513.localdomain sudo[298506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:27 np0005538513.localdomain sudo[298524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:54:27 np0005538513.localdomain sudo[298524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:28 np0005538513.localdomain sudo[298524]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:28.431 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:28 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:54:28 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:54:28 np0005538513.localdomain ceph-mon[292954]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:30.196 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:30 np0005538513.localdomain sudo[298575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:54:30 np0005538513.localdomain sudo[298575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:30 np0005538513.localdomain sudo[298575]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:31 np0005538513.localdomain ceph-mon[292954]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:31.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:31.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:54:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:32.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:32.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:32.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:33 np0005538513.localdomain ceph-mon[292954]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:33.470 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:54:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:33 np0005538513.localdomain podman[298593]: 2025-11-28 09:54:33.92053112 +0000 UTC m=+0.150960717 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, version=9.6)
Nov 28 09:54:33 np0005538513.localdomain podman[298593]: 2025-11-28 09:54:33.933657844 +0000 UTC m=+0.164087401 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:54:33 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:54:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:34.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:35.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:35 np0005538513.localdomain ceph-mon[292954]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2905538660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Nov 28 09:54:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/11160684' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/4208687966' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/738597505' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/11160684' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:36.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:36.886 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:54:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:36.886 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:54:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:36.887 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:54:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:36.887 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:54:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:36.887 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4278207600' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.347 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.421 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.422 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1285419659' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4278207600' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:37 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: paxos.2).electionLogic(42) init, last seen epoch 42
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.641 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.643 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11785MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.643 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.734 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.735 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.736 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:54:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:54:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:37.801 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:54:37 np0005538513.localdomain podman[298635]: 2025-11-28 09:54:37.849008174 +0000 UTC m=+0.081268843 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:54:37 np0005538513.localdomain podman[298635]: 2025-11-28 09:54:37.886878799 +0000 UTC m=+0.119139518 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:54:37 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:54:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:38.503 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:54:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:54:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:54:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:54:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:54:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18728 "" "Go-http-client/1.1"
Nov 28 09:54:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:40.235 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:54:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:54:40 np0005538513.localdomain podman[298667]: 2025-11-28 09:54:40.853281927 +0000 UTC m=+0.088326589 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 09:54:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:40 np0005538513.localdomain podman[298668]: 2025-11-28 09:54:40.941786592 +0000 UTC m=+0.174159612 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 28 09:54:40 np0005538513.localdomain podman[298667]: 2025-11-28 09:54:40.946379313 +0000 UTC m=+0.181423945 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:54:40 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:54:40 np0005538513.localdomain podman[298668]: 2025-11-28 09:54:40.976611753 +0000 UTC m=+0.208984813 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 28 09:54:40 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:54:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: paxos.2).electionLogic(43) init, last seen epoch 43, mid-election, bumping
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538513,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: monmap epoch 12
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:54:37.617923+0000
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: mgrmap e29: np0005538515.yfkzhl(active, since 84s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:54:42 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:54:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:43.526 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:54:43 np0005538513.localdomain podman[298712]: 2025-11-28 09:54:43.847446321 +0000 UTC m=+0.081948833 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 28 09:54:43 np0005538513.localdomain podman[298712]: 2025-11-28 09:54:43.859876563 +0000 UTC m=+0.094379025 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:54:43 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='client.44342 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: Reconfig service osd.default_drive_group
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3652519406' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:44.330 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:54:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:44.336 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:54:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:44.378 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:54:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:44.383 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:54:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:44.383 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 e89: 6 total, 6 up, 6 in
Nov 28 09:54:44 np0005538513.localdomain sshd[295359]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:54:44 np0005538513.localdomain systemd[1]: session-68.scope: Deactivated successfully.
Nov 28 09:54:44 np0005538513.localdomain systemd[1]: session-68.scope: Consumed 18.260s CPU time.
Nov 28 09:54:44 np0005538513.localdomain systemd-logind[764]: Session 68 logged out. Waiting for processes to exit.
Nov 28 09:54:44 np0005538513.localdomain systemd-logind[764]: Removed session 68.
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3652519406' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1122335753' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: Activating manager daemon np0005538511.fvuybw
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: osdmap e89: 6 total, 6 up, 6 in
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1122335753' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:54:44 np0005538513.localdomain ceph-mon[292954]: mgrmap e30: np0005538511.fvuybw(active, starting, since 0.0525983s), standbys: np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:54:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:45.259 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.385 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.385 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.386 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.494 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.494 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.495 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.495 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.833 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.860 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:54:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:47.861 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:54:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:54:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:54:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:54:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:54:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:54:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:54:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:48.529 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:50.262 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:50 np0005538513.localdomain ceph-mon[292954]: Standby manager daemon np0005538515.yfkzhl started
Nov 28 09:54:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:54:50.833 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:54:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:54:50.834 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:54:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:54:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:54:51 np0005538513.localdomain ceph-mon[292954]: mgrmap e31: np0005538511.fvuybw(active, starting, since 5s), standbys: np0005538513.dsfdlx, np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:54:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:54:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:54:52 np0005538513.localdomain systemd[1]: tmp-crun.9ZEreo.mount: Deactivated successfully.
Nov 28 09:54:52 np0005538513.localdomain podman[298742]: 2025-11-28 09:54:52.851731403 +0000 UTC m=+0.087502304 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:54:52 np0005538513.localdomain podman[298742]: 2025-11-28 09:54:52.857509601 +0000 UTC m=+0.093280452 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 09:54:52 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:54:52 np0005538513.localdomain podman[298741]: 2025-11-28 09:54:52.828074735 +0000 UTC m=+0.069576053 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:54:52 np0005538513.localdomain podman[298741]: 2025-11-28 09:54:52.907628273 +0000 UTC m=+0.149129621 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:54:52 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:54:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:53.574 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:54 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 1002...
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Activating special unit Exit the Session...
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Removed slice User Background Tasks Slice.
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Stopped target Main User Target.
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Stopped target Basic System.
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Stopped target Paths.
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Stopped target Sockets.
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Stopped target Timers.
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:54:54 np0005538513.localdomain systemd[26286]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:54:55 np0005538513.localdomain systemd[26286]: Closed D-Bus User Message Bus Socket.
Nov 28 09:54:55 np0005538513.localdomain systemd[26286]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:54:55 np0005538513.localdomain systemd[26286]: Removed slice User Application Slice.
Nov 28 09:54:55 np0005538513.localdomain systemd[26286]: Reached target Shutdown.
Nov 28 09:54:55 np0005538513.localdomain systemd[26286]: Finished Exit the Session.
Nov 28 09:54:55 np0005538513.localdomain systemd[26286]: Reached target Exit the Session.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: user@1002.service: Deactivated successfully.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 1002.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: user@1002.service: Consumed 14.236s CPU time, read 0B from disk, written 7.0K to disk.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1002...
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: run-user-1002.mount: Deactivated successfully.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: user-runtime-dir@1002.service: Deactivated successfully.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1002.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 1002.
Nov 28 09:54:55 np0005538513.localdomain systemd[1]: user-1002.slice: Consumed 4min 44.203s CPU time.
Nov 28 09:54:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:55.287 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:54:58.577 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:54:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:00.290 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:03.616 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:55:04 np0005538513.localdomain podman[298784]: 2025-11-28 09:55:04.852434667 +0000 UTC m=+0.086331108 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Nov 28 09:55:04 np0005538513.localdomain podman[298784]: 2025-11-28 09:55:04.866325165 +0000 UTC m=+0.100221586 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:55:04 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:55:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:05.329 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:08.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:55:08 np0005538513.localdomain podman[298804]: 2025-11-28 09:55:08.848718058 +0000 UTC m=+0.086908327 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:55:08 np0005538513.localdomain podman[298804]: 2025-11-28 09:55:08.862388687 +0000 UTC m=+0.100578956 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:55:08 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:55:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:55:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:55:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:55:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:55:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:55:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18727 "" "Go-http-client/1.1"
Nov 28 09:55:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:10.332 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:55:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:55:11 np0005538513.localdomain podman[298828]: 2025-11-28 09:55:11.853879448 +0000 UTC m=+0.087228006 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:55:11 np0005538513.localdomain podman[298828]: 2025-11-28 09:55:11.862296287 +0000 UTC m=+0.095644835 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:55:11 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:55:11 np0005538513.localdomain podman[298827]: 2025-11-28 09:55:11.951327738 +0000 UTC m=+0.188318577 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:55:11 np0005538513.localdomain podman[298827]: 2025-11-28 09:55:11.990416531 +0000 UTC m=+0.227407350 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 09:55:12 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:55:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 09:55:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:55:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 09:55:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:55:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:55:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:55:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:13.661 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:55:14 np0005538513.localdomain podman[298868]: 2025-11-28 09:55:14.852652954 +0000 UTC m=+0.088525636 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 09:55:14 np0005538513.localdomain podman[298868]: 2025-11-28 09:55:14.865466308 +0000 UTC m=+0.101338990 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:55:14 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:55:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:15.372 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:55:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:55:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:55:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:18.666 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 e90: 6 total, 6 up, 6 in
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr handle_mgr_map Activating!
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr handle_mgr_map I am now activating
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: Activating manager daemon np0005538513.dsfdlx
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: Manager daemon np0005538511.fvuybw is unresponsive, replacing it with standby daemon np0005538513.dsfdlx
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: mgrmap e32: np0005538513.dsfdlx(active, starting, since 0.0452346s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:55:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: balancer
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] Starting
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] Optimize plan auto_2025-11-28_09:55:19
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [cephadm WARNING root] removing stray HostCache host record np0005538511.localdomain.devices.0
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005538511.localdomain.devices.0
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: cephadm
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: crash
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: devicehealth
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: iostat
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [devicehealth INFO root] Starting
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: nfs
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: orchestrator
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: pg_autoscaler
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: progress
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Loading...
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7fc4c575ee20>, <progress.module.GhostEvent object at 0x7fc4c575ee80>, <progress.module.GhostEvent object at 0x7fc4c575eeb0>, <progress.module.GhostEvent object at 0x7fc4c575eee0>, <progress.module.GhostEvent object at 0x7fc4c575ef10>, <progress.module.GhostEvent object at 0x7fc4c575ef40>, <progress.module.GhostEvent object at 0x7fc4c575ef70>, <progress.module.GhostEvent object at 0x7fc4c575efa0>, <progress.module.GhostEvent object at 0x7fc4c575efd0>, <progress.module.GhostEvent object at 0x7fc4c576a040>, <progress.module.GhostEvent object at 0x7fc4c576a070>, <progress.module.GhostEvent object at 0x7fc4c576a0a0>, <progress.module.GhostEvent object at 0x7fc4c576a0d0>, <progress.module.GhostEvent object at 0x7fc4c576a100>, <progress.module.GhostEvent object at 0x7fc4c576a130>, <progress.module.GhostEvent object at 0x7fc4c576a160>, <progress.module.GhostEvent object at 0x7fc4c576a190>, <progress.module.GhostEvent object at 0x7fc4c576a1c0>, <progress.module.GhostEvent object at 0x7fc4c576a1f0>, <progress.module.GhostEvent object at 0x7fc4c576a220>, <progress.module.GhostEvent object at 0x7fc4c576a250>, <progress.module.GhostEvent object at 0x7fc4c576a280>, <progress.module.GhostEvent object at 0x7fc4c576a2b0>, <progress.module.GhostEvent object at 0x7fc4c576a2e0>, <progress.module.GhostEvent object at 0x7fc4c576a310>, <progress.module.GhostEvent object at 0x7fc4c576a340>, <progress.module.GhostEvent object at 0x7fc4c576a370>, <progress.module.GhostEvent object at 0x7fc4c576a3a0>, <progress.module.GhostEvent object at 0x7fc4c576a3d0>, <progress.module.GhostEvent object at 0x7fc4c576a400>, <progress.module.GhostEvent object at 0x7fc4c576a430>, <progress.module.GhostEvent object at 0x7fc4c576a460>, <progress.module.GhostEvent object at 0x7fc4c576a490>, <progress.module.GhostEvent object at 0x7fc4c576a4c0>, <progress.module.GhostEvent object at 0x7fc4c576a4f0>, <progress.module.GhostEvent object at 0x7fc4c576a520>, <progress.module.GhostEvent object at 0x7fc4c576a550>, <progress.module.GhostEvent object at 0x7fc4c576a580>, <progress.module.GhostEvent object at 0x7fc4c576a5b0>, <progress.module.GhostEvent object at 0x7fc4c576a5e0>, <progress.module.GhostEvent object at 0x7fc4c576a610>, <progress.module.GhostEvent object at 0x7fc4c576a640>, <progress.module.GhostEvent object at 0x7fc4c576a670>, <progress.module.GhostEvent object at 0x7fc4c576a6a0>, <progress.module.GhostEvent object at 0x7fc4c576a6d0>, <progress.module.GhostEvent object at 0x7fc4c576a700>, <progress.module.GhostEvent object at 0x7fc4c576a730>, <progress.module.GhostEvent object at 0x7fc4c576a760>, <progress.module.GhostEvent object at 0x7fc4c576a790>, <progress.module.GhostEvent object at 0x7fc4c576a7c0>] historic events
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Loaded OSDMap, ready.
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] recovery thread starting
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] starting setup
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: rbd_support
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: restful
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: status
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [restful INFO root] server_addr: :: server_port: 8003
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: telemetry
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [restful WARNING root] server not running: no certificate configured
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: mgr load Constructed class from module: volumes
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] PerfHandler: starting
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_task_task: vms, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_task_task: volumes, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_task_task: images, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_task_task: backups, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] TaskHandler: starting
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Nov 28 09:55:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] setup complete
Nov 28 09:55:19 np0005538513.localdomain sshd[299027]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:55:20 np0005538513.localdomain sshd[299027]: Accepted publickey for ceph-admin from 192.168.122.106 port 47862 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:55:20 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 1002.
Nov 28 09:55:20 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 28 09:55:20 np0005538513.localdomain systemd-logind[764]: New session 69 of user ceph-admin.
Nov 28 09:55:20 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 28 09:55:20 np0005538513.localdomain systemd[1]: Starting User Manager for UID 1002...
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Queued start job for default target Main User Target.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Created slice User Application Slice.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Reached target Paths.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Reached target Timers.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Starting D-Bus User Message Bus Socket...
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Starting Create User's Volatile Files and Directories...
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Reached target Sockets.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Finished Create User's Volatile Files and Directories.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Reached target Basic System.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Reached target Main User Target.
Nov 28 09:55:20 np0005538513.localdomain systemd[299031]: Startup finished in 159ms.
Nov 28 09:55:20 np0005538513.localdomain systemd[1]: Started User Manager for UID 1002.
Nov 28 09:55:20 np0005538513.localdomain systemd[1]: Started Session 69 of User ceph-admin.
Nov 28 09:55:20 np0005538513.localdomain sshd[299027]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:55:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:20.375 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:20 np0005538513.localdomain sudo[299046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:20 np0005538513.localdomain sudo[299046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:20 np0005538513.localdomain sudo[299046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:20 np0005538513.localdomain sudo[299064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:55:20 np0005538513.localdomain sudo[299064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: Manager daemon np0005538513.dsfdlx is now available
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: removing stray HostCache host record np0005538511.localdomain.devices.0
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"}]': finished
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"}]': finished
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538513.dsfdlx/mirror_snapshot_schedule"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538513.dsfdlx/trash_purge_schedule"} : dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44366 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:20 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Bus STARTING
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Bus STARTING
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Serving on http://172.18.0.106:8765
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Serving on http://172.18.0.106:8765
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Serving on https://172.18.0.106:7150
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Serving on https://172.18.0.106:7150
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Bus STARTED
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Bus STARTED
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Client ('172.18.0.106', 60952) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Client ('172.18.0.106', 60952) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:55:21 np0005538513.localdomain podman[299175]: 2025-11-28 09:55:21.388947969 +0000 UTC m=+0.085193773 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:55:21 np0005538513.localdomain podman[299175]: 2025-11-28 09:55:21.519537059 +0000 UTC m=+0.215782923 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7)
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:21 np0005538513.localdomain ceph-mon[292954]: mgrmap e33: np0005538513.dsfdlx(active, since 1.13065s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:21 np0005538513.localdomain ceph-mon[292954]: from='client.44366 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:21 np0005538513.localdomain ceph-mon[292954]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:21 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Bus STARTING
Nov 28 09:55:21 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Serving on http://172.18.0.106:8765
Nov 28 09:55:21 np0005538513.localdomain ceph-mgr[286105]: [devicehealth INFO root] Check health
Nov 28 09:55:22 np0005538513.localdomain sudo[299064]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:22 np0005538513.localdomain sudo[299303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:22 np0005538513.localdomain sudo[299303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:22 np0005538513.localdomain sudo[299303]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:22 np0005538513.localdomain sudo[299321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:55:22 np0005538513.localdomain sudo[299321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:22 np0005538513.localdomain sudo[299321]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Serving on https://172.18.0.106:7150
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Bus STARTED
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Client ('172.18.0.106', 60952) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:23 np0005538513.localdomain sudo[299370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:23 np0005538513.localdomain sudo[299370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:55:23 np0005538513.localdomain sudo[299370]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:55:23 np0005538513.localdomain sudo[299395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:55:23 np0005538513.localdomain sudo[299395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:23 np0005538513.localdomain podman[299389]: 2025-11-28 09:55:23.173581992 +0000 UTC m=+0.070841702 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 28 09:55:23 np0005538513.localdomain podman[299389]: 2025-11-28 09:55:23.189328397 +0000 UTC m=+0.086588047 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:55:23 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:55:23 np0005538513.localdomain systemd[1]: tmp-crun.TUDxKu.mount: Deactivated successfully.
Nov 28 09:55:23 np0005538513.localdomain podman[299388]: 2025-11-28 09:55:23.288292663 +0000 UTC m=+0.182101327 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:55:23 np0005538513.localdomain podman[299388]: 2025-11-28 09:55:23.297910088 +0000 UTC m=+0.191718752 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:55:23 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:55:23 np0005538513.localdomain sudo[299395]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44426 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:23.704 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:23 np0005538513.localdomain sudo[299466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:55:23 np0005538513.localdomain sudo[299466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:23 np0005538513.localdomain sudo[299466]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:23 np0005538513.localdomain sudo[299484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:55:23 np0005538513.localdomain sudo[299484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:23 np0005538513.localdomain sudo[299484]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538513.localdomain sudo[299502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299502]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:24 np0005538513.localdomain sudo[299520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299520]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538513.localdomain sudo[299538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299538]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: mgrmap e34: np0005538513.dsfdlx(active, since 3s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='client.44426 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:24 np0005538513.localdomain sudo[299572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299572]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538513.localdomain sudo[299590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299590]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain sudo[299608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain sudo[299608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299608]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:24 np0005538513.localdomain sudo[299626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:24 np0005538513.localdomain sudo[299626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299626]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:24 np0005538513.localdomain sudo[299644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299644]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:24 np0005538513.localdomain sudo[299662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299662]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:24 np0005538513.localdomain sudo[299680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299680]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538513.localdomain sudo[299698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:24 np0005538513.localdomain sudo[299698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538513.localdomain sudo[299698]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain sudo[299732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:25 np0005538513.localdomain sudo[299732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299732]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44434 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538514", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:25 np0005538513.localdomain sudo[299750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:25 np0005538513.localdomain sudo[299750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299750]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain sudo[299768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538513.localdomain sudo[299768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299768]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain sudo[299786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:55:25 np0005538513.localdomain sudo[299786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299786]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain sudo[299804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:55:25 np0005538513.localdomain sudo[299804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:25.425 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:25 np0005538513.localdomain sudo[299804]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain sudo[299822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538513.localdomain sudo[299822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299822]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain sudo[299840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:25 np0005538513.localdomain sudo[299840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299840]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: from='client.44434 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538514", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:25 np0005538513.localdomain sudo[299858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538513.localdomain sudo[299858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain sudo[299892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538513.localdomain sudo[299892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299892]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain sudo[299910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538513.localdomain sudo[299910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain sudo[299910]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain sudo[299928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain sudo[299928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain sudo[299928]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538513.localdomain sudo[299946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:26 np0005538513.localdomain sudo[299946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[299946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain sudo[299964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:26 np0005538513.localdomain sudo[299964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[299964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain sudo[299982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538513.localdomain sudo[299982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[299982]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain sudo[300000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:26 np0005538513.localdomain sudo[300000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[300000]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain sudo[300018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538513.localdomain sudo[300018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[300018]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain sudo[300052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538513.localdomain sudo[300052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[300052]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain sudo[300070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538513.localdomain sudo[300070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[300070]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain sudo[300088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538513.localdomain sudo[300088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:26 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/363124451' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:55:26 np0005538513.localdomain sudo[300088]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev e15a826f-99af-4834-8906-4631a44eebd9 (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:55:26 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev e15a826f-99af-4834-8906-4631a44eebd9 (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:55:26 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event e15a826f-99af-4834-8906-4631a44eebd9 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 28 09:55:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:26.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:26.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:55:26 np0005538513.localdomain sudo[300106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:55:26 np0005538513.localdomain sudo[300106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538513.localdomain sudo[300106]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:27.030 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:55:27 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:55:27 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:55:27 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:55:27 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:55:27 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:55:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:28.711 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:28 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:29 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Nov 28 09:55:29 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 09:55:29 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:29 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:29 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:29 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:29 np0005538513.localdomain sudo[300124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:29 np0005538513.localdomain sudo[300124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:29 np0005538513.localdomain sudo[300124]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:30 np0005538513.localdomain sudo[300142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:30 np0005538513.localdomain sudo[300142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:30 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:30 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:30.427 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:30 np0005538513.localdomain podman[300176]: 
Nov 28 09:55:30 np0005538513.localdomain podman[300176]: 2025-11-28 09:55:30.535054625 +0000 UTC m=+0.087072001 container create dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.buildah.version=1.33.12, version=7, name=rhceph, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 09:55:30 np0005538513.localdomain systemd[1]: Started libpod-conmon-dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2.scope.
Nov 28 09:55:30 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:30 np0005538513.localdomain podman[300176]: 2025-11-28 09:55:30.498366236 +0000 UTC m=+0.050383662 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:30 np0005538513.localdomain podman[300176]: 2025-11-28 09:55:30.610379454 +0000 UTC m=+0.162396840 container init dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:55:30 np0005538513.localdomain podman[300176]: 2025-11-28 09:55:30.621726863 +0000 UTC m=+0.173744289 container start dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, version=7, architecture=x86_64, release=553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Nov 28 09:55:30 np0005538513.localdomain podman[300176]: 2025-11-28 09:55:30.622039792 +0000 UTC m=+0.174057178 container attach dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, build-date=2025-09-24T08:57:55, architecture=x86_64, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:55:30 np0005538513.localdomain systemd[1]: libpod-dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2.scope: Deactivated successfully.
Nov 28 09:55:30 np0005538513.localdomain sweet_khorana[300192]: 167 167
Nov 28 09:55:30 np0005538513.localdomain podman[300176]: 2025-11-28 09:55:30.629695758 +0000 UTC m=+0.181713174 container died dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:55:30 np0005538513.localdomain podman[300197]: 2025-11-28 09:55:30.736382232 +0000 UTC m=+0.091885559 container remove dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, io.openshift.expose-services=, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, architecture=x86_64, name=rhceph, GIT_BRANCH=main)
Nov 28 09:55:30 np0005538513.localdomain systemd[1]: libpod-conmon-dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2.scope: Deactivated successfully.
Nov 28 09:55:30 np0005538513.localdomain sudo[300142]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:30 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:30 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:30 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:30 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:30 np0005538513.localdomain sudo[300213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:30 np0005538513.localdomain sudo[300213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:30 np0005538513.localdomain sudo[300213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:30 np0005538513.localdomain sudo[300231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:30 np0005538513.localdomain sudo[300231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:31 np0005538513.localdomain podman[300267]: 
Nov 28 09:55:31 np0005538513.localdomain podman[300267]: 2025-11-28 09:55:31.438758441 +0000 UTC m=+0.059455891 container create 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, RELEASE=main, GIT_CLEAN=True, name=rhceph)
Nov 28 09:55:31 np0005538513.localdomain systemd[1]: Started libpod-conmon-73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926.scope.
Nov 28 09:55:31 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:31 np0005538513.localdomain podman[300267]: 2025-11-28 09:55:31.493100125 +0000 UTC m=+0.113797585 container init 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:55:31 np0005538513.localdomain podman[300267]: 2025-11-28 09:55:31.502006829 +0000 UTC m=+0.122704249 container start 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True)
Nov 28 09:55:31 np0005538513.localdomain podman[300267]: 2025-11-28 09:55:31.502158613 +0000 UTC m=+0.122856103 container attach 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Nov 28 09:55:31 np0005538513.localdomain ecstatic_joliot[300283]: 167 167
Nov 28 09:55:31 np0005538513.localdomain systemd[1]: libpod-73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926.scope: Deactivated successfully.
Nov 28 09:55:31 np0005538513.localdomain podman[300267]: 2025-11-28 09:55:31.406177049 +0000 UTC m=+0.026874569 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:31 np0005538513.localdomain podman[300267]: 2025-11-28 09:55:31.505740313 +0000 UTC m=+0.126437813 container died 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=553)
Nov 28 09:55:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f27fc19861619d301d8d9da9bc4725260477003f261db5aa6e45b0c10a4a570c-merged.mount: Deactivated successfully.
Nov 28 09:55:31 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0a5416d70a866570e4d156259e882aa2c63c88ec04bd8570ee301b56df1e63a5-merged.mount: Deactivated successfully.
Nov 28 09:55:31 np0005538513.localdomain podman[300288]: 2025-11-28 09:55:31.59822978 +0000 UTC m=+0.084920245 container remove 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.33.12)
Nov 28 09:55:31 np0005538513.localdomain systemd[1]: libpod-conmon-73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926.scope: Deactivated successfully.
Nov 28 09:55:31 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0)
Nov 28 09:55:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/577138193' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 09:55:31 np0005538513.localdomain sudo[300231]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:31 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:31 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:31 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:31 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:31 np0005538513.localdomain sudo[300311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:31 np0005538513.localdomain sudo[300311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:31 np0005538513.localdomain sudo[300311]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:32 np0005538513.localdomain sudo[300329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:32 np0005538513.localdomain sudo[300329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/577138193' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:55:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:32 np0005538513.localdomain podman[300364]: 
Nov 28 09:55:32 np0005538513.localdomain podman[300364]: 2025-11-28 09:55:32.464706782 +0000 UTC m=+0.080467298 container create 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55)
Nov 28 09:55:32 np0005538513.localdomain systemd[1]: Started libpod-conmon-8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96.scope.
Nov 28 09:55:32 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:32 np0005538513.localdomain podman[300364]: 2025-11-28 09:55:32.529558548 +0000 UTC m=+0.145319064 container init 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7)
Nov 28 09:55:32 np0005538513.localdomain podman[300364]: 2025-11-28 09:55:32.432364806 +0000 UTC m=+0.048125362 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:32 np0005538513.localdomain interesting_hamilton[300379]: 167 167
Nov 28 09:55:32 np0005538513.localdomain systemd[1]: libpod-8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96.scope: Deactivated successfully.
Nov 28 09:55:32 np0005538513.localdomain podman[300364]: 2025-11-28 09:55:32.541772473 +0000 UTC m=+0.157532989 container start 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=)
Nov 28 09:55:32 np0005538513.localdomain podman[300364]: 2025-11-28 09:55:32.542229777 +0000 UTC m=+0.157990293 container attach 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True)
Nov 28 09:55:32 np0005538513.localdomain podman[300364]: 2025-11-28 09:55:32.544199508 +0000 UTC m=+0.159960094 container died 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, name=rhceph)
Nov 28 09:55:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1c0fdc40bd1ebef23511b5cfa1cf80437a0ec2a4dcb9baa3d53189d5181a247d-merged.mount: Deactivated successfully.
Nov 28 09:55:32 np0005538513.localdomain podman[300384]: 2025-11-28 09:55:32.649427908 +0000 UTC m=+0.097703449 container remove 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, build-date=2025-09-24T08:57:55)
Nov 28 09:55:32 np0005538513.localdomain systemd[1]: libpod-conmon-8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96.scope: Deactivated successfully.
Nov 28 09:55:32 np0005538513.localdomain sudo[300329]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:32 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:55:32 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:55:32 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:55:32 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:55:32 np0005538513.localdomain sudo[300409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:32 np0005538513.localdomain sudo[300409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:32 np0005538513.localdomain sudo[300409]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:33 np0005538513.localdomain sudo[300427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:33 np0005538513.localdomain sudo[300427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:33.026 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:33.028 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:33.028 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:33.028 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:33 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.27172 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538512", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:33 np0005538513.localdomain podman[300462]: 2025-11-28 09:55:33.470537042 +0000 UTC m=+0.074672990 container create f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.33.12, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7)
Nov 28 09:55:33 np0005538513.localdomain systemd[1]: Started libpod-conmon-f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4.scope.
Nov 28 09:55:33 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:33 np0005538513.localdomain podman[300462]: 2025-11-28 09:55:33.442046045 +0000 UTC m=+0.046182013 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:33 np0005538513.localdomain podman[300462]: 2025-11-28 09:55:33.549923915 +0000 UTC m=+0.154059883 container init f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, build-date=2025-09-24T08:57:55, architecture=x86_64, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Nov 28 09:55:33 np0005538513.localdomain podman[300462]: 2025-11-28 09:55:33.560826441 +0000 UTC m=+0.164962409 container start f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:55:33 np0005538513.localdomain podman[300462]: 2025-11-28 09:55:33.561212553 +0000 UTC m=+0.165348521 container attach f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main)
Nov 28 09:55:33 np0005538513.localdomain condescending_diffie[300477]: 167 167
Nov 28 09:55:33 np0005538513.localdomain systemd[1]: libpod-f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4.scope: Deactivated successfully.
Nov 28 09:55:33 np0005538513.localdomain podman[300462]: 2025-11-28 09:55:33.563843414 +0000 UTC m=+0.167979412 container died f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., release=553, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph)
Nov 28 09:55:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-a10b28bbfa262a8941d71b00637f3759952fa6422790e7aa06ad2b2fba7e3b1f-merged.mount: Deactivated successfully.
Nov 28 09:55:33 np0005538513.localdomain podman[300482]: 2025-11-28 09:55:33.664167162 +0000 UTC m=+0.091334893 container remove f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph)
Nov 28 09:55:33 np0005538513.localdomain systemd[1]: libpod-conmon-f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4.scope: Deactivated successfully.
Nov 28 09:55:33 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:33 np0005538513.localdomain sudo[300427]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:33.746 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:33 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:55:33 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:55:33 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:55:33 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:55:33 np0005538513.localdomain sudo[300499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:33 np0005538513.localdomain sudo[300499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:33 np0005538513.localdomain sudo[300499]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:33 np0005538513.localdomain sudo[300517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:33 np0005538513.localdomain sudo[300517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:34 np0005538513.localdomain podman[300552]: 2025-11-28 09:55:34.552900428 +0000 UTC m=+0.085203693 container create 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, ceph=True, version=7, release=553)
Nov 28 09:55:34 np0005538513.localdomain systemd[1]: Started libpod-conmon-7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b.scope.
Nov 28 09:55:34 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:34 np0005538513.localdomain podman[300552]: 2025-11-28 09:55:34.518111628 +0000 UTC m=+0.050414933 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:34 np0005538513.localdomain podman[300552]: 2025-11-28 09:55:34.623839831 +0000 UTC m=+0.156143086 container init 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.tags=rhceph ceph, vcs-type=git)
Nov 28 09:55:34 np0005538513.localdomain systemd[1]: tmp-crun.XW9Ffo.mount: Deactivated successfully.
Nov 28 09:55:34 np0005538513.localdomain podman[300552]: 2025-11-28 09:55:34.640122393 +0000 UTC m=+0.172425648 container start 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main)
Nov 28 09:55:34 np0005538513.localdomain podman[300552]: 2025-11-28 09:55:34.640345839 +0000 UTC m=+0.172649084 container attach 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main)
Nov 28 09:55:34 np0005538513.localdomain relaxed_clarke[300566]: 167 167
Nov 28 09:55:34 np0005538513.localdomain systemd[1]: libpod-7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b.scope: Deactivated successfully.
Nov 28 09:55:34 np0005538513.localdomain podman[300552]: 2025-11-28 09:55:34.644093615 +0000 UTC m=+0.176396910 container died 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, version=7)
Nov 28 09:55:34 np0005538513.localdomain podman[300571]: 2025-11-28 09:55:34.75473126 +0000 UTC m=+0.098379869 container remove 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=)
Nov 28 09:55:34 np0005538513.localdomain systemd[1]: libpod-conmon-7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b.scope: Deactivated successfully.
Nov 28 09:55:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:34.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:34.773 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:34 np0005538513.localdomain sudo[300517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:55:34 np0005538513.localdomain sudo[300587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:34 np0005538513.localdomain sudo[300587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:55:34 np0005538513.localdomain sudo[300587]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.27180 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538512"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Remove daemons mon.np0005538512
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005538512
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005538512: new quorum should be ['np0005538515', 'np0005538513', 'np0005538514'] (from ['np0005538515', 'np0005538513', 'np0005538514'])
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005538512: new quorum should be ['np0005538515', 'np0005538513', 'np0005538514'] (from ['np0005538515', 'np0005538513', 'np0005538514'])
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005538512 from monmap...
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing monitor np0005538512 from monmap...
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005538512 from np0005538512.localdomain -- ports []
Nov 28 09:55:34 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005538512 from np0005538512.localdomain -- ports []
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@2(peon) e13  my rank is now 1 (was 2)
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: client.44410 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: client.27136 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 28 09:55:35 np0005538513.localdomain sudo[300606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:35 np0005538513.localdomain sudo[300606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:35 np0005538513.localdomain podman[300605]: 2025-11-28 09:55:35.109335006 +0000 UTC m=+0.141702403 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, config_id=edpm, name=ubi9-minimal)
Nov 28 09:55:35 np0005538513.localdomain podman[300605]: 2025-11-28 09:55:35.151411231 +0000 UTC m=+0.183778608 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:55:35 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: paxos.1).electionLogic(46) init, last seen epoch 46
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: from='client.27180 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538512"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Remove daemons mon.np0005538512
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Safe to remove mon.np0005538512: new quorum should be ['np0005538515', 'np0005538513', 'np0005538514'] (from ['np0005538515', 'np0005538513', 'np0005538514'])
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Removing monitor np0005538512 from monmap...
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: Removing daemon mon.np0005538512 from np0005538512.localdomain -- ports []
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 is new leader, mons np0005538515,np0005538513,np0005538514 in quorum (ranks 0,1,2)
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: monmap epoch 13
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:55:34.993934+0000
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: mgrmap e34: np0005538513.dsfdlx(active, since 15s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:35 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:55:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:35.469 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:35 np0005538513.localdomain podman[300661]: 
Nov 28 09:55:35 np0005538513.localdomain podman[300661]: 2025-11-28 09:55:35.530788828 +0000 UTC m=+0.099626287 container create 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, version=7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.openshift.tags=rhceph ceph)
Nov 28 09:55:35 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ad0314c617d127c457a5ff6a4b0215dfdbc82f2da6c7ad2e4a7fde20133ee188-merged.mount: Deactivated successfully.
Nov 28 09:55:35 np0005538513.localdomain systemd[1]: Started libpod-conmon-3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32.scope.
Nov 28 09:55:35 np0005538513.localdomain podman[300661]: 2025-11-28 09:55:35.49314735 +0000 UTC m=+0.061984869 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:35 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:35 np0005538513.localdomain podman[300661]: 2025-11-28 09:55:35.619565351 +0000 UTC m=+0.188402810 container init 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, GIT_CLEAN=True, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=553, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:55:35 np0005538513.localdomain podman[300661]: 2025-11-28 09:55:35.6292726 +0000 UTC m=+0.198110039 container start 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True)
Nov 28 09:55:35 np0005538513.localdomain podman[300661]: 2025-11-28 09:55:35.629539908 +0000 UTC m=+0.198377427 container attach 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Nov 28 09:55:35 np0005538513.localdomain objective_proskuriakova[300676]: 167 167
Nov 28 09:55:35 np0005538513.localdomain systemd[1]: libpod-3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32.scope: Deactivated successfully.
Nov 28 09:55:35 np0005538513.localdomain podman[300661]: 2025-11-28 09:55:35.633932823 +0000 UTC m=+0.202770312 container died 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., ceph=True, RELEASE=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main)
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:35 np0005538513.localdomain podman[300681]: 2025-11-28 09:55:35.747959444 +0000 UTC m=+0.099078152 container remove 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55)
Nov 28 09:55:35 np0005538513.localdomain systemd[1]: libpod-conmon-3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32.scope: Deactivated successfully.
Nov 28 09:55:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:35.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:55:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:35.794 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:35 np0005538513.localdomain sudo[300606]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:55:35 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:55:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b06601690913dcc22fcbac68e9cdb32bab2dc906ba0d9841942ade18446df865-merged.mount: Deactivated successfully.
Nov 28 09:55:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:36.807 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:36 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 28 09:55:36 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 28 09:55:36 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:55:36 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/758709823' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/36973857' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:55:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:36.867 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:55:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:36.868 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:55:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:36.868 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:55:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:36.869 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:55:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:36.869 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:55:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:55:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/705674516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.337 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.416 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.417 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.641 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.643 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11749MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:55:37 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.813 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.813 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.814 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:55:37 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:55:37 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:55:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/705674516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 28 09:55:37 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.872 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:55:37 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:55:37 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.918 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.918 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.933 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.958 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:55:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:37.993 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4058543900' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:38.426 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:55:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:38.433 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:55:38 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:38.457 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:55:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:38.460 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:55:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:38.460 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:55:38 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Removed label mon from host np0005538512.localdomain
Nov 28 09:55:38 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed label mon from host np0005538512.localdomain
Nov 28 09:55:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:38.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3157638842' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/486094489' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4058543900' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:55:38 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:55:38 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:55:38 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:55:39 np0005538513.localdomain podman[300741]: 2025-11-28 09:55:39.86275089 +0000 UTC m=+0.089131003 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:55:39 np0005538513.localdomain podman[300741]: 2025-11-28 09:55:39.876680409 +0000 UTC m=+0.103060582 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:55:39 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:55:39 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:55:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:55:39 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:55:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: Removed label mon from host np0005538512.localdomain
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:55:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:55:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:55:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:55:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:55:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18734 "" "Go-http-client/1.1"
Nov 28 09:55:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:40.471 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:40 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:55:40 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:55:40 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:55:40 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:41 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:41 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:55:41 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:55:41 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:55:41 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:55:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:55:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:55:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:42.425 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:42.507 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:42.507 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:55:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:42.508 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:55:42 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Nov 28 09:55:42 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Nov 28 09:55:42 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:55:42 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:55:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:55:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:55:42 np0005538513.localdomain podman[300764]: 2025-11-28 09:55:42.85609974 +0000 UTC m=+0.091426576 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:55:42 np0005538513.localdomain podman[300765]: 2025-11-28 09:55:42.934456482 +0000 UTC m=+0.167030803 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 28 09:55:42 np0005538513.localdomain podman[300764]: 2025-11-28 09:55:42.941523439 +0000 UTC m=+0.176850275 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:55:42 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:55:42 np0005538513.localdomain podman[300765]: 2025-11-28 09:55:42.964832807 +0000 UTC m=+0.197407178 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:55:42 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:55:43 np0005538513.localdomain ceph-mon[292954]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:43 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:55:43 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:55:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:55:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44473 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Removed label mgr from host np0005538512.localdomain
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005538512.localdomain
Nov 28 09:55:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:43.486 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:55:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:43.486 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:55:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:43.486 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:55:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:43.487 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:55:43 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:43.753 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='client.44473 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: Removed label mgr from host np0005538512.localdomain
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:44 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:44 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Removed label _admin from host np0005538512.localdomain
Nov 28 09:55:44 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005538512.localdomain
Nov 28 09:55:44 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:55:44 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:55:44 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:55:44 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:45 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:55:45 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:55:45 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:55:45 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:55:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:45.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:45 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:55:45 np0005538513.localdomain podman[300809]: 2025-11-28 09:55:45.846994483 +0000 UTC m=+0.086681669 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:55:45 np0005538513.localdomain podman[300809]: 2025-11-28 09:55:45.86150018 +0000 UTC m=+0.101187376 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:55:45 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:55:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:46.178 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:55:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:46.198 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:55:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:46.198 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: Removed label _admin from host np0005538512.localdomain
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:46 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:55:46 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:55:46 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:55:46 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:55:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:55:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:55:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:55:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:55:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:55:48 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:55:48 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:48 np0005538513.localdomain sudo[300828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:55:48 np0005538513.localdomain sudo[300828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538513.localdomain sudo[300828]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:48 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:48.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:48 np0005538513.localdomain sudo[300846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:55:48 np0005538513.localdomain sudo[300846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538513.localdomain sudo[300846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:48 np0005538513.localdomain sudo[300864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:48 np0005538513.localdomain sudo[300864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538513.localdomain sudo[300864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:48 np0005538513.localdomain sudo[300882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:48 np0005538513.localdomain sudo[300882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538513.localdomain sudo[300882]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain sudo[300900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:49 np0005538513.localdomain sudo[300900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[300900]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain sudo[300934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:49 np0005538513.localdomain sudo[300934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[300934]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538513.localdomain sudo[300952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:49 np0005538513.localdomain sudo[300952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[300952]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:49 np0005538513.localdomain sudo[300970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:55:49 np0005538513.localdomain sudo[300970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[300970]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:49 np0005538513.localdomain sudo[300988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:49 np0005538513.localdomain sudo[300988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[300988]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:49 np0005538513.localdomain sudo[301006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:49 np0005538513.localdomain sudo[301006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[301006]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain sudo[301024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:49 np0005538513.localdomain sudo[301024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[301024]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:49 np0005538513.localdomain sudo[301042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:49 np0005538513.localdomain sudo[301042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[301042]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain sudo[301060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:49 np0005538513.localdomain sudo[301060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[301060]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:55:49 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:55:49 np0005538513.localdomain sudo[301094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:49 np0005538513.localdomain sudo[301094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538513.localdomain sudo[301094]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:50 np0005538513.localdomain sudo[301112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:50 np0005538513.localdomain sudo[301112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:50 np0005538513.localdomain sudo[301112]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.025943) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750025997, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2811, "num_deletes": 255, "total_data_size": 8363170, "memory_usage": 8837936, "flush_reason": "Manual Compaction"}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750055821, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 5010009, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14885, "largest_seqno": 17691, "table_properties": {"data_size": 4998653, "index_size": 7029, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 28983, "raw_average_key_size": 22, "raw_value_size": 4973869, "raw_average_value_size": 3864, "num_data_blocks": 305, "num_entries": 1287, "num_filter_entries": 1287, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323655, "oldest_key_time": 1764323655, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 30031 microseconds, and 11462 cpu microseconds.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.055972) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 5010009 bytes OK
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.056057) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.058171) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.058196) EVENT_LOG_v1 {"time_micros": 1764323750058187, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.058223) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8349622, prev total WAL file size 8397823, number of live WAL files 2.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.060849) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4892KB)], [24(15MB)]
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750060932, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20816839, "oldest_snapshot_seqno": -1}
Nov 28 09:55:50 np0005538513.localdomain sudo[301130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:50 np0005538513.localdomain sudo[301130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:50 np0005538513.localdomain sudo[301130]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11135 keys, 17522050 bytes, temperature: kUnknown
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750191828, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 17522050, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17457469, "index_size": 35680, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 297871, "raw_average_key_size": 26, "raw_value_size": 17266507, "raw_average_value_size": 1550, "num_data_blocks": 1369, "num_entries": 11135, "num_filter_entries": 11135, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.192269) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 17522050 bytes
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.194046) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.9 rd, 133.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 15.1 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 11685, records dropped: 550 output_compression: NoCompression
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.194078) EVENT_LOG_v1 {"time_micros": 1764323750194063, "job": 12, "event": "compaction_finished", "compaction_time_micros": 131028, "compaction_time_cpu_micros": 48343, "output_level": 6, "num_output_files": 1, "total_output_size": 17522050, "num_input_records": 11685, "num_output_records": 11135, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750195320, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750197896, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.060713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198056) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198068) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198071) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev bc8e7913-9bca-40e0-98a8-75add603a1f0 (Updating mgr deployment (-1 -> 3))
Nov 28 09:55:50 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765]
Nov 28 09:55:50 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765]
Nov 28 09:55:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:50.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.569205) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750569514, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 324, "num_deletes": 253, "total_data_size": 170427, "memory_usage": 178568, "flush_reason": "Manual Compaction"}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750573221, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 112846, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17696, "largest_seqno": 18015, "table_properties": {"data_size": 110711, "index_size": 310, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4968, "raw_average_key_size": 16, "raw_value_size": 106348, "raw_average_value_size": 359, "num_data_blocks": 11, "num_entries": 296, "num_filter_entries": 296, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323750, "oldest_key_time": 1764323750, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 4066 microseconds, and 1340 cpu microseconds.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.573266) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 112846 bytes OK
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.573291) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575429) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575453) EVENT_LOG_v1 {"time_micros": 1764323750575446, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575476) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 168086, prev total WAL file size 168086, number of live WAL files 2.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.576337) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323935' seq:72057594037927935, type:22 .. '6B760031353439' seq:0, type:0; will stop at (end)
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(110KB)], [27(16MB)]
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750576484, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17634896, "oldest_snapshot_seqno": -1}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 10907 keys, 16607555 bytes, temperature: kUnknown
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750670656, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 16607555, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16545728, "index_size": 33438, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27333, "raw_key_size": 294644, "raw_average_key_size": 27, "raw_value_size": 16359919, "raw_average_value_size": 1499, "num_data_blocks": 1257, "num_entries": 10907, "num_filter_entries": 10907, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.671089) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 16607555 bytes
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.673332) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.1 rd, 176.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 16.7 +0.0 blob) out(15.8 +0.0 blob), read-write-amplify(303.4) write-amplify(147.2) OK, records in: 11431, records dropped: 524 output_compression: NoCompression
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.673401) EVENT_LOG_v1 {"time_micros": 1764323750673375, "job": 14, "event": "compaction_finished", "compaction_time_micros": 94274, "compaction_time_cpu_micros": 51841, "output_level": 6, "num_output_files": 1, "total_output_size": 16607555, "num_input_records": 11431, "num_output_records": 10907, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750673760, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750677957, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.576241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678132) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:55:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:55:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:55:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:55:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:55:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:55:51 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:51 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:51 np0005538513.localdomain ceph-mon[292954]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:51 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:52 np0005538513.localdomain ceph-mon[292954]: Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765]
Nov 28 09:55:52 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.np0005538512.zyhkxs
Nov 28 09:55:52 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing key for mgr.np0005538512.zyhkxs
Nov 28 09:55:52 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev bc8e7913-9bca-40e0-98a8-75add603a1f0 (Updating mgr deployment (-1 -> 3))
Nov 28 09:55:52 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event bc8e7913-9bca-40e0-98a8-75add603a1f0 (Updating mgr deployment (-1 -> 3)) in 2 seconds
Nov 28 09:55:52 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev 30a28b45-2802-4295-ba57-d14320d63229 (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:55:52 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev 30a28b45-2802-4295-ba57-d14320d63229 (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:55:52 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event 30a28b45-2802-4295-ba57-d14320d63229 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 28 09:55:52 np0005538513.localdomain sudo[301148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:55:52 np0005538513.localdomain sudo[301148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:52 np0005538513.localdomain sudo[301148]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:53 np0005538513.localdomain ceph-mon[292954]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:53 np0005538513.localdomain ceph-mon[292954]: Removing key for mgr.np0005538512.zyhkxs
Nov 28 09:55:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"} : dispatch
Nov 28 09:55:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"}]': finished
Nov 28 09:55:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:53 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:55:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:53.782 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:55:53 np0005538513.localdomain systemd[1]: tmp-crun.wWQj3p.mount: Deactivated successfully.
Nov 28 09:55:53 np0005538513.localdomain podman[301166]: 2025-11-28 09:55:53.891172074 +0000 UTC m=+0.097315376 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:55:53 np0005538513.localdomain podman[301166]: 2025-11-28 09:55:53.899870632 +0000 UTC m=+0.106013954 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:55:53 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:55:53 np0005538513.localdomain podman[301167]: 2025-11-28 09:55:53.99825941 +0000 UTC m=+0.200483741 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 28 09:55:54 np0005538513.localdomain podman[301167]: 2025-11-28 09:55:54.013348035 +0000 UTC m=+0.215572386 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:55:54 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev ead8fbf6-91dc-4ccd-979c-c7502b02c89e (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev ead8fbf6-91dc-4ccd-979c-c7502b02c89e (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event ead8fbf6-91dc-4ccd-979c-c7502b02c89e (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 28 09:55:54 np0005538513.localdomain sudo[301208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:55:54 np0005538513.localdomain sudo[301208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:54 np0005538513.localdomain sudo[301208]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:54 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:55 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:55 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:55 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:55 np0005538513.localdomain sudo[301226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:55 np0005538513.localdomain sudo[301226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:55 np0005538513.localdomain sudo[301226]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:55.541 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:55 np0005538513.localdomain sudo[301244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:55 np0005538513.localdomain sudo[301244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:55 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:56 np0005538513.localdomain podman[301278]: 
Nov 28 09:55:56 np0005538513.localdomain podman[301278]: 2025-11-28 09:55:56.114240042 +0000 UTC m=+0.082646145 container create fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:55:56 np0005538513.localdomain systemd[1]: Started libpod-conmon-fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61.scope.
Nov 28 09:55:56 np0005538513.localdomain podman[301278]: 2025-11-28 09:55:56.081964058 +0000 UTC m=+0.050370221 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:56 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:56 np0005538513.localdomain podman[301278]: 2025-11-28 09:55:56.203383746 +0000 UTC m=+0.171789839 container init fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True)
Nov 28 09:55:56 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:56 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:56 np0005538513.localdomain podman[301278]: 2025-11-28 09:55:56.217256482 +0000 UTC m=+0.185662576 container start fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, vcs-type=git, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12, ceph=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:55:56 np0005538513.localdomain podman[301278]: 2025-11-28 09:55:56.217541692 +0000 UTC m=+0.185947835 container attach fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, version=7, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12)
Nov 28 09:55:56 np0005538513.localdomain brave_chebyshev[301293]: 167 167
Nov 28 09:55:56 np0005538513.localdomain systemd[1]: libpod-fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61.scope: Deactivated successfully.
Nov 28 09:55:56 np0005538513.localdomain podman[301278]: 2025-11-28 09:55:56.223209977 +0000 UTC m=+0.191616120 container died fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, vcs-type=git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, release=553, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.34545 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538512.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Added label _no_schedule to host np0005538512.localdomain
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005538512.localdomain
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain
Nov 28 09:55:56 np0005538513.localdomain podman[301298]: 2025-11-28 09:55:56.339203047 +0000 UTC m=+0.101542527 container remove fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, release=553, name=rhceph, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main)
Nov 28 09:55:56 np0005538513.localdomain systemd[1]: libpod-conmon-fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61.scope: Deactivated successfully.
Nov 28 09:55:56 np0005538513.localdomain sudo[301244]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:56 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:56 np0005538513.localdomain sudo[301316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:56 np0005538513.localdomain sudo[301316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:56 np0005538513.localdomain sudo[301316]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:56 np0005538513.localdomain sudo[301334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:56 np0005538513.localdomain sudo[301334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:57 np0005538513.localdomain podman[301368]: 
Nov 28 09:55:57 np0005538513.localdomain podman[301368]: 2025-11-28 09:55:57.106173145 +0000 UTC m=+0.081938113 container create 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, RELEASE=main, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main)
Nov 28 09:55:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fc5b0ac95bb28b6f007cdcdd9a002e54754d51b240592b901437450ee73c8796-merged.mount: Deactivated successfully.
Nov 28 09:55:57 np0005538513.localdomain systemd[1]: Started libpod-conmon-73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4.scope.
Nov 28 09:55:57 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:57 np0005538513.localdomain podman[301368]: 2025-11-28 09:55:57.073109027 +0000 UTC m=+0.048874045 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:57 np0005538513.localdomain podman[301368]: 2025-11-28 09:55:57.172515586 +0000 UTC m=+0.148280554 container init 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55)
Nov 28 09:55:57 np0005538513.localdomain podman[301368]: 2025-11-28 09:55:57.182905956 +0000 UTC m=+0.158670934 container start 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:55:57 np0005538513.localdomain podman[301368]: 2025-11-28 09:55:57.183255777 +0000 UTC m=+0.159020795 container attach 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:55:57 np0005538513.localdomain busy_proskuriakova[301383]: 167 167
Nov 28 09:55:57 np0005538513.localdomain systemd[1]: libpod-73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4.scope: Deactivated successfully.
Nov 28 09:55:57 np0005538513.localdomain podman[301368]: 2025-11-28 09:55:57.186220468 +0000 UTC m=+0.161985456 container died 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, ceph=True)
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:55:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:57 np0005538513.localdomain podman[301388]: 2025-11-28 09:55:57.294477231 +0000 UTC m=+0.096069108 container remove 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True)
Nov 28 09:55:57 np0005538513.localdomain systemd[1]: libpod-conmon-73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4.scope: Deactivated successfully.
Nov 28 09:55:57 np0005538513.localdomain sudo[301334]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:57 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:57 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:57 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:57 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:57 np0005538513.localdomain sudo[301412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:57 np0005538513.localdomain sudo[301412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:57 np0005538513.localdomain sudo[301412]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:57 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:57 np0005538513.localdomain sudo[301430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:57 np0005538513.localdomain sudo[301430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:57 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538512.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7595fdbdc5a323040c88f7fd2be76a1f62201296c430db9c745db9b784df7622-merged.mount: Deactivated successfully.
Nov 28 09:55:58 np0005538513.localdomain podman[301464]: 
Nov 28 09:55:58 np0005538513.localdomain podman[301464]: 2025-11-28 09:55:58.224421306 +0000 UTC m=+0.084648427 container create 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-type=git)
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: from='client.34545 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538512.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: Added label _no_schedule to host np0005538512.localdomain
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:55:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:58 np0005538513.localdomain systemd[1]: Started libpod-conmon-96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a.scope.
Nov 28 09:55:58 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:58 np0005538513.localdomain podman[301464]: 2025-11-28 09:55:58.188553891 +0000 UTC m=+0.048781042 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:58 np0005538513.localdomain podman[301464]: 2025-11-28 09:55:58.297828975 +0000 UTC m=+0.158056086 container init 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553)
Nov 28 09:55:58 np0005538513.localdomain podman[301464]: 2025-11-28 09:55:58.306548183 +0000 UTC m=+0.166775304 container start 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True)
Nov 28 09:55:58 np0005538513.localdomain podman[301464]: 2025-11-28 09:55:58.308913196 +0000 UTC m=+0.169140317 container attach 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.expose-services=, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:55:58 np0005538513.localdomain gracious_gauss[301479]: 167 167
Nov 28 09:55:58 np0005538513.localdomain systemd[1]: libpod-96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a.scope: Deactivated successfully.
Nov 28 09:55:58 np0005538513.localdomain podman[301464]: 2025-11-28 09:55:58.315173409 +0000 UTC m=+0.175400550 container died 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, vcs-type=git, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.component=rhceph-container)
Nov 28 09:55:58 np0005538513.localdomain podman[301484]: 2025-11-28 09:55:58.415951191 +0000 UTC m=+0.086968468 container remove 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True)
Nov 28 09:55:58 np0005538513.localdomain systemd[1]: libpod-conmon-96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a.scope: Deactivated successfully.
Nov 28 09:55:58 np0005538513.localdomain sudo[301430]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:58 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:55:58 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:55:58 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:55:58 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:55:58 np0005538513.localdomain sudo[301507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:58 np0005538513.localdomain sudo[301507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:58 np0005538513.localdomain sudo[301507]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:55:58.785 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:55:58 np0005538513.localdomain sudo[301525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:58 np0005538513.localdomain sudo[301525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-231a4da403ed17711b8b8d29e538ebab04bb8a7a203305a4da0c06f7368039e2-merged.mount: Deactivated successfully.
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538512.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Removed host np0005538512.localdomain
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed host np0005538512.localdomain
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538512.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"} : dispatch
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"}]': finished
Nov 28 09:55:59 np0005538513.localdomain podman[301560]: 
Nov 28 09:55:59 np0005538513.localdomain podman[301560]: 2025-11-28 09:55:59.351770586 +0000 UTC m=+0.082601473 container create 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, ceph=True, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=)
Nov 28 09:55:59 np0005538513.localdomain systemd[1]: Started libpod-conmon-393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca.scope.
Nov 28 09:55:59 np0005538513.localdomain systemd[1]: tmp-crun.2t8p8J.mount: Deactivated successfully.
Nov 28 09:55:59 np0005538513.localdomain podman[301560]: 2025-11-28 09:55:59.317540083 +0000 UTC m=+0.048371010 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:59 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:59 np0005538513.localdomain podman[301560]: 2025-11-28 09:55:59.447637588 +0000 UTC m=+0.178468485 container init 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, GIT_CLEAN=True, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:55:59 np0005538513.localdomain keen_gould[301576]: 167 167
Nov 28 09:55:59 np0005538513.localdomain podman[301560]: 2025-11-28 09:55:59.462111783 +0000 UTC m=+0.192942680 container start 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container)
Nov 28 09:55:59 np0005538513.localdomain podman[301560]: 2025-11-28 09:55:59.462546916 +0000 UTC m=+0.193377833 container attach 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:55:59 np0005538513.localdomain systemd[1]: libpod-393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca.scope: Deactivated successfully.
Nov 28 09:55:59 np0005538513.localdomain podman[301560]: 2025-11-28 09:55:59.465556559 +0000 UTC m=+0.196387496 container died 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64)
Nov 28 09:55:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:59 np0005538513.localdomain podman[301581]: 2025-11-28 09:55:59.564939688 +0000 UTC m=+0.091514228 container remove 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:55:59 np0005538513.localdomain systemd[1]: libpod-conmon-393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca.scope: Deactivated successfully.
Nov 28 09:55:59 np0005538513.localdomain sudo[301525]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:55:59 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:59 np0005538513.localdomain sudo[301598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:59 np0005538513.localdomain sudo[301598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:59 np0005538513.localdomain sudo[301598]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:59 np0005538513.localdomain sudo[301616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:59 np0005538513.localdomain sudo[301616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f6da1730271c3af938e9920c5df8939128f7e8ba8af02d0cf3c55ccaa74fd8a5-merged.mount: Deactivated successfully.
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538512.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: Removed host np0005538512.localdomain
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:00 np0005538513.localdomain podman[301653]: 
Nov 28 09:56:00 np0005538513.localdomain podman[301653]: 2025-11-28 09:56:00.290941945 +0000 UTC m=+0.082654305 container create 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container)
Nov 28 09:56:00 np0005538513.localdomain systemd[1]: Started libpod-conmon-35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c.scope.
Nov 28 09:56:00 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:00 np0005538513.localdomain podman[301653]: 2025-11-28 09:56:00.355913485 +0000 UTC m=+0.147625835 container init 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:56:00 np0005538513.localdomain podman[301653]: 2025-11-28 09:56:00.258372103 +0000 UTC m=+0.050084493 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:00 np0005538513.localdomain podman[301653]: 2025-11-28 09:56:00.367116 +0000 UTC m=+0.158828350 container start 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:56:00 np0005538513.localdomain podman[301653]: 2025-11-28 09:56:00.36745741 +0000 UTC m=+0.159169820 container attach 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Nov 28 09:56:00 np0005538513.localdomain reverent_keldysh[301668]: 167 167
Nov 28 09:56:00 np0005538513.localdomain systemd[1]: libpod-35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c.scope: Deactivated successfully.
Nov 28 09:56:00 np0005538513.localdomain podman[301653]: 2025-11-28 09:56:00.370389801 +0000 UTC m=+0.162102151 container died 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 09:56:00 np0005538513.localdomain podman[301673]: 2025-11-28 09:56:00.470623946 +0000 UTC m=+0.087801444 container remove 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_CLEAN=True, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64)
Nov 28 09:56:00 np0005538513.localdomain systemd[1]: libpod-conmon-35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c.scope: Deactivated successfully.
Nov 28 09:56:00 np0005538513.localdomain sudo[301616]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:00.545 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:00 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:56:00 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:56:00 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:56:00 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:56:00 np0005538513.localdomain sudo[301691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:00 np0005538513.localdomain sudo[301691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.674 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain sudo[301691]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.683 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8672ca8-2bfe-4198-9b89-b82400139bfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.675771', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71bfa2aa-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': '1381f9ee9f180cd518866b21ccaa0d264db0a2692c9b27d1322bd221b1f77923'}]}, 'timestamp': '2025-11-28 09:56:00.684765', '_unique_id': 'c5a52d0288ab4bb3bcb37a5671f66689'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.689 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29d69ffb-fa2e-4af3-be4a-21d4c790122e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.689248', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c06db6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'd877a18a41cacbe22023d32ff1bb72cd24812deafb604f217c46245dcdf4cbb6'}]}, 'timestamp': '2025-11-28 09:56:00.689777', '_unique_id': '2c8bdea2785b4c399e8c503846e12e13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.691 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.692 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.692 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '079acd19-ad03-4e54-921e-f51594fa81c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.692265', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c0e7be-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'bb4fc6737fc1453c80827ae7341f3995c4ea246ce72816ff8c4dbf3c011639f0'}]}, 'timestamp': '2025-11-28 09:56:00.693005', '_unique_id': '9789b1e25af94b1bb0dbe6db75bcc569'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.696 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '417b97e0-0b20-4e0e-86f9-f7e3e6336127', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.696359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71c6212a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '719c0ac34d0afc92609cc2cbd84945bcbe27d9515a5ea20bf3d7d2f40d1be22b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.696359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71c636ba-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '04b91b41dfd6bdf8bdd3b742a2dd9242cf6df20cdba3cbb10f346bcbe65ea8f5'}]}, 'timestamp': '2025-11-28 09:56:00.727640', '_unique_id': 'ed07a52a8f2646db89a86e17e6a7b089'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.730 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.730 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.730 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd889f451-bac0-450c-89e0-6f545fb2be7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.730579', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c6bb8a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'b8c04f156447b41295c26343a9d6583672163fbe5e31f92b53c942813b74c64b'}]}, 'timestamp': '2025-11-28 09:56:00.731114', '_unique_id': 'b944c1ea84984c31b64c24d1ee9dd86a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b73822c3-7f2a-4922-82e9-6ce4c7947fb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.733332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71c727a0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '0c2599de0a48f639bce76d2f57d0d000993c3cc307a1424b5ce32e2a3666284a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.733332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71c73a06-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '90bc81385ed49376ef3d62f2367f7fb61a6c05a54bb0f744b0d46ce153e4ae55'}]}, 'timestamp': '2025-11-28 09:56:00.734278', '_unique_id': '8e5113de4edd4c91b588721dc801d44e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5db8afb2-bce6-4c5b-b750-92b37e331113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.736519', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c7a374-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'bf5c974cd34344924ed5e530b6efa4cc2c6007a8d9cc85b634d0e5bc9c34d5c4'}]}, 'timestamp': '2025-11-28 09:56:00.736989', '_unique_id': '2707bb6ff99342dbbce68374ecdc3941'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62690b0e-7d47-4602-a073-dd20c49003c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.740054', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c83136-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'bb20c579a59ac6a6ffd3c54f9f47f8ff5a70e87358b236791686139bd0f7d0f6'}]}, 'timestamp': '2025-11-28 09:56:00.740699', '_unique_id': '08d4bfb399b144f7a6b4a7f28365dee4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.742 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.743 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec9249ce-6ed4-4cb5-a49d-3b9f0ad816fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.742955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71c8a0d0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': 'b9188261aef4f7dd577ddd89c9d9eb62a16eac2a2f88dba7f540f9cd2acf7e55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.742955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71c8b19c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '4e54b3469ae75115f7e953105eb6bc7e0f72f264e609fb1c9cb1b4a7a9093e3a'}]}, 'timestamp': '2025-11-28 09:56:00.743877', '_unique_id': 'becb27e6c51e470dbaf7c444f2eeaad9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ecae124-d048-44e5-882e-2415693e5b92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.746257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71cac69e-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': 'd49987c2714f60e0930352bd31ce754bbbce5c56721c2ef017ec58eb4f6e84be'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.746257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71cadbc0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '4a36a4ce94a7ad8f121ce1fb38357e3a76e88a2200ecc8ce04cee1d4ea183d5d'}]}, 'timestamp': '2025-11-28 09:56:00.758165', '_unique_id': '72b10f638c1a45e09a9a822f3247f7db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51af6fdc-5ac6-4881-b28b-4b5b048fe786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.760642', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71cb5172-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '4ce3f194f48669f1a5a68183c3ace784d3498f93841b6007be7a695289136163'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.760642', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71cb62d4-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '6a32f62f17747bb01df235329e3ce777a5593990cec3d0675bc30478996f6d8f'}]}, 'timestamp': '2025-11-28 09:56:00.761520', '_unique_id': '1e6a25cefda94bc4b59a478cf721c06d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain sudo[301709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain sudo[301709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3c51377-1839-4abb-b441-28093050ea6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.763973', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71cbd642-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': '14ffe9ef11f139315dcdf8b59cc85386e8d6132cd6ac638fe64e8c22efc2f9ea'}]}, 'timestamp': '2025-11-28 09:56:00.764509', '_unique_id': 'd912833165714c62a41f3e98b4f8f685'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6296e3b3-a986-45a1-8f07-1339e63ca2eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.767077', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71cc4d02-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'e917e7838491bc7e98c3bbd88d1feb36fdbd001a1a404ecf748341caba777e5d'}]}, 'timestamp': '2025-11-28 09:56:00.767542', '_unique_id': '20a445f1c6ba4f75a453f441b34824ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 13960000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faf90230-45c5-45a2-a5c7-2823656838de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13960000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:56:00.769731', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '71cf2626-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.957483872, 'message_signature': '6db3c35543e5031b2b5a030bc7e5907132a17a5b2824cf2a335b2868b6b3bff8'}]}, 'timestamp': '2025-11-28 09:56:00.786239', '_unique_id': '283f352c3e7440b09ad933fcc347c3f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50f9bf05-2f28-4669-b916-f9aa6ba2f0b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.788474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71cf9066-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': 'ffc7d3f51095009cfa33acede44b014126490cf3945a353191ea033d12a793ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.788474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71cfa1aa-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '1f622b0f048745fad78b18fe5a6d4f4d1d3d8931cd016bb8c38aa72eafbe2b89'}]}, 'timestamp': '2025-11-28 09:56:00.789342', '_unique_id': '36615011e02b47309400d682c242a77c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43a82553-8ff4-4308-aaf0-e4c161faf420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.791504', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71d006fe-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'eb42d5ffe3952a9ee5665c9e0865e65323c16315392a176a23b20510da09f738'}]}, 'timestamp': '2025-11-28 09:56:00.791964', '_unique_id': '76699a81515a430d8c163dba9492af35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38a076a1-e3e4-498c-92b2-f0130f60f19e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.794179', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71d06f72-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': '63c2657b9ab1e2a13dc2e3a3026638de721fb4c9978934cb3b56f74cab8a809f'}]}, 'timestamp': '2025-11-28 09:56:00.794637', '_unique_id': '645898cde0c84ee8b9fd4d304b33fb60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.796 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.796 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22d9c47e-d5f4-4208-b28b-bde2597ffd6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.796983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71d0df98-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '431cabc1d3fd772916ce6a81660f9ea2b36e9b5d97f5eaffb1d8d421696353db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.796983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71d0f104-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '166e816bf70904594a8e7699048691dcb3537081a44bf5a860f7a04beb921389'}]}, 'timestamp': '2025-11-28 09:56:00.797928', '_unique_id': '348c65bce4dc4192bfe66886fd49981f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.800 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3335da16-7e15-45f5-b812-030d3b9eac5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:56:00.800363', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '71d160ee-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.957483872, 'message_signature': 'ddc195e1da2d4fa07bb8b005113bcf6b069578c09e992ed8a94b53fc964eb774'}]}, 'timestamp': '2025-11-28 09:56:00.800802', '_unique_id': '9ecad9bcdbb1495983b8733c3d60996b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b792b580-fce9-4139-a21d-779f7f719d14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.802899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71d1c55c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': 'dc1cf2736cae55cee781181150ecfa36221b8bb47e56304a956b0abfe63f6f76'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.802899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71d1d8f8-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '30b7315e936d100a6e962ad6d4a2bb6dc2823aa830c7b20d106cb19f15f2bb57'}]}, 'timestamp': '2025-11-28 09:56:00.803871', '_unique_id': '3e67fb70af3645abaa36ac38b98d9aa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.805 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.806 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.806 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5d36ac1-5cd9-482b-be3e-1f523eac02d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.806111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71d24162-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '3e09a921ae54408ce936355e7a70f2fd17191780d9f56b52928b41fa8964b32d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.806111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71d25166-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '3e915869eb93bbde8a31e301c83b22406fdc514cc0c7360ece9abf4351fb4e78'}]}, 'timestamp': '2025-11-28 09:56:00.806945', '_unique_id': '07dcdbac0b1f40fc8b1a7833c411f128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:56:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:56:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bdd1a8ec4da179b12eb78cb1acf0e6f146d77d795937468857b0b018f40afe66-merged.mount: Deactivated successfully.
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:01 np0005538513.localdomain podman[301744]: 
Nov 28 09:56:01 np0005538513.localdomain podman[301744]: 2025-11-28 09:56:01.304000078 +0000 UTC m=+0.084892254 container create d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, version=7, ceph=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55)
Nov 28 09:56:01 np0005538513.localdomain systemd[1]: Started libpod-conmon-d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1.scope.
Nov 28 09:56:01 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:01 np0005538513.localdomain podman[301744]: 2025-11-28 09:56:01.271755256 +0000 UTC m=+0.052647472 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:01 np0005538513.localdomain podman[301744]: 2025-11-28 09:56:01.376713166 +0000 UTC m=+0.157605362 container init d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55)
Nov 28 09:56:01 np0005538513.localdomain podman[301744]: 2025-11-28 09:56:01.384915298 +0000 UTC m=+0.165807494 container start d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:56:01 np0005538513.localdomain podman[301744]: 2025-11-28 09:56:01.385445775 +0000 UTC m=+0.166337981 container attach d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, architecture=x86_64, build-date=2025-09-24T08:57:55, release=553, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, GIT_CLEAN=True, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:56:01 np0005538513.localdomain goofy_diffie[301760]: 167 167
Nov 28 09:56:01 np0005538513.localdomain systemd[1]: libpod-d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1.scope: Deactivated successfully.
Nov 28 09:56:01 np0005538513.localdomain podman[301744]: 2025-11-28 09:56:01.396397122 +0000 UTC m=+0.177289338 container died d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, RELEASE=main, release=553, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git)
Nov 28 09:56:01 np0005538513.localdomain podman[301765]: 2025-11-28 09:56:01.509724631 +0000 UTC m=+0.108426119 container remove d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git)
Nov 28 09:56:01 np0005538513.localdomain systemd[1]: libpod-conmon-d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1.scope: Deactivated successfully.
Nov 28 09:56:01 np0005538513.localdomain sudo[301709]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:01 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev c06c889e-5b25-4013-b3a0-99a9e621a0d9 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:01 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev c06c889e-5b25-4013-b3a0-99a9e621a0d9 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:01 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event c06c889e-5b25-4013-b3a0-99a9e621a0d9 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 09:56:01 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:01 np0005538513.localdomain sudo[301781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:01 np0005538513.localdomain sudo[301781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:01 np0005538513.localdomain sudo[301781]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:02 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ffd3a3fc7af74570d5ddb6c58d75a1dcc063971da9c291d0dfb9cd81e848f7aa-merged.mount: Deactivated successfully.
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:03 np0005538513.localdomain ceph-mon[292954]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:03 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:03.824 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:04 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 09:56:05 np0005538513.localdomain ceph-mon[292954]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:56:05 np0005538513.localdomain podman[301799]: 2025-11-28 09:56:05.489572654 +0000 UTC m=+0.083915874 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 28 09:56:05 np0005538513.localdomain podman[301799]: 2025-11-28 09:56:05.505458213 +0000 UTC m=+0.099801433 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Nov 28 09:56:05 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:56:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:05.576 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:05 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:06 np0005538513.localdomain ceph-mon[292954]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:07 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44494 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:07 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 28 09:56:07 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 28 09:56:07 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev 18aba204-dae4-489c-a558-bf8f9a9db726 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:07 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev 18aba204-dae4-489c-a558-bf8f9a9db726 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:07 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event 18aba204-dae4-489c-a558-bf8f9a9db726 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 09:56:07 np0005538513.localdomain sudo[301820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:07 np0005538513.localdomain sudo[301820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:07 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:07 np0005538513.localdomain sudo[301820]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: from='client.44494 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:08 np0005538513.localdomain ceph-mon[292954]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:08.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:09 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44500 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:09 np0005538513.localdomain ceph-mon[292954]: from='client.44500 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:09 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:56:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:56:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:56:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:56:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:56:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18735 "" "Go-http-client/1.1"
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 09:56:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:10.578 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44506 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538515"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Remove daemons mon.np0005538515
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005538515
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005538515: new quorum should be ['np0005538513', 'np0005538514'] (from ['np0005538513', 'np0005538514'])
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005538515: new quorum should be ['np0005538513', 'np0005538514'] (from ['np0005538513', 'np0005538514'])
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005538515 from monmap...
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing monitor np0005538515 from monmap...
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005538515 from np0005538515.localdomain -- ports []
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005538515 from np0005538515.localdomain -- ports []
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@1(peon) e14  my rank is now 0 (was 1)
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: client.44410 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: paxos.0).electionLogic(48) init, last seen epoch 48
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : monmap epoch 14
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : last_changed 2025-11-28T09:56:10.676143+0000
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : election_strategy: 1
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e34: np0005538513.dsfdlx(active, since 51s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 28 09:56:10 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: monmap epoch 14
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:56:10.676143+0000
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mgrmap e34: np0005538513.dsfdlx(active, since 51s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:56:10 np0005538513.localdomain podman[301838]: 2025-11-28 09:56:10.85406511 +0000 UTC m=+0.086086130 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:56:10 np0005538513.localdomain ceph-mds[282744]: --2- [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x55dac314f800 0x55dac3506000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: --2- 172.18.0.106:0/2775473572 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9b6e2400 0x560b9ae07700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: --2- 172.18.0.106:0/3621695456 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9b6e3800 0x560b9b75f080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: --2- 172.18.0.106:0/3109361859 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9b920400 0x560b9b918000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: --2- 172.18.0.106:0/4290692976 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9c7edc00 0x560b9ae04b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: client.27136 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 28 09:56:10 np0005538513.localdomain podman[301838]: 2025-11-28 09:56:10.89108926 +0000 UTC m=+0.123110340 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:10 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:56:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:10 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:10 np0005538513.localdomain sudo[301863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:10 np0005538513.localdomain sudo[301863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:10 np0005538513.localdomain sudo[301863]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[301881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:11 np0005538513.localdomain sudo[301881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[301881]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[301899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538513.localdomain sudo[301899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[301899]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[301917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:11 np0005538513.localdomain sudo[301917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[301917]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[301935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538513.localdomain sudo[301935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[301935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[301969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538513.localdomain sudo[301969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[301969]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[301987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538513.localdomain sudo[301987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[301987]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain sudo[302005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain sudo[302005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[302005]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain sudo[302023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:11 np0005538513.localdomain sudo[302023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[302023]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:11 np0005538513.localdomain sudo[302041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:11 np0005538513.localdomain sudo[302041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[302041]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: Remove daemons mon.np0005538515
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: Safe to remove mon.np0005538515: new quorum should be ['np0005538513', 'np0005538514'] (from ['np0005538513', 'np0005538514'])
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: Removing monitor np0005538515 from monmap...
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: Removing daemon mon.np0005538515 from np0005538515.localdomain -- ports []
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:11 np0005538513.localdomain sudo[302059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:11 np0005538513.localdomain sudo[302059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[302059]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[302077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:11 np0005538513.localdomain sudo[302077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[302077]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538513.localdomain sudo[302095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:11 np0005538513.localdomain sudo[302095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538513.localdomain sudo[302095]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538513.localdomain sudo[302129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:12 np0005538513.localdomain sudo[302129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538513.localdomain sudo[302129]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538513.localdomain sudo[302147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:12 np0005538513.localdomain sudo[302147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538513.localdomain sudo[302147]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:12 np0005538513.localdomain sudo[302165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:12 np0005538513.localdomain sudo[302165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538513.localdomain sudo[302165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:12 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev 53cd24d9-055a-4195-99ca-ff907803a144 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:12 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev 53cd24d9-055a-4195-99ca-ff907803a144 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:12 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event 53cd24d9-055a-4195-99ca-ff907803a144 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:12 np0005538513.localdomain sudo[302183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:12 np0005538513.localdomain sudo[302183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538513.localdomain sudo[302183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:12 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:12 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:12 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:12 np0005538513.localdomain sudo[302201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:12 np0005538513.localdomain sudo[302201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538513.localdomain sudo[302201]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538513.localdomain sudo[302219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:12 np0005538513.localdomain sudo[302219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:56:13 np0005538513.localdomain podman[302237]: 2025-11-28 09:56:13.053207171 +0000 UTC m=+0.064024582 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 09:56:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:56:13 np0005538513.localdomain podman[302237]: 2025-11-28 09:56:13.125386823 +0000 UTC m=+0.136204204 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:56:13 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:56:13 np0005538513.localdomain podman[302256]: 2025-11-28 09:56:13.178851349 +0000 UTC m=+0.106743357 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:56:13 np0005538513.localdomain podman[302256]: 2025-11-28 09:56:13.216537999 +0000 UTC m=+0.144429987 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 09:56:13 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:56:13 np0005538513.localdomain podman[302296]: 
Nov 28 09:56:13 np0005538513.localdomain podman[302296]: 2025-11-28 09:56:13.500323714 +0000 UTC m=+0.082902823 container create 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-09-24T08:57:55, RELEASE=main, name=rhceph, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=)
Nov 28 09:56:13 np0005538513.localdomain systemd[1]: Started libpod-conmon-66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e.scope.
Nov 28 09:56:13 np0005538513.localdomain podman[302296]: 2025-11-28 09:56:13.463627475 +0000 UTC m=+0.046206644 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:13 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:13 np0005538513.localdomain podman[302296]: 2025-11-28 09:56:13.578939674 +0000 UTC m=+0.161518793 container init 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, release=553, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.)
Nov 28 09:56:13 np0005538513.localdomain podman[302296]: 2025-11-28 09:56:13.59049057 +0000 UTC m=+0.173069689 container start 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git)
Nov 28 09:56:13 np0005538513.localdomain podman[302296]: 2025-11-28 09:56:13.591551312 +0000 UTC m=+0.174130421 container attach 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, distribution-scope=public, RELEASE=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main)
Nov 28 09:56:13 np0005538513.localdomain cool_neumann[302312]: 167 167
Nov 28 09:56:13 np0005538513.localdomain systemd[1]: libpod-66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e.scope: Deactivated successfully.
Nov 28 09:56:13 np0005538513.localdomain podman[302296]: 2025-11-28 09:56:13.597330251 +0000 UTC m=+0.179909360 container died 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:56:13 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:13 np0005538513.localdomain podman[302317]: 2025-11-28 09:56:13.710283877 +0000 UTC m=+0.100873546 container remove 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:56:13 np0005538513.localdomain systemd[1]: libpod-conmon-66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e.scope: Deactivated successfully.
Nov 28 09:56:13 np0005538513.localdomain sudo[302219]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:13 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:13 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:13 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:13 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:13.829 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:13 np0005538513.localdomain sudo[302333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:13 np0005538513.localdomain sudo[302333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:13 np0005538513.localdomain sudo[302333]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:13 np0005538513.localdomain sudo[302351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:13 np0005538513.localdomain sudo[302351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-91eb1e456f81d915cfcfd114f3e7915de83d6ef8b718cba73e5a296374edd668-merged.mount: Deactivated successfully.
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:14 np0005538513.localdomain podman[302385]: 
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:14 np0005538513.localdomain podman[302385]: 2025-11-28 09:56:14.492583377 +0000 UTC m=+0.082728017 container create 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git)
Nov 28 09:56:14 np0005538513.localdomain systemd[1]: Started libpod-conmon-9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae.scope.
Nov 28 09:56:14 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:14 np0005538513.localdomain podman[302385]: 2025-11-28 09:56:14.457159476 +0000 UTC m=+0.047304116 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:14 np0005538513.localdomain podman[302385]: 2025-11-28 09:56:14.561442537 +0000 UTC m=+0.151587177 container init 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git)
Nov 28 09:56:14 np0005538513.localdomain podman[302385]: 2025-11-28 09:56:14.571446924 +0000 UTC m=+0.161591564 container start 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, io.buildah.version=1.33.12, name=rhceph, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:56:14 np0005538513.localdomain podman[302385]: 2025-11-28 09:56:14.571880597 +0000 UTC m=+0.162025237 container attach 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7)
Nov 28 09:56:14 np0005538513.localdomain loving_tu[302400]: 167 167
Nov 28 09:56:14 np0005538513.localdomain systemd[1]: libpod-9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae.scope: Deactivated successfully.
Nov 28 09:56:14 np0005538513.localdomain podman[302385]: 2025-11-28 09:56:14.574825478 +0000 UTC m=+0.164970148 container died 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, name=rhceph, architecture=x86_64, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git)
Nov 28 09:56:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:14.589 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:14.615 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 09:56:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:14.616 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:56:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:14.617 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:56:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:14.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:56:14 np0005538513.localdomain podman[302405]: 2025-11-28 09:56:14.684465184 +0000 UTC m=+0.098147092 container remove 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, distribution-scope=public, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:56:14 np0005538513.localdomain systemd[1]: libpod-conmon-9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae.scope: Deactivated successfully.
Nov 28 09:56:14 np0005538513.localdomain sudo[302351]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:14 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:14 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:14 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:14 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:14 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:15 np0005538513.localdomain sudo[302428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:15 np0005538513.localdomain sudo[302428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:15 np0005538513.localdomain sudo[302428]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-589615937de90216bb8fad5f08e8ce8996473ff2ce29c17f87e90c91023b1000-merged.mount: Deactivated successfully.
Nov 28 09:56:15 np0005538513.localdomain sudo[302446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:15 np0005538513.localdomain sudo[302446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:15 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:15 np0005538513.localdomain podman[302480]: 
Nov 28 09:56:15 np0005538513.localdomain podman[302480]: 2025-11-28 09:56:15.524664426 +0000 UTC m=+0.081399868 container create 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, io.buildah.version=1.33.12, release=553, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=)
Nov 28 09:56:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5.scope.
Nov 28 09:56:15 np0005538513.localdomain podman[302480]: 2025-11-28 09:56:15.491565157 +0000 UTC m=+0.048300639 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:15.619 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:15 np0005538513.localdomain podman[302480]: 2025-11-28 09:56:15.636361073 +0000 UTC m=+0.193096485 container init 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True)
Nov 28 09:56:15 np0005538513.localdomain podman[302480]: 2025-11-28 09:56:15.64956897 +0000 UTC m=+0.206304402 container start 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container)
Nov 28 09:56:15 np0005538513.localdomain gracious_neumann[302495]: 167 167
Nov 28 09:56:15 np0005538513.localdomain podman[302480]: 2025-11-28 09:56:15.652580233 +0000 UTC m=+0.209315705 container attach 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git)
Nov 28 09:56:15 np0005538513.localdomain systemd[1]: libpod-834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5.scope: Deactivated successfully.
Nov 28 09:56:15 np0005538513.localdomain podman[302480]: 2025-11-28 09:56:15.655228174 +0000 UTC m=+0.211963636 container died 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7)
Nov 28 09:56:15 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:15 np0005538513.localdomain podman[302500]: 2025-11-28 09:56:15.757709959 +0000 UTC m=+0.092335153 container remove 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git)
Nov 28 09:56:15 np0005538513.localdomain systemd[1]: libpod-conmon-834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5.scope: Deactivated successfully.
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:15 np0005538513.localdomain sudo[302446]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:15 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:15 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:15 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:15 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: tmp-crun.1KaqtF.mount: Deactivated successfully.
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-21bda7b6591885f7cd0085a1ae2256d09e2eabadc0cb751da4c419696b1902cd-merged.mount: Deactivated successfully.
Nov 28 09:56:16 np0005538513.localdomain sudo[302525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:16 np0005538513.localdomain sudo[302525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:16 np0005538513.localdomain sudo[302525]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:16 np0005538513.localdomain podman[302533]: 2025-11-28 09:56:16.120666251 +0000 UTC m=+0.092448397 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 09:56:16 np0005538513.localdomain sudo[302554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:16 np0005538513.localdomain sudo[302554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:16 np0005538513.localdomain podman[302533]: 2025-11-28 09:56:16.161482978 +0000 UTC m=+0.133265154 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:56:16 np0005538513.localdomain podman[302596]: 
Nov 28 09:56:16 np0005538513.localdomain podman[302596]: 2025-11-28 09:56:16.606601309 +0000 UTC m=+0.070331686 container create ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=553, build-date=2025-09-24T08:57:55, ceph=True, distribution-scope=public, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.33.12)
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: Started libpod-conmon-ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a.scope.
Nov 28 09:56:16 np0005538513.localdomain podman[302596]: 2025-11-28 09:56:16.572107116 +0000 UTC m=+0.035837533 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:16 np0005538513.localdomain podman[302596]: 2025-11-28 09:56:16.686717904 +0000 UTC m=+0.150448291 container init ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Nov 28 09:56:16 np0005538513.localdomain podman[302596]: 2025-11-28 09:56:16.696843646 +0000 UTC m=+0.160574023 container start ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, distribution-scope=public, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:16 np0005538513.localdomain podman[302596]: 2025-11-28 09:56:16.697207277 +0000 UTC m=+0.160937694 container attach ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, RELEASE=main, release=553, name=rhceph, version=7, vcs-type=git)
Nov 28 09:56:16 np0005538513.localdomain funny_chandrasekhar[302611]: 167 167
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: libpod-ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a.scope: Deactivated successfully.
Nov 28 09:56:16 np0005538513.localdomain podman[302596]: 2025-11-28 09:56:16.70055644 +0000 UTC m=+0.164286827 container died ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Nov 28 09:56:16 np0005538513.localdomain podman[302616]: 2025-11-28 09:56:16.800992492 +0000 UTC m=+0.086334219 container remove ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7)
Nov 28 09:56:16 np0005538513.localdomain systemd[1]: libpod-conmon-ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a.scope: Deactivated successfully.
Nov 28 09:56:16 np0005538513.localdomain sudo[302554]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:16 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:16 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:16 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:16 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:16 np0005538513.localdomain sudo[302632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:16 np0005538513.localdomain sudo[302632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:16 np0005538513.localdomain sudo[302632]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:17 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-24a76214d0b604632e20f804f365cd6a1301270744059c3eb79caa7ed867f8a0-merged.mount: Deactivated successfully.
Nov 28 09:56:17 np0005538513.localdomain sudo[302650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:17 np0005538513.localdomain sudo[302650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:17 np0005538513.localdomain podman[302685]: 
Nov 28 09:56:17 np0005538513.localdomain podman[302685]: 2025-11-28 09:56:17.528162055 +0000 UTC m=+0.078674072 container create 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Nov 28 09:56:17 np0005538513.localdomain systemd[1]: Started libpod-conmon-4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee.scope.
Nov 28 09:56:17 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:17 np0005538513.localdomain podman[302685]: 2025-11-28 09:56:17.493093555 +0000 UTC m=+0.043605592 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:17 np0005538513.localdomain podman[302685]: 2025-11-28 09:56:17.5936354 +0000 UTC m=+0.144147417 container init 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container)
Nov 28 09:56:17 np0005538513.localdomain competent_raman[302700]: 167 167
Nov 28 09:56:17 np0005538513.localdomain systemd[1]: libpod-4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee.scope: Deactivated successfully.
Nov 28 09:56:17 np0005538513.localdomain podman[302685]: 2025-11-28 09:56:17.603184574 +0000 UTC m=+0.153696591 container start 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git)
Nov 28 09:56:17 np0005538513.localdomain podman[302685]: 2025-11-28 09:56:17.603492094 +0000 UTC m=+0.154004101 container attach 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:56:17 np0005538513.localdomain podman[302685]: 2025-11-28 09:56:17.605937359 +0000 UTC m=+0.156449416 container died 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, name=rhceph, version=7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.33.12, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vendor=Red Hat, Inc.)
Nov 28 09:56:17 np0005538513.localdomain podman[302705]: 2025-11-28 09:56:17.698639963 +0000 UTC m=+0.084916875 container remove 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, version=7, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=)
Nov 28 09:56:17 np0005538513.localdomain systemd[1]: libpod-conmon-4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee.scope: Deactivated successfully.
Nov 28 09:56:17 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:17 np0005538513.localdomain sudo[302650]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:17 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:56:17 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:17 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:56:17 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:18 np0005538513.localdomain systemd[1]: tmp-crun.7ebelN.mount: Deactivated successfully.
Nov 28 09:56:18 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-dcd5052f61ff523c6112cde343c61f8c68ec103d5011ca6267601757d3eb715c-merged.mount: Deactivated successfully.
Nov 28 09:56:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:56:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:56:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:56:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:56:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:18 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 28 09:56:18 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:18 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:56:18 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:56:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:18.832 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] Optimize plan auto_2025-11-28_09:56:19
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] do_upmap
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] pools ['manila_data', 'images', 'backups', 'vms', 'manila_metadata', '.mgr', 'volumes']
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [balancer INFO root] prepared 0/10 changes
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:56:19 np0005538513.localdomain ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:20.622 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:20 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:56:20 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:20 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:56:20 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:21 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:56:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:21 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:21 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:56:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:56:21 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:22 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:56:22 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:22 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:56:22 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.54203 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538515.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:56:23 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:23.876 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:24 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 28 09:56:24 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:24 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:56:24 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:56:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:56:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:56:24 np0005538513.localdomain systemd[1]: tmp-crun.UFPg3K.mount: Deactivated successfully.
Nov 28 09:56:24 np0005538513.localdomain podman[302721]: 2025-11-28 09:56:24.871515241 +0000 UTC m=+0.099320848 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:56:24 np0005538513.localdomain podman[302722]: 2025-11-28 09:56:24.916526667 +0000 UTC m=+0.143137047 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 28 09:56:24 np0005538513.localdomain podman[302721]: 2025-11-28 09:56:24.938595036 +0000 UTC m=+0.166400603 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:56:24 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:56:24 np0005538513.localdomain podman[302722]: 2025-11-28 09:56:24.959905222 +0000 UTC m=+0.186515562 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:56:24 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: from='client.54203 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538515.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: Deploying daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:56:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:25.660 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:25 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Nov 28 09:56:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).monmap v14 adding/updating np0005538515 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (2) No such file or directory
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: paxos.0).electionLogic(50) init, last seen epoch 50
Nov 28 09:56:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:27 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:28 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:28 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:28.880 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:29 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:29 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:29 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:30 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:30 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:30.663 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:31 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:31 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:31 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:32 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : monmap epoch 15
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : election_strategy: 1
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e34: np0005538513.dsfdlx(active, since 72s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005538513,np0005538514 (MON_DOWN)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [WRN] :     mon.np0005538515 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:32 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:56:32 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:56:32 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: monmap epoch 15
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: mgrmap e34: np0005538513.dsfdlx(active, since 72s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: Health check failed: 1/3 mons down, quorum np0005538513,np0005538514 (MON_DOWN)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]:     mon.np0005538515 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:32.798 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:33 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:33 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:33 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:33 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:33 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:33 np0005538513.localdomain sudo[302763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:33 np0005538513.localdomain sudo[302763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:33 np0005538513.localdomain sudo[302763]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:33 np0005538513.localdomain sudo[302781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:56:33 np0005538513.localdomain sudo[302781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:33 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:33.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:33.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:56:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:33.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:34 np0005538513.localdomain sudo[302781]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538513.localdomain ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: paxos.0).electionLogic(52) init, last seen epoch 52
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 is new leader, mons np0005538513,np0005538514,np0005538515 in quorum (ranks 0,1,2)
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : monmap epoch 15
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : election_strategy: 1
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e34: np0005538513.dsfdlx(active, since 74s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005538513,np0005538514)
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Cluster is now healthy
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:34.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513 is new leader, mons np0005538513,np0005538514,np0005538515 in quorum (ranks 0,1,2)
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: monmap epoch 15
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: min_mon_release 18 (reef)
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: election_strategy: 1
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mgrmap e34: np0005538513.dsfdlx(active, since 74s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005538513,np0005538514)
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: Cluster is now healthy
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:35.692 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:56:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:35 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:35 np0005538513.localdomain sudo[302840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:35 np0005538513.localdomain sudo[302840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:35 np0005538513.localdomain sudo[302840]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:35 np0005538513.localdomain podman[302831]: 2025-11-28 09:56:35.863444554 +0000 UTC m=+0.090557909 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6)
Nov 28 09:56:35 np0005538513.localdomain podman[302831]: 2025-11-28 09:56:35.881347785 +0000 UTC m=+0.108461120 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Nov 28 09:56:35 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:56:35 np0005538513.localdomain sudo[302870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:35 np0005538513.localdomain sudo[302870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:35 np0005538513.localdomain sudo[302870]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[302888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[302888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[302888]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[302906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:36 np0005538513.localdomain sudo[302906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[302906]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[302924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[302924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[302924]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:36 np0005538513.localdomain ceph-mgr[286105]: mgr.server handle_report got status from non-daemon mon.np0005538515
Nov 28 09:56:36 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:36.231+0000 7fc4e28b7640 -1 mgr.server handle_report got status from non-daemon mon.np0005538515
Nov 28 09:56:36 np0005538513.localdomain sudo[302958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[302958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[302958]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[302976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[302976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[302976]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:36 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:36 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:36 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:36 np0005538513.localdomain sudo[302994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:56:36 np0005538513.localdomain sudo[302994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[302994]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:36 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:36 np0005538513.localdomain sudo[303012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:36 np0005538513.localdomain sudo[303012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[303012]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[303030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:36 np0005538513.localdomain sudo[303030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[303030]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[303048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[303048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[303048]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[303066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:36 np0005538513.localdomain sudo[303066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[303066]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[303084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[303084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[303084]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[303118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[303118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[303118]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538513.localdomain sudo[303136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:36 np0005538513.localdomain sudo[303136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538513.localdomain sudo[303136]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538513.localdomain sudo[303154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:37 np0005538513.localdomain sudo[303154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:37 np0005538513.localdomain sudo[303154]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] update: starting ev 11a5422d-8b81-46a0-a867-01b1dfdefb4c (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] complete: finished ev 11a5422d-8b81-46a0-a867-01b1dfdefb4c (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Completed event 11a5422d-8b81-46a0-a867-01b1dfdefb4c (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:37 np0005538513.localdomain sudo[303172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:37 np0005538513.localdomain sudo[303172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:37 np0005538513.localdomain sudo[303172]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:37 np0005538513.localdomain sudo[303190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:37 np0005538513.localdomain sudo[303190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:37 np0005538513.localdomain sudo[303190]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538513.localdomain sudo[303208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:37 np0005538513.localdomain sudo[303208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:37 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:37.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:56:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:37.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:56:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:37.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:56:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:37.791 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:56:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:37.791 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:56:38 np0005538513.localdomain podman[303263]: 
Nov 28 09:56:38 np0005538513.localdomain podman[303263]: 2025-11-28 09:56:38.0282842 +0000 UTC m=+0.072884294 container create 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:56:38 np0005538513.localdomain systemd[1]: Started libpod-conmon-31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2.scope.
Nov 28 09:56:38 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:38 np0005538513.localdomain podman[303263]: 2025-11-28 09:56:38.095812558 +0000 UTC m=+0.140412662 container init 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, release=553, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:56:38 np0005538513.localdomain podman[303263]: 2025-11-28 09:56:37.998511433 +0000 UTC m=+0.043111547 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:38 np0005538513.localdomain podman[303263]: 2025-11-28 09:56:38.105697442 +0000 UTC m=+0.150297516 container start 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, io.buildah.version=1.33.12, release=553, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:56:38 np0005538513.localdomain podman[303263]: 2025-11-28 09:56:38.10594503 +0000 UTC m=+0.150545134 container attach 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.expose-services=, ceph=True, version=7, architecture=x86_64)
Nov 28 09:56:38 np0005538513.localdomain compassionate_chandrasekhar[303278]: 167 167
Nov 28 09:56:38 np0005538513.localdomain systemd[1]: libpod-31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2.scope: Deactivated successfully.
Nov 28 09:56:38 np0005538513.localdomain podman[303263]: 2025-11-28 09:56:38.108910381 +0000 UTC m=+0.153510485 container died 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, RELEASE=main, vcs-type=git, name=rhceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, ceph=True)
Nov 28 09:56:38 np0005538513.localdomain podman[303283]: 2025-11-28 09:56:38.20243468 +0000 UTC m=+0.083883193 container remove 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, RELEASE=main)
Nov 28 09:56:38 np0005538513.localdomain systemd[1]: libpod-conmon-31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2.scope: Deactivated successfully.
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3870773326' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/289694563' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1396630985' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:38 np0005538513.localdomain sudo[303208]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.282 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:38 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:38 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:38 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:38 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:38 np0005538513.localdomain sudo[303302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.367 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.369 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:56:38 np0005538513.localdomain sudo[303302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:38 np0005538513.localdomain sudo[303302]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:38 np0005538513.localdomain sudo[303320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:38 np0005538513.localdomain sudo[303320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.570 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.572 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11721MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.572 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.572 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.670 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.671 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.671 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.719 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:56:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:38.916 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:38 np0005538513.localdomain podman[303375]: 
Nov 28 09:56:38 np0005538513.localdomain podman[303375]: 2025-11-28 09:56:38.97759227 +0000 UTC m=+0.093730696 container create 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:56:39 np0005538513.localdomain systemd[1]: Started libpod-conmon-2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22.scope.
Nov 28 09:56:39 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:39 np0005538513.localdomain systemd[1]: tmp-crun.bIHuqE.mount: Deactivated successfully.
Nov 28 09:56:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-edd0a6ce3fe658326a09178f2ffecc47f6c7c755832390f5d49c466a077806d2-merged.mount: Deactivated successfully.
Nov 28 09:56:39 np0005538513.localdomain podman[303375]: 2025-11-28 09:56:38.939315832 +0000 UTC m=+0.055454258 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:39 np0005538513.localdomain podman[303375]: 2025-11-28 09:56:39.046857902 +0000 UTC m=+0.162996298 container init 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 28 09:56:39 np0005538513.localdomain systemd[1]: tmp-crun.IZN1Zi.mount: Deactivated successfully.
Nov 28 09:56:39 np0005538513.localdomain podman[303375]: 2025-11-28 09:56:39.065131275 +0000 UTC m=+0.181269711 container start 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 09:56:39 np0005538513.localdomain podman[303375]: 2025-11-28 09:56:39.065516807 +0000 UTC m=+0.181655293 container attach 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=)
Nov 28 09:56:39 np0005538513.localdomain funny_tesla[303389]: 167 167
Nov 28 09:56:39 np0005538513.localdomain systemd[1]: libpod-2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22.scope: Deactivated successfully.
Nov 28 09:56:39 np0005538513.localdomain podman[303375]: 2025-11-28 09:56:39.070054946 +0000 UTC m=+0.186193402 container died 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2861436119' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:39.180 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:56:39 np0005538513.localdomain podman[303394]: 2025-11-28 09:56:39.18127507 +0000 UTC m=+0.099504684 container remove 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, release=553, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, name=rhceph)
Nov 28 09:56:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:39.188 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:56:39 np0005538513.localdomain systemd[1]: libpod-conmon-2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22.scope: Deactivated successfully.
Nov 28 09:56:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:39.206 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:56:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:39.210 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:56:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:39.211 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1396630985' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/2594589190' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3004239366' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3912028822' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2861436119' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain sudo[303320]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:39 np0005538513.localdomain sudo[303420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:39 np0005538513.localdomain sudo[303420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:39 np0005538513.localdomain sudo[303420]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:39 np0005538513.localdomain sudo[303438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:39 np0005538513.localdomain sudo[303438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44535 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO root] Reconfig service osd.default_drive_group
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-630a1a762f124f3926c6225ad15e484fa1812775dbba5d8882661eb4f503739d-merged.mount: Deactivated successfully.
Nov 28 09:56:40 np0005538513.localdomain podman[303473]: 
Nov 28 09:56:40 np0005538513.localdomain podman[303473]: 2025-11-28 09:56:40.064130405 +0000 UTC m=+0.077448375 container create ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, io.openshift.tags=rhceph ceph, version=7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, release=553)
Nov 28 09:56:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:56:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:56:40 np0005538513.localdomain systemd[1]: Started libpod-conmon-ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d.scope.
Nov 28 09:56:40 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:40 np0005538513.localdomain podman[303473]: 2025-11-28 09:56:40.032984226 +0000 UTC m=+0.046302226 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:40 np0005538513.localdomain podman[303473]: 2025-11-28 09:56:40.138339189 +0000 UTC m=+0.151657169 container init ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Nov 28 09:56:40 np0005538513.localdomain podman[303473]: 2025-11-28 09:56:40.148040898 +0000 UTC m=+0.161358868 container start ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vcs-type=git, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12)
Nov 28 09:56:40 np0005538513.localdomain podman[303473]: 2025-11-28 09:56:40.148242764 +0000 UTC m=+0.161560954 container attach ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, release=553, vendor=Red Hat, Inc.)
Nov 28 09:56:40 np0005538513.localdomain blissful_liskov[303489]: 167 167
Nov 28 09:56:40 np0005538513.localdomain systemd[1]: libpod-ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d.scope: Deactivated successfully.
Nov 28 09:56:40 np0005538513.localdomain podman[303473]: 2025-11-28 09:56:40.151439162 +0000 UTC m=+0.164757182 container died ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, name=rhceph)
Nov 28 09:56:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:56:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155578 "" "Go-http-client/1.1"
Nov 28 09:56:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:56:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19045 "" "Go-http-client/1.1"
Nov 28 09:56:40 np0005538513.localdomain podman[303494]: 2025-11-28 09:56:40.276812731 +0000 UTC m=+0.113587206 container remove ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, vcs-type=git, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:40 np0005538513.localdomain systemd[1]: libpod-conmon-ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d.scope: Deactivated successfully.
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain sudo[303438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:40 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:40 np0005538513.localdomain ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:40 np0005538513.localdomain ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:40 np0005538513.localdomain sudo[303517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:40 np0005538513.localdomain sudo[303517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:40 np0005538513.localdomain sudo[303517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:40 np0005538513.localdomain sudo[303535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:40 np0005538513.localdomain sudo[303535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:40.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-32579a85f17ae0b9919982077d9cde35bfdd8958595e703f01f98577d050a365-merged.mount: Deactivated successfully.
Nov 28 09:56:41 np0005538513.localdomain podman[303569]: 2025-11-28 09:56:41.113784664 +0000 UTC m=+0.096150780 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:56:41 np0005538513.localdomain podman[303569]: 2025-11-28 09:56:41.129376135 +0000 UTC m=+0.111742291 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:56:41 np0005538513.localdomain podman[303577]: 
Nov 28 09:56:41 np0005538513.localdomain podman[303577]: 2025-11-28 09:56:41.185178942 +0000 UTC m=+0.137433842 container create e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7)
Nov 28 09:56:41 np0005538513.localdomain podman[303577]: 2025-11-28 09:56:41.09901159 +0000 UTC m=+0.051266490 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: Started libpod-conmon-e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7.scope.
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:41 np0005538513.localdomain podman[303577]: 2025-11-28 09:56:41.272419498 +0000 UTC m=+0.224674398 container init e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True)
Nov 28 09:56:41 np0005538513.localdomain podman[303577]: 2025-11-28 09:56:41.283047294 +0000 UTC m=+0.235302194 container start e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:56:41 np0005538513.localdomain podman[303577]: 2025-11-28 09:56:41.283415345 +0000 UTC m=+0.235670245 container attach e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55)
Nov 28 09:56:41 np0005538513.localdomain gallant_bartik[303609]: 167 167
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: libpod-e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7.scope: Deactivated successfully.
Nov 28 09:56:41 np0005538513.localdomain podman[303577]: 2025-11-28 09:56:41.287475591 +0000 UTC m=+0.239730531 container died e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, release=553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='client.44535 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: Reconfig service osd.default_drive_group
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:41 np0005538513.localdomain podman[303614]: 2025-11-28 09:56:41.390792371 +0000 UTC m=+0.090498656 container remove e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: libpod-conmon-e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7.scope: Deactivated successfully.
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 do_prune osdmap full prune enabled
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Activating manager daemon np0005538514.djozup
Nov 28 09:56:41 np0005538513.localdomain sudo[303535]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 e91: 6 total, 6 up, 6 in
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: mgr handle_mgr_map I was active but no longer am
Nov 28 09:56:41 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:41.482+0000 7fc53e906640 -1 mgr handle_mgr_map I was active but no longer am
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e91: 6 total, 6 up, 6 in
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e35: np0005538514.djozup(active, starting, since 0.0458243s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Manager daemon np0005538514.djozup is now available
Nov 28 09:56:41 np0005538513.localdomain sshd[299027]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: session-69.scope: Consumed 28.760s CPU time.
Nov 28 09:56:41 np0005538513.localdomain systemd-logind[764]: Session 69 logged out. Waiting for processes to exit.
Nov 28 09:56:41 np0005538513.localdomain systemd-logind[764]: Removed session 69.
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} v 0)
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} v 0)
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:41 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: ignoring --setuser ceph since I am not root
Nov 28 09:56:41 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: ignoring --setgroup ceph since I am not root
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: pidfile_write: ignore empty --pid-file
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'alerts'
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} v 0)
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} v 0)
Nov 28 09:56:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:56:41 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:41.719+0000 7f02af17f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'balancer'
Nov 28 09:56:41 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:41.788+0000 7f02af17f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:56:41 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'cephadm'
Nov 28 09:56:41 np0005538513.localdomain sshd[303654]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:56:41 np0005538513.localdomain sshd[303654]: Accepted publickey for ceph-admin from 192.168.122.107 port 51448 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:56:41 np0005538513.localdomain systemd-logind[764]: New session 71 of user ceph-admin.
Nov 28 09:56:41 np0005538513.localdomain systemd[1]: Started Session 71 of User ceph-admin.
Nov 28 09:56:41 np0005538513.localdomain sshd[303654]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:56:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-78e92897badadd3a0d4e6279f34a3c1d6f59b1d9ac087caa19b8c5bcc97568bd-merged.mount: Deactivated successfully.
Nov 28 09:56:42 np0005538513.localdomain sudo[303658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:42 np0005538513.localdomain sudo[303658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:42 np0005538513.localdomain sudo[303658]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:42 np0005538513.localdomain sudo[303676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:56:42 np0005538513.localdomain sudo[303676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: Activating manager daemon np0005538514.djozup
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: osdmap e91: 6 total, 6 up, 6 in
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: mgrmap e35: np0005538514.djozup(active, starting, since 0.0458243s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: Manager daemon np0005538514.djozup is now available
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: removing stray HostCache host record np0005538512.localdomain.devices.0
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:56:42 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'crash'
Nov 28 09:56:42 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:42.439+0000 7f02af17f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:56:42 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:56:42 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'dashboard'
Nov 28 09:56:42 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e36: np0005538514.djozup(active, since 1.10238s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:42 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'devicehealth'
Nov 28 09:56:43 np0005538513.localdomain podman[303768]: 2025-11-28 09:56:43.003078198 +0000 UTC m=+0.078389654 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, version=7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Nov 28 09:56:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.009+0000 7f02af17f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'diskprediction_local'
Nov 28 09:56:43 np0005538513.localdomain podman[303768]: 2025-11-28 09:56:43.09342516 +0000 UTC m=+0.168736666 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container)
Nov 28 09:56:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 28 09:56:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 28 09:56:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]:   from numpy import show_config as show_numpy_config
Nov 28 09:56:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.138+0000 7f02af17f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'influx'
Nov 28 09:56:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.196+0000 7f02af17f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'insights'
Nov 28 09:56:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'iostat'
Nov 28 09:56:43 np0005538513.localdomain podman[303813]: 2025-11-28 09:56:43.260143121 +0000 UTC m=+0.087812933 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'k8sevents'
Nov 28 09:56:43 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.309+0000 7f02af17f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:56:43 np0005538513.localdomain podman[303836]: 2025-11-28 09:56:43.364339628 +0000 UTC m=+0.107528420 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:56:43 np0005538513.localdomain podman[303836]: 2025-11-28 09:56:43.372354005 +0000 UTC m=+0.115542857 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:56:43 np0005538513.localdomain podman[303813]: 2025-11-28 09:56:43.382530359 +0000 UTC m=+0.210200191 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 09:56:43 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:56:43 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: mgrmap e36: np0005538514.djozup(active, since 1.10238s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e37: np0005538514.djozup(active, since 2s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'localpool'
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'mds_autoscaler'
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:43 np0005538513.localdomain sudo[303676]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'mirroring'
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:43 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'nfs'
Nov 28 09:56:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:43.918 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:43 np0005538513.localdomain sudo[303932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:43 np0005538513.localdomain sudo[303932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:43 np0005538513.localdomain sudo[303932]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:44 np0005538513.localdomain sudo[303950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:56:44 np0005538513.localdomain sudo[303950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'orchestrator'
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.059+0000 7f02af17f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.201+0000 7f02af17f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'osd_perf_query'
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.213 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.213 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.213 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.264+0000 7f02af17f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'osd_support'
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.319+0000 7f02af17f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'pg_autoscaler'
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.386+0000 7f02af17f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'progress'
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.446+0000 7f02af17f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'prometheus'
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.499 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.499 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.499 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.500 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Bus STARTING
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Serving on https://172.18.0.107:7150
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Client ('172.18.0.107', 59370) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Serving on http://172.18.0.107:8765
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Bus STARTED
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: mgrmap e37: np0005538514.djozup(active, since 2s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:44 np0005538513.localdomain sudo[303950]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.745+0000 7f02af17f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'rbd_support'
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.827+0000 7f02af17f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:56:44 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'restful'
Nov 28 09:56:44 np0005538513.localdomain sudo[303999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:44 np0005538513.localdomain sudo[303999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:44 np0005538513.localdomain sudo[303999]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.848 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.864 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:56:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:44.865 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:56:44 np0005538513.localdomain sudo[304017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:56:44 np0005538513.localdomain sudo[304017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:44 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'rgw'
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.151+0000 7f02af17f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'rook'
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:56:45 np0005538513.localdomain sudo[304017]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:56:45 np0005538513.localdomain sudo[304054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:45 np0005538513.localdomain sudo[304054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538513.localdomain sudo[304054]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538513.localdomain sudo[304072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:45 np0005538513.localdomain sudo[304072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538513.localdomain sudo[304072]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.583+0000 7f02af17f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'selftest'
Nov 28 09:56:45 np0005538513.localdomain sudo[304090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538513.localdomain sudo[304090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538513.localdomain sudo[304090]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.644+0000 7f02af17f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'snap_schedule'
Nov 28 09:56:45 np0005538513.localdomain sudo[304108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:45 np0005538513.localdomain sudo[304108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538513.localdomain sudo[304108]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'stats'
Nov 28 09:56:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:45.764 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'status'
Nov 28 09:56:45 np0005538513.localdomain sudo[304126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538513.localdomain sudo[304126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538513.localdomain sudo[304126]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.837+0000 7f02af17f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'telegraf'
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.896+0000 7f02af17f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:56:45 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'telemetry'
Nov 28 09:56:45 np0005538513.localdomain sudo[304160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538513.localdomain sudo[304160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538513.localdomain sudo[304160]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538513.localdomain sudo[304178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538513.localdomain sudo[304178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:45 np0005538513.localdomain ceph-mon[292954]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:45 np0005538513.localdomain sudo[304178]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.027+0000 7f02af17f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'test_orchestrator'
Nov 28 09:56:46 np0005538513.localdomain sudo[304196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:56:46 np0005538513.localdomain sudo[304196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304196]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain sudo[304214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:46 np0005538513.localdomain sudo[304214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304214]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.172+0000 7f02af17f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'volumes'
Nov 28 09:56:46 np0005538513.localdomain sudo[304232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:46 np0005538513.localdomain sudo[304232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304232]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:56:46 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e38: np0005538514.djozup(active, since 4s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:46 np0005538513.localdomain sudo[304251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538513.localdomain sudo[304251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304251]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain podman[304250]: 2025-11-28 09:56:46.340421906 +0000 UTC m=+0.099756522 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Loading python module 'zabbix'
Nov 28 09:56:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.362+0000 7f02af17f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain podman[304250]: 2025-11-28 09:56:46.381431637 +0000 UTC m=+0.140766253 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:56:46 np0005538513.localdomain sudo[304282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:46 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:56:46 np0005538513.localdomain sudo[304282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304282]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.420+0000 7f02af17f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:56:46 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : Standby manager daemon np0005538513.dsfdlx started
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x55f77a9db1e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 28 09:56:46 np0005538513.localdomain ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.107:6810/2760684413
Nov 28 09:56:46 np0005538513.localdomain sudo[304305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538513.localdomain sudo[304305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304305]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain sudo[304339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538513.localdomain sudo[304339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304339]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain sudo[304357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538513.localdomain sudo[304357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304357]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain sudo[304375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:46 np0005538513.localdomain sudo[304375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304375]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain sudo[304393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:46 np0005538513.localdomain sudo[304393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304393]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain sudo[304411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:46 np0005538513.localdomain sudo[304411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304411]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538513.localdomain sudo[304429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:46 np0005538513.localdomain sudo[304429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538513.localdomain sudo[304429]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:47 np0005538513.localdomain sudo[304447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304447]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304465]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538513.localdomain sudo[304465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304465]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538513.localdomain sudo[304499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: mgrmap e38: np0005538514.djozup(active, since 4s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: Standby manager daemon np0005538513.dsfdlx started
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:47 np0005538513.localdomain sudo[304499]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538513.localdomain sudo[304517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e39: np0005538514.djozup(active, since 5s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs, np0005538513.dsfdlx
Nov 28 09:56:47 np0005538513.localdomain sudo[304535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:47 np0005538513.localdomain sudo[304535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304535]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:47 np0005538513.localdomain sudo[304553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304553]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:47 np0005538513.localdomain sudo[304571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304571]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538513.localdomain sudo[304589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304589]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:47 np0005538513.localdomain sudo[304607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304607]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain sudo[304625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538513.localdomain sudo[304625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain sudo[304625]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:47 np0005538513.localdomain sudo[304659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538513.localdomain sudo[304659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:47 np0005538513.localdomain sudo[304659]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain sudo[304677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:48 np0005538513.localdomain sudo[304677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:48 np0005538513.localdomain sudo[304677]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:56:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:56:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:56:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:56:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:48 np0005538513.localdomain sudo[304695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538513.localdomain sudo[304695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:48 np0005538513.localdomain sudo[304695]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain sudo[304713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:48 np0005538513.localdomain sudo[304713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:48 np0005538513.localdomain sudo[304713]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: mgrmap e39: np0005538514.djozup(active, since 5s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs, np0005538513.dsfdlx
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:48 np0005538513.localdomain sudo[304731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:48 np0005538513.localdomain sudo[304731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:48 np0005538513.localdomain sudo[304731]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:48 np0005538513.localdomain sudo[304749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:48 np0005538513.localdomain sudo[304749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:48.920 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:49 np0005538513.localdomain podman[304783]: 
Nov 28 09:56:49 np0005538513.localdomain podman[304783]: 2025-11-28 09:56:49.144438986 +0000 UTC m=+0.079889740 container create fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git)
Nov 28 09:56:49 np0005538513.localdomain systemd[1]: Started libpod-conmon-fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a.scope.
Nov 28 09:56:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:49 np0005538513.localdomain podman[304783]: 2025-11-28 09:56:49.110579444 +0000 UTC m=+0.046030278 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:49 np0005538513.localdomain podman[304783]: 2025-11-28 09:56:49.22059167 +0000 UTC m=+0.156042424 container init fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, version=7, architecture=x86_64, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, release=553, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:56:49 np0005538513.localdomain nifty_hofstadter[304798]: 167 167
Nov 28 09:56:49 np0005538513.localdomain podman[304783]: 2025-11-28 09:56:49.233624531 +0000 UTC m=+0.169075285 container start fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=7, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, release=553, name=rhceph, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:56:49 np0005538513.localdomain systemd[1]: libpod-fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a.scope: Deactivated successfully.
Nov 28 09:56:49 np0005538513.localdomain podman[304783]: 2025-11-28 09:56:49.234960852 +0000 UTC m=+0.170411616 container attach fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, RELEASE=main, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Nov 28 09:56:49 np0005538513.localdomain podman[304783]: 2025-11-28 09:56:49.2381382 +0000 UTC m=+0.173588964 container died fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Nov 28 09:56:49 np0005538513.localdomain podman[304803]: 2025-11-28 09:56:49.335700183 +0000 UTC m=+0.091135597 container remove fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=)
Nov 28 09:56:49 np0005538513.localdomain systemd[1]: libpod-conmon-fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a.scope: Deactivated successfully.
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:49 np0005538513.localdomain sudo[304749]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:56:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:49 np0005538513.localdomain sudo[304829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:49 np0005538513.localdomain sudo[304829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:49 np0005538513.localdomain sudo[304829]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:49 np0005538513.localdomain sudo[304847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:49 np0005538513.localdomain sudo[304847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bc457c5c37bce5163bcc1575fcb0fee9857a0ab1e47c6523bec72b4d0411e735-merged.mount: Deactivated successfully.
Nov 28 09:56:50 np0005538513.localdomain podman[304882]: 
Nov 28 09:56:50 np0005538513.localdomain podman[304882]: 2025-11-28 09:56:50.332166266 +0000 UTC m=+0.076776575 container create 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, release=553, architecture=x86_64, name=rhceph)
Nov 28 09:56:50 np0005538513.localdomain systemd[1]: Started libpod-conmon-003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20.scope.
Nov 28 09:56:50 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:50 np0005538513.localdomain podman[304882]: 2025-11-28 09:56:50.302067409 +0000 UTC m=+0.046677748 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:50 np0005538513.localdomain podman[304882]: 2025-11-28 09:56:50.407066771 +0000 UTC m=+0.151677080 container init 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, name=rhceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph)
Nov 28 09:56:50 np0005538513.localdomain podman[304882]: 2025-11-28 09:56:50.417897254 +0000 UTC m=+0.162507573 container start 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Nov 28 09:56:50 np0005538513.localdomain podman[304882]: 2025-11-28 09:56:50.418273525 +0000 UTC m=+0.162883874 container attach 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64)
Nov 28 09:56:50 np0005538513.localdomain silly_feistel[304897]: 167 167
Nov 28 09:56:50 np0005538513.localdomain systemd[1]: libpod-003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20.scope: Deactivated successfully.
Nov 28 09:56:50 np0005538513.localdomain podman[304882]: 2025-11-28 09:56:50.423865468 +0000 UTC m=+0.168475807 container died 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, RELEASE=main, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:56:50 np0005538513.localdomain podman[304902]: 2025-11-28 09:56:50.534887165 +0000 UTC m=+0.093601301 container remove 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, distribution-scope=public)
Nov 28 09:56:50 np0005538513.localdomain systemd[1]: libpod-conmon-003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20.scope: Deactivated successfully.
Nov 28 09:56:50 np0005538513.localdomain sudo[304847]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:56:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:50.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:50 np0005538513.localdomain sudo[304918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:50 np0005538513.localdomain sudo[304918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:50 np0005538513.localdomain sudo[304918]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:56:50.837 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:56:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:56:50.837 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:56:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:56:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:56:50 np0005538513.localdomain sudo[304936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:50 np0005538513.localdomain sudo[304936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-657524b94084caf0ee278ccd4f4a33e6b5969e8fa6c5b6eaf213bf0c33e1a2ec-merged.mount: Deactivated successfully.
Nov 28 09:56:51 np0005538513.localdomain podman[304971]: 
Nov 28 09:56:51 np0005538513.localdomain podman[304971]: 2025-11-28 09:56:51.395748403 +0000 UTC m=+0.079945181 container create 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=553, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55)
Nov 28 09:56:51 np0005538513.localdomain systemd[1]: Started libpod-conmon-64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde.scope.
Nov 28 09:56:51 np0005538513.localdomain podman[304971]: 2025-11-28 09:56:51.361156589 +0000 UTC m=+0.045353367 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:51 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:51 np0005538513.localdomain podman[304971]: 2025-11-28 09:56:51.475623892 +0000 UTC m=+0.159820670 container init 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, release=553, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7)
Nov 28 09:56:51 np0005538513.localdomain podman[304971]: 2025-11-28 09:56:51.4875861 +0000 UTC m=+0.171782868 container start 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, GIT_BRANCH=main)
Nov 28 09:56:51 np0005538513.localdomain podman[304971]: 2025-11-28 09:56:51.487955012 +0000 UTC m=+0.172151790 container attach 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=)
Nov 28 09:56:51 np0005538513.localdomain crazy_mcnulty[304986]: 167 167
Nov 28 09:56:51 np0005538513.localdomain systemd[1]: libpod-64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde.scope: Deactivated successfully.
Nov 28 09:56:51 np0005538513.localdomain podman[304971]: 2025-11-28 09:56:51.490447218 +0000 UTC m=+0.174643996 container died 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, distribution-scope=public, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, RELEASE=main)
Nov 28 09:56:51 np0005538513.localdomain podman[304991]: 2025-11-28 09:56:51.593884312 +0000 UTC m=+0.088295209 container remove 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:56:51 np0005538513.localdomain systemd[1]: libpod-conmon-64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde.scope: Deactivated successfully.
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538513.localdomain sudo[304936]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:56:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:52 np0005538513.localdomain systemd[1]: tmp-crun.rzV7al.mount: Deactivated successfully.
Nov 28 09:56:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-cf95f5dd5d66a2419260b23995ada69ce7f9cb04138bc8241830f25a814c7432-merged.mount: Deactivated successfully.
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:56:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: from='client.44559 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:53.951 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.760382) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814760423, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2552, "num_deletes": 255, "total_data_size": 5484296, "memory_usage": 5657152, "flush_reason": "Manual Compaction"}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814786295, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4558535, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18020, "largest_seqno": 20567, "table_properties": {"data_size": 4547473, "index_size": 6799, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 28856, "raw_average_key_size": 22, "raw_value_size": 4523024, "raw_average_value_size": 3536, "num_data_blocks": 297, "num_entries": 1279, "num_filter_entries": 1279, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323751, "oldest_key_time": 1764323751, "file_creation_time": 1764323814, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 25977 microseconds, and 11197 cpu microseconds.
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.786353) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4558535 bytes OK
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.786381) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.788426) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.788451) EVENT_LOG_v1 {"time_micros": 1764323814788443, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.788478) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 5472261, prev total WAL file size 5482291, number of live WAL files 2.
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.789566) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4451KB)], [30(15MB)]
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814789629, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 21166090, "oldest_snapshot_seqno": -1}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11638 keys, 18010737 bytes, temperature: kUnknown
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814913211, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 18010737, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17943903, "index_size": 36653, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29125, "raw_key_size": 312230, "raw_average_key_size": 26, "raw_value_size": 17745227, "raw_average_value_size": 1524, "num_data_blocks": 1394, "num_entries": 11638, "num_filter_entries": 11638, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323814, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.913928) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 18010737 bytes
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.916839) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.1 rd, 145.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.3, 15.8 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(8.6) write-amplify(4.0) OK, records in: 12186, records dropped: 548 output_compression: NoCompression
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.916875) EVENT_LOG_v1 {"time_micros": 1764323814916858, "job": 16, "event": "compaction_finished", "compaction_time_micros": 123703, "compaction_time_cpu_micros": 44215, "output_level": 6, "num_output_files": 1, "total_output_size": 18010737, "num_input_records": 12186, "num_output_records": 11638, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.789484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917450) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814918444, "job": 0, "event": "table_file_deletion", "file_number": 32}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814920999, "job": 0, "event": "table_file_deletion", "file_number": 30}
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:56:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:56:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:55.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:55 np0005538513.localdomain podman[305008]: 2025-11-28 09:56:55.860067829 +0000 UTC m=+0.090071923 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:56:55 np0005538513.localdomain podman[305009]: 2025-11-28 09:56:55.911137991 +0000 UTC m=+0.139254408 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:55 np0005538513.localdomain podman[305008]: 2025-11-28 09:56:55.92800498 +0000 UTC m=+0.158009064 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:56:55 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:56:55 np0005538513.localdomain podman[305009]: 2025-11-28 09:56:55.981499067 +0000 UTC m=+0.209615514 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:56:55 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: from='client.54277 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:56:57 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:56:57 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:56:57 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='client.44565 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:56:58 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:56:58.957 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:57:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:57:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:00.802 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:57:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 28 09:57:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:02 np0005538513.localdomain sudo[305049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:57:02 np0005538513.localdomain sudo[305049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:02 np0005538513.localdomain sudo[305049]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:03 np0005538513.localdomain sudo[305067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:57:03 np0005538513.localdomain sudo[305067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:03 np0005538513.localdomain sudo[305067]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:03 np0005538513.localdomain sudo[305085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:57:03 np0005538513.localdomain sudo[305085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:03 np0005538513.localdomain podman[305119]: 
Nov 28 09:57:03 np0005538513.localdomain podman[305119]: 2025-11-28 09:57:03.747878924 +0000 UTC m=+0.084695379 container create a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Nov 28 09:57:03 np0005538513.localdomain systemd[1]: Started libpod-conmon-a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce.scope.
Nov 28 09:57:03 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:03 np0005538513.localdomain ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:57:03 np0005538513.localdomain podman[305119]: 2025-11-28 09:57:03.711431312 +0000 UTC m=+0.048247837 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:57:03 np0005538513.localdomain podman[305119]: 2025-11-28 09:57:03.81763953 +0000 UTC m=+0.154455995 container init a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, architecture=x86_64, release=553, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Nov 28 09:57:03 np0005538513.localdomain podman[305119]: 2025-11-28 09:57:03.82928525 +0000 UTC m=+0.166101715 container start a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, ceph=True, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55)
Nov 28 09:57:03 np0005538513.localdomain podman[305119]: 2025-11-28 09:57:03.829528917 +0000 UTC m=+0.166345412 container attach a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:57:03 np0005538513.localdomain fervent_rhodes[305134]: 167 167
Nov 28 09:57:03 np0005538513.localdomain systemd[1]: libpod-a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce.scope: Deactivated successfully.
Nov 28 09:57:03 np0005538513.localdomain podman[305119]: 2025-11-28 09:57:03.835217432 +0000 UTC m=+0.172033897 container died a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:57:03 np0005538513.localdomain podman[305139]: 2025-11-28 09:57:03.937646594 +0000 UTC m=+0.091815047 container remove a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, build-date=2025-09-24T08:57:55, name=rhceph, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Nov 28 09:57:03 np0005538513.localdomain systemd[1]: libpod-conmon-a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce.scope: Deactivated successfully.
Nov 28 09:57:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:03.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:04 np0005538513.localdomain sudo[305085]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:57:04 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:57:04 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-6016f8d5ea77cfedcee980831b360ae116ffed49e619f4967787bc77a95b3280-merged.mount: Deactivated successfully.
Nov 28 09:57:04 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e40: np0005538514.djozup(active, since 23s), standbys: np0005538515.yfkzhl, np0005538513.dsfdlx
Nov 28 09:57:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:05 np0005538513.localdomain ceph-mon[292954]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:05 np0005538513.localdomain ceph-mon[292954]: mgrmap e40: np0005538514.djozup(active, since 23s), standbys: np0005538515.yfkzhl, np0005538513.dsfdlx
Nov 28 09:57:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:05.828 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:57:06 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:57:06 np0005538513.localdomain systemd[1]: tmp-crun.auq7aN.mount: Deactivated successfully.
Nov 28 09:57:06 np0005538513.localdomain podman[305156]: 2025-11-28 09:57:06.87275429 +0000 UTC m=+0.102129174 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 28 09:57:06 np0005538513.localdomain podman[305156]: 2025-11-28 09:57:06.917473057 +0000 UTC m=+0.146847931 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, release=1755695350)
Nov 28 09:57:06 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:57:07 np0005538513.localdomain ceph-mon[292954]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:08 np0005538513.localdomain ceph-mon[292954]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:08.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:57:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:57:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:57:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:57:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:57:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18736 "" "Go-http-client/1.1"
Nov 28 09:57:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:10.832 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:11 np0005538513.localdomain ceph-mon[292954]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:57:11 np0005538513.localdomain podman[305176]: 2025-11-28 09:57:11.847651872 +0000 UTC m=+0.083415859 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:57:11 np0005538513.localdomain podman[305176]: 2025-11-28 09:57:11.857415712 +0000 UTC m=+0.093179739 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:57:11 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:57:12 np0005538513.localdomain ceph-mon[292954]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3967776959' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:57:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3967776959' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:57:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:57:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:57:13 np0005538513.localdomain podman[305197]: 2025-11-28 09:57:13.847256622 +0000 UTC m=+0.087297749 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:57:13 np0005538513.localdomain podman[305198]: 2025-11-28 09:57:13.911069865 +0000 UTC m=+0.147730597 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:57:13 np0005538513.localdomain podman[305197]: 2025-11-28 09:57:13.920457735 +0000 UTC m=+0.160498802 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:57:13 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:57:13 np0005538513.localdomain podman[305198]: 2025-11-28 09:57:13.943529655 +0000 UTC m=+0.180190407 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true)
Nov 28 09:57:13 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:57:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:13.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:15 np0005538513.localdomain ceph-mon[292954]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:15.869 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:57:16 np0005538513.localdomain podman[305239]: 2025-11-28 09:57:16.845340416 +0000 UTC m=+0.082474680 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:57:16 np0005538513.localdomain podman[305239]: 2025-11-28 09:57:16.859494871 +0000 UTC m=+0.096629145 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:57:16 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:57:17 np0005538513.localdomain ceph-mon[292954]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:57:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:57:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:57:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:57:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:57:18 np0005538513.localdomain ceph-mon[292954]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:18.987 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:20.872 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:21 np0005538513.localdomain ceph-mon[292954]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:23 np0005538513.localdomain ceph-mon[292954]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:24.018 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:25 np0005538513.localdomain ceph-mon[292954]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:25.906 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:57:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:57:26 np0005538513.localdomain systemd[299031]: Starting Mark boot as successful...
Nov 28 09:57:26 np0005538513.localdomain podman[305259]: 2025-11-28 09:57:26.85750472 +0000 UTC m=+0.086215785 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:57:26 np0005538513.localdomain systemd[299031]: Finished Mark boot as successful.
Nov 28 09:57:26 np0005538513.localdomain podman[305259]: 2025-11-28 09:57:26.888586706 +0000 UTC m=+0.117297731 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:57:26 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:57:26 np0005538513.localdomain podman[305260]: 2025-11-28 09:57:26.935169671 +0000 UTC m=+0.160358118 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 28 09:57:26 np0005538513.localdomain podman[305260]: 2025-11-28 09:57:26.948402027 +0000 UTC m=+0.173590514 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:57:26 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:57:27 np0005538513.localdomain ceph-mon[292954]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:28 np0005538513.localdomain ceph-mon[292954]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:29.021 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:30.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:31 np0005538513.localdomain ceph-mon[292954]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:32 np0005538513.localdomain ceph-mon[292954]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:34.067 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:34.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:34.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:57:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Nov 28 09:57:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1459346702' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:57:35 np0005538513.localdomain ceph-mon[292954]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1459346702' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:57:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:35.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:35.937 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:36.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:37 np0005538513.localdomain ceph-mon[292954]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:57:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:37.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:57:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:37.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:57:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:37.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:57:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:37.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:57:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:37.797 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:57:37 np0005538513.localdomain podman[305300]: 2025-11-28 09:57:37.8557683 +0000 UTC m=+0.091803677 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:57:37 np0005538513.localdomain podman[305300]: 2025-11-28 09:57:37.873491445 +0000 UTC m=+0.109526802 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9)
Nov 28 09:57:37 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:57:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:57:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3192274012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.253 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.316 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.317 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.504 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.506 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11760MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.507 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.507 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.623 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.624 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.624 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:57:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:38.662 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:57:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3192274012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:38 np0005538513.localdomain ceph-mon[292954]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:39.071 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:57:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/510121500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:39.115 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:57:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:39.121 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:57:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:39.142 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:57:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:39.145 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:57:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:39.146 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:57:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/510121500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1903764416' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:57:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:57:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:57:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:57:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:57:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18727 "" "Go-http-client/1.1"
Nov 28 09:57:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3305388726' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:40 np0005538513.localdomain ceph-mon[292954]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/279813176' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:40.939 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/436717469' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:57:42 np0005538513.localdomain ceph-mon[292954]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:42 np0005538513.localdomain systemd[1]: tmp-crun.wTrBOp.mount: Deactivated successfully.
Nov 28 09:57:42 np0005538513.localdomain podman[305363]: 2025-11-28 09:57:42.85426823 +0000 UTC m=+0.091733365 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:57:42 np0005538513.localdomain podman[305363]: 2025-11-28 09:57:42.887762651 +0000 UTC m=+0.125227786 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:57:42 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:57:43 np0005538513.localdomain ceph-mon[292954]: from='client.44598 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.103 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.146 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.146 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.146 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.532 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.533 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.533 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:57:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:44.534 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:57:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:57:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:57:44 np0005538513.localdomain podman[305386]: 2025-11-28 09:57:44.855304304 +0000 UTC m=+0.091153836 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 09:57:44 np0005538513.localdomain podman[305387]: 2025-11-28 09:57:44.901663122 +0000 UTC m=+0.133912643 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:57:44 np0005538513.localdomain podman[305387]: 2025-11-28 09:57:44.910355099 +0000 UTC m=+0.142604630 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:57:44 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:57:44 np0005538513.localdomain podman[305386]: 2025-11-28 09:57:44.968814169 +0000 UTC m=+0.204663671 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:57:44 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:57:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:45.011 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:57:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:45.030 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:57:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:45.030 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:57:45 np0005538513.localdomain ceph-mon[292954]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:45.969 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:47.651 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:47 np0005538513.localdomain ceph-mon[292954]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:57:47 np0005538513.localdomain systemd[1]: tmp-crun.aO0sQn.mount: Deactivated successfully.
Nov 28 09:57:47 np0005538513.localdomain podman[305429]: 2025-11-28 09:57:47.832371482 +0000 UTC m=+0.072774591 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:57:47 np0005538513.localdomain podman[305429]: 2025-11-28 09:57:47.846505917 +0000 UTC m=+0.086909056 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:57:47 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:57:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:57:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:57:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:57:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:57:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:57:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:57:48 np0005538513.localdomain ceph-mon[292954]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:49.109 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Nov 28 09:57:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3756897191' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Nov 28 09:57:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/3756897191' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Nov 28 09:57:50 np0005538513.localdomain ceph-mon[292954]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:57:50.837 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:57:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:57:50.838 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:57:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:57:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:57:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:50.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:53 np0005538513.localdomain ceph-mon[292954]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:54.135 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:55 np0005538513.localdomain ceph-mon[292954]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:56.000 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:57 np0005538513.localdomain ceph-mon[292954]: from='client.44607 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:57:57 np0005538513.localdomain ceph-mon[292954]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:57:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:57:57 np0005538513.localdomain systemd[1]: tmp-crun.uyCfxD.mount: Deactivated successfully.
Nov 28 09:57:57 np0005538513.localdomain podman[305448]: 2025-11-28 09:57:57.857804076 +0000 UTC m=+0.091414154 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:57:57 np0005538513.localdomain podman[305448]: 2025-11-28 09:57:57.867475363 +0000 UTC m=+0.101085481 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:57:57 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:57:57 np0005538513.localdomain podman[305449]: 2025-11-28 09:57:57.921740117 +0000 UTC m=+0.148620448 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 09:57:57 np0005538513.localdomain podman[305449]: 2025-11-28 09:57:57.958159064 +0000 UTC m=+0.185039395 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:57:57 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:57:58 np0005538513.localdomain ceph-mon[292954]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:57:59.139 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:57:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.673 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '381850ba-2556-441b-8d90-c6bad9de787f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.674225', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9454e36-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '8c82c25bcb1b43d262155e005990eaa516cc04ea53d532cd5e12ffba163c1d26'}]}, 'timestamp': '2025-11-28 09:58:00.678848', '_unique_id': '7a9a89928aee4af79361467b6d5a31c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.681 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.693 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.693 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '717bb117-4ac9-407a-8f3a-2b0cec487e70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.681996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9478e6c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '5c004f5a98c075fe8517907f3de65db293000dd41e572d4c4724f39daea6a3ef'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.681996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9479f06-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '92edd8a22433c2a6d767825168ecfc2531e7e64084da7f5a1f848ef067ea057d'}]}, 'timestamp': '2025-11-28 09:58:00.693936', '_unique_id': 'b882e977d67c4ec68fd151d601a6e6c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.696 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34d6eebd-5fc4-4b6d-a541-8705ac5f7516', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.696196', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9480810-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': 'e162aaf200bc3d0077d97dd446a2f709a0cbc61936a73a99840d1ac6889c37f8'}]}, 'timestamp': '2025-11-28 09:58:00.696656', '_unique_id': '29f7628d3fc743ca910cf6a627c69d66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.698 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe8ef821-c176-4882-8d32-50dfd33104a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.698720', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94c7332-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': 'dbc558f696527d8f084c2ab64ee0131bb260e7f27f3b8e62e8faaabb151f67c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.698720', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94c85ca-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '4fe48d5f97754c26ad641859eeed150e4e84d9b572419b4b4e7b9f36b3afd3cc'}]}, 'timestamp': '2025-11-28 09:58:00.726109', '_unique_id': 'f8556b5347944ed08e9b2391afbd1815'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.728 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.728 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b2472f5-92bf-49a9-a819-2138c336d763', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.728447', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b94cf4e2-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '27df0f1b7f3a9f9500e203dfcf792897de844d89723c232d853eae1009895fe0'}]}, 'timestamp': '2025-11-28 09:58:00.728940', '_unique_id': '270e735ae67c4c0dbdad1a2049c37ea4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.731 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73e9fc56-eda6-4836-bc4c-0126dbc960df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.731336', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b94d6486-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '6ab8ad9eec20310ebf532f91275d3f62253e5b97be7385e864bdcf198ec1d786'}]}, 'timestamp': '2025-11-28 09:58:00.731823', '_unique_id': '9c02f0c9021244858cbae5b995dd6741'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.734 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.734 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d3ab2c2-9b03-42a5-94ff-8b1dc22613dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.733993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94dcdd6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '0139bb7300224533ba780de2ff3b4fabef3196ff85b6c2806f9db4af360de88b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.733993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94dddf8-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '465bc5e0047df21aef6279323d3d54b85d07b0f29fb85af358fd8b9fef037687'}]}, 'timestamp': '2025-11-28 09:58:00.734902', '_unique_id': 'fb325843f531460dbd664aa361cabe07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7e4d95b-6010-4b62-baac-d58fed0f0551', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.737150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94e4798-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '5d77eeac4e27d087544f07e1f3a0182999d9f349e6ae2418c202b6becce2cf0a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.737150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94e579c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '33dd6ad122080d7714ce61f313af9d26b5367cbc1cf39a15c082c8263fdd4eaa'}]}, 'timestamp': '2025-11-28 09:58:00.738010', '_unique_id': '2f56664fea074f8eab1f08e583d32a05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65f712cb-9ccc-443b-91ba-6f84f2ac38f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.740250', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94ec204-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '2803d2bdd5543b629b210e6ae5ebecbecbdff726724f83db8cc1f4cc5ae88063'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.740250', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94ed212-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '0647523d875c7825f0d65249d12b689f0a8c11d345d160adbf56358f498597a3'}]}, 'timestamp': '2025-11-28 09:58:00.741176', '_unique_id': '2f9e46dcd610456983d2b6d71aea741b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.743 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '735547e4-b61c-4184-bf7d-564485fa2c8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:58:00.743360', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b95205f4-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.933456905, 'message_signature': 'efc781793bc830662eea4c69bb40c0ac2443cedc693e965f1a607af9348a9d1b'}]}, 'timestamp': '2025-11-28 09:58:00.762162', '_unique_id': 'b0329057ddaa4646aff08f86371358de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.764 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c17ce39a-5727-47f4-8a50-4c43594b536b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.764312', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9526e0e-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '388e6c287398fa6697d56d952b00d04327e40f56b9f01c1de93a50bb15d1c756'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.764312', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9528074-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '93290285964464c84d7152c1613c2999ffe0c864eb0bd2039dd714d37141c938'}]}, 'timestamp': '2025-11-28 09:58:00.765250', '_unique_id': 'f60e219ddfbd4a78a8cb06543b88dc6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a02bc1b8-4b58-4494-8456-cd362ef9c92f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.767662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b952f004-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '69cdf7ae2931186562410179cbd5e9e4b49a7ae029d5e734a20ba68538b5a28e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.767662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b95301de-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '3c737cb06a3282e1b9a60cfb280c79133bf6726a2e5bc06f3d6cb7ecfaab1017'}]}, 'timestamp': '2025-11-28 09:58:00.768585', '_unique_id': 'e2aa3816ce0b42f5a4790d0b8e35d8d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 14550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e17382f6-e00f-42fd-b13b-e7e69b91d29d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14550000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:58:00.770816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b9536bb0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.933456905, 'message_signature': '2b3b86cc4341f2baf3e94864e0d0f5ad0101aed18d3c0d62ed44dd13eb740dc7'}]}, 'timestamp': '2025-11-28 09:58:00.771283', '_unique_id': '51ca6bbff7c248d5abede5120a017b6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5362d2af-d200-4f95-8151-d8bed4d7f17d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.773372', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b953ce34-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '73114aff2dfc8d7a8ff38270bd8fc3c4c319553e1f9099fce45ce2c2c770c96f'}]}, 'timestamp': '2025-11-28 09:58:00.773813', '_unique_id': '5830643199cd40c88f4a723b637c099d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e58ef211-a45c-43b7-8930-9cc2cffc6094', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.775824', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9542f0a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '982a4fd72d7d871fac64f3aeca234022b1b88ce7ccd0a02c23b032d5a30a0aed'}]}, 'timestamp': '2025-11-28 09:58:00.776293', '_unique_id': '5aaacfddad3849db937ea3c207a98959'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '581260ba-a6f1-460c-942f-ebd38a9592d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.778330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9548fc2-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '6f47e2343bfc887fbcafddb1573dce07a17766248b9dd174e69d94c200086461'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.778330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9549f3a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': 'f07d07f12eb6352d6cf434e728ba10520c586f7338546b87094eb16d928ee768'}]}, 'timestamp': '2025-11-28 09:58:00.779166', '_unique_id': '34559702dc9b4854affaa68d236c516c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bba8455-c7ac-46d3-9237-f943ed1bff05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.781240', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b95501be-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '39d8e722c5054b99ff98c021bcab27398bef4428b533ca4a574815344a3243de'}]}, 'timestamp': '2025-11-28 09:58:00.781685', '_unique_id': 'fa43228f6fd24ab4b42bd6226b33df5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52c1552a-87d5-4c0f-9fa4-a2d1b7f6c1fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.783697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9556136-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '7485c19660a8373d7e8c245726bb6b52959fce92a76f2fedf4c306b56def12f4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.783697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9557220-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '3d40489de150342ae5a8b6261a6a08154afe3833d8852a1846ba4534426a06e1'}]}, 'timestamp': '2025-11-28 09:58:00.784559', '_unique_id': 'c84b6963a25444a4900f00cc2542bba3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73fbef90-9330-4d7b-b0a2-7cbadbe5d5cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.786999', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b955e430-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '321742c1e55756f229147e6e1f98c930a2e536abdd1376bd39ac21c2cc61212d'}]}, 'timestamp': '2025-11-28 09:58:00.787488', '_unique_id': 'faa1bf61e93b4810804ccbb9139b8548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.789 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2b1a205-7d28-40a7-a2c3-f68a0d2e6833', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.789521', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9564510-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '0f0c67f84fc5a12ab2e06e52ff8c5774d870d0e36ddb1f01f2ae188bf29b3580'}]}, 'timestamp': '2025-11-28 09:58:00.789962', '_unique_id': '25f0cb0f2cbb4d0080a1a7bbe3ef7f39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7772d74-c8b1-4d77-8ffa-eed95b26a73e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.792127', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b956ab04-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '75ce98269882f3fe13da93df27fe2416f8194996238ae34f57326c3b4ebfb54d'}]}, 'timestamp': '2025-11-28 09:58:00.792595', '_unique_id': 'babf634c34e74e4dbcc0d6387fef2267'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 09:58:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.794 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:01.003 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:01 np0005538513.localdomain ceph-mon[292954]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:02 np0005538513.localdomain ceph-mon[292954]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 09:58:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2455933958' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 09:58:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/2455933958' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 09:58:04 np0005538513.localdomain sudo[305490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:04 np0005538513.localdomain sudo[305490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:04.170 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:04 np0005538513.localdomain sudo[305490]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:04 np0005538513.localdomain sudo[305508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:58:04 np0005538513.localdomain sudo[305508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:04 np0005538513.localdomain sudo[305508]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:58:05 np0005538513.localdomain sudo[305558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:58:05 np0005538513.localdomain sudo[305558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:05 np0005538513.localdomain sudo[305558]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 do_prune osdmap full prune enabled
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Activating manager daemon np0005538515.yfkzhl
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 e92: 6 total, 6 up, 6 in
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e41: np0005538515.yfkzhl(active, starting, since 0.0380796s), standbys: np0005538513.dsfdlx
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Manager daemon np0005538515.yfkzhl is now available
Nov 28 09:58:05 np0005538513.localdomain sshd[303654]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:58:05 np0005538513.localdomain systemd[1]: session-71.scope: Deactivated successfully.
Nov 28 09:58:05 np0005538513.localdomain systemd[1]: session-71.scope: Consumed 10.450s CPU time.
Nov 28 09:58:05 np0005538513.localdomain systemd-logind[764]: Session 71 logged out. Waiting for processes to exit.
Nov 28 09:58:05 np0005538513.localdomain systemd-logind[764]: Removed session 71.
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} v 0)
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: Activating manager daemon np0005538515.yfkzhl
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: osdmap e92: 6 total, 6 up, 6 in
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: mgrmap e41: np0005538515.yfkzhl(active, starting, since 0.0380796s), standbys: np0005538513.dsfdlx
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: Manager daemon np0005538515.yfkzhl is now available
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} v 0)
Nov 28 09:58:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:58:05 np0005538513.localdomain sshd[305576]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:58:05 np0005538513.localdomain sshd[305576]: Accepted publickey for ceph-admin from 192.168.122.108 port 58368 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:58:05 np0005538513.localdomain systemd-logind[764]: New session 72 of user ceph-admin.
Nov 28 09:58:05 np0005538513.localdomain systemd[1]: Started Session 72 of User ceph-admin.
Nov 28 09:58:05 np0005538513.localdomain sshd[305576]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:58:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:06.038 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:06 np0005538513.localdomain sudo[305580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:06 np0005538513.localdomain sudo[305580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:06 np0005538513.localdomain sudo[305580]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:06 np0005538513.localdomain sudo[305598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:58:06 np0005538513.localdomain sudo[305598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:06 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e42: np0005538515.yfkzhl(active, since 1.05546s), standbys: np0005538513.dsfdlx
Nov 28 09:58:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:58:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:58:06 np0005538513.localdomain ceph-mon[292954]: mgrmap e42: np0005538515.yfkzhl(active, since 1.05546s), standbys: np0005538513.dsfdlx
Nov 28 09:58:06 np0005538513.localdomain podman[305689]: 2025-11-28 09:58:06.980115969 +0000 UTC m=+0.092917750 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:58:07 np0005538513.localdomain podman[305689]: 2025-11-28 09:58:07.082931362 +0000 UTC m=+0.195733153 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : Cluster is now healthy
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain sudo[305598]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:58:06] ENGINE Bus STARTING
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:58:06] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:58:07] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:58:07] ENGINE Bus STARTED
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: [28/Nov/2025:09:58:07] ENGINE Client ('172.18.0.108', 56132) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: Cluster is now healthy
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:58:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538513.localdomain sudo[305805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:07 np0005538513.localdomain sudo[305805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:07 np0005538513.localdomain sudo[305805]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:07 np0005538513.localdomain sudo[305823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:58:07 np0005538513.localdomain sudo[305823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:58:08 np0005538513.localdomain systemd[1]: tmp-crun.oOiKjb.mount: Deactivated successfully.
Nov 28 09:58:08 np0005538513.localdomain podman[305841]: 2025-11-28 09:58:08.032046397 +0000 UTC m=+0.092748545 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 28 09:58:08 np0005538513.localdomain podman[305841]: 2025-11-28 09:58:08.048221452 +0000 UTC m=+0.108923600 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Nov 28 09:58:08 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:58:08 np0005538513.localdomain sudo[305823]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:08 np0005538513.localdomain sudo[305894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:08 np0005538513.localdomain sudo[305894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:08 np0005538513.localdomain sudo[305894]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538513.localdomain sudo[305912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:58:08 np0005538513.localdomain sudo[305912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e43: np0005538515.yfkzhl(active, since 3s), standbys: np0005538513.dsfdlx
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:08 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:58:09 np0005538513.localdomain sudo[305912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:09.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:58:09 np0005538513.localdomain sudo[305949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:58:09 np0005538513.localdomain sudo[305949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[305949]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain sudo[305967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:58:09 np0005538513.localdomain sudo[305967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[305967]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain sudo[305985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538513.localdomain sudo[305985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[305985]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain sudo[306003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:09 np0005538513.localdomain sudo[306003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[306003]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain sudo[306021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538513.localdomain sudo[306021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[306021]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: mgrmap e43: np0005538515.yfkzhl(active, since 3s), standbys: np0005538513.dsfdlx
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538513.localdomain sudo[306055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538513.localdomain sudo[306055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[306055]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain sudo[306073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538513.localdomain sudo[306073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[306073]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538513.localdomain sudo[306091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538513.localdomain sudo[306091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538513.localdomain sudo[306091]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:10 np0005538513.localdomain sudo[306109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306109]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:58:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:58:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:58:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:58:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:58:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18743 "" "Go-http-client/1.1"
Nov 28 09:58:10 np0005538513.localdomain sudo[306127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:10 np0005538513.localdomain sudo[306127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306127]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538513.localdomain sudo[306145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306145]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:10 np0005538513.localdomain sudo[306163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306163]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538513.localdomain sudo[306181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306181]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538513.localdomain sudo[306215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306215]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538513.localdomain sudo[306233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306233]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : Standby manager daemon np0005538514.djozup started
Nov 28 09:58:10 np0005538513.localdomain sudo[306251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:10 np0005538513.localdomain sudo[306251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306251]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:58:10 np0005538513.localdomain sudo[306269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306269]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:58:10 np0005538513.localdomain sudo[306287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306287]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538513.localdomain sudo[306305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:10 np0005538513.localdomain sudo[306305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538513.localdomain sudo[306305]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain sudo[306323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:11 np0005538513.localdomain sudo[306323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306323]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:11.042 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:11 np0005538513.localdomain sudo[306341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538513.localdomain sudo[306341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306341]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: Standby manager daemon np0005538514.djozup started
Nov 28 09:58:11 np0005538513.localdomain sudo[306375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538513.localdomain sudo[306375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306375]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e44: np0005538515.yfkzhl(active, since 5s), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 09:58:11 np0005538513.localdomain sudo[306393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538513.localdomain sudo[306393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306393]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain sudo[306411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538513.localdomain sudo[306411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306411]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain sudo[306429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:11 np0005538513.localdomain sudo[306429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306429]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain sudo[306447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:11 np0005538513.localdomain sudo[306447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306447]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain sudo[306465]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538513.localdomain sudo[306465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306465]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain sudo[306483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:11 np0005538513.localdomain sudo[306483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306483]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain sudo[306501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538513.localdomain sudo[306501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306501]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:58:11 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:11 np0005538513.localdomain sudo[306535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538513.localdomain sudo[306535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538513.localdomain sudo[306535]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain sudo[306553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:12 np0005538513.localdomain sudo[306553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:12 np0005538513.localdomain sudo[306553]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:12 np0005538513.localdomain sudo[306571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538513.localdomain sudo[306571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:12 np0005538513.localdomain sudo[306571]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: mgrmap e44: np0005538515.yfkzhl(active, since 5s), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:12 np0005538513.localdomain sudo[306589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:58:12 np0005538513.localdomain sudo[306589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:12 np0005538513.localdomain sudo[306589]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:58:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538513.localdomain sudo[306607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:58:12 np0005538513.localdomain sudo[306607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:12 np0005538513.localdomain sudo[306607]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 09:58:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:58:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:58:13 np0005538513.localdomain systemd[1]: tmp-crun.dAdJv5.mount: Deactivated successfully.
Nov 28 09:58:13 np0005538513.localdomain podman[306625]: 2025-11-28 09:58:13.872357412 +0000 UTC m=+0.103883998 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:58:13 np0005538513.localdomain podman[306625]: 2025-11-28 09:58:13.886516206 +0000 UTC m=+0.118042772 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:58:13 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:58:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:14.209 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:58:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:58:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:15 np0005538513.localdomain ceph-mon[292954]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:58:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:58:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:58:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:58:15 np0005538513.localdomain podman[306648]: 2025-11-28 09:58:15.855731313 +0000 UTC m=+0.082841671 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:58:15 np0005538513.localdomain podman[306649]: 2025-11-28 09:58:15.948433326 +0000 UTC m=+0.173848542 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:58:15 np0005538513.localdomain podman[306648]: 2025-11-28 09:58:15.959438974 +0000 UTC m=+0.186549352 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:58:15 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:58:15 np0005538513.localdomain podman[306649]: 2025-11-28 09:58:15.98641103 +0000 UTC m=+0.211826206 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 09:58:16 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:58:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:16.045 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:16 np0005538513.localdomain ceph-mon[292954]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:58:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:58:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:58:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:58:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:58:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:58:18 np0005538513.localdomain ceph-mon[292954]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:58:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:58:18 np0005538513.localdomain podman[306690]: 2025-11-28 09:58:18.855714658 +0000 UTC m=+0.093470097 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible)
Nov 28 09:58:18 np0005538513.localdomain podman[306690]: 2025-11-28 09:58:18.87041041 +0000 UTC m=+0.108165819 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:58:18 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:58:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:19.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:20 np0005538513.localdomain ceph-mon[292954]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:21.046 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:22 np0005538513.localdomain ceph-mon[292954]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:24.259 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:24 np0005538513.localdomain ceph-mon[292954]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:26.089 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:26 np0005538513.localdomain ceph-mon[292954]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:58:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:58:28 np0005538513.localdomain ceph-mon[292954]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:28 np0005538513.localdomain podman[306710]: 2025-11-28 09:58:28.846690547 +0000 UTC m=+0.077309962 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:58:28 np0005538513.localdomain podman[306710]: 2025-11-28 09:58:28.857414026 +0000 UTC m=+0.088033471 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:58:28 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:58:28 np0005538513.localdomain podman[306711]: 2025-11-28 09:58:28.909135641 +0000 UTC m=+0.136269150 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 09:58:28 np0005538513.localdomain podman[306711]: 2025-11-28 09:58:28.946511157 +0000 UTC m=+0.173644686 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:58:28 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:58:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:29.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:30 np0005538513.localdomain ceph-mon[292954]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:31.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:32 np0005538513.localdomain ceph-mon[292954]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:34.444 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:58:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5057 writes, 22K keys, 5057 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5057 writes, 683 syncs, 7.40 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 86 writes, 299 keys, 86 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s
                                                          Interval WAL: 86 writes, 39 syncs, 2.21 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:58:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:34 np0005538513.localdomain ceph-mon[292954]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:35.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:35.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:35.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:58:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:36.121 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:36.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:36 np0005538513.localdomain ceph-mon[292954]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:58:38 np0005538513.localdomain podman[306751]: 2025-11-28 09:58:38.839251722 +0000 UTC m=+0.079704896 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, 
architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:58:38 np0005538513.localdomain podman[306751]: 2025-11-28 09:58:38.882441616 +0000 UTC m=+0.122894750 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:58:38 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:58:38 np0005538513.localdomain ceph-mon[292954]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:39.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:58:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.3 total, 600.0 interval
                                                          Cumulative writes: 5847 writes, 25K keys, 5847 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5847 writes, 861 syncs, 6.79 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 165 writes, 340 keys, 165 commit groups, 1.0 writes per commit group, ingest: 0.33 MB, 0.00 MB/s
                                                          Interval WAL: 165 writes, 82 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:58:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:39.787 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:58:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:39.787 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:58:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:39.788 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:58:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:39.788 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:58:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:39.789 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:58:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:58:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:58:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:58:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:58:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:58:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18739 "" "Go-http-client/1.1"
Nov 28 09:58:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:58:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4221705048' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.242 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.300 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.300 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.522 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.523 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11739MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.524 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.524 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.594 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.595 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.595 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:58:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:40.633 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:58:40 np0005538513.localdomain ceph-mon[292954]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4221705048' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2543969539' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:58:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1527944200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:41.081 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:58:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:41.088 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:58:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:41.104 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:58:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:41.107 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:58:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:41.107 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:58:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:41.123 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1527944200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1985352112' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:42 np0005538513.localdomain ceph-mon[292954]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/355013671' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/716858050' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:44.336 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:58:44 np0005538513.localdomain systemd[1]: tmp-crun.NyINIH.mount: Deactivated successfully.
Nov 28 09:58:44 np0005538513.localdomain podman[306815]: 2025-11-28 09:58:44.829135733 +0000 UTC m=+0.070742510 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:58:44 np0005538513.localdomain podman[306815]: 2025-11-28 09:58:44.865348884 +0000 UTC m=+0.106955661 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:58:44 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:58:45 np0005538513.localdomain ceph-mon[292954]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.110 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.110 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.111 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.230 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.231 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.614 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:58:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:46.628 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:58:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:58:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:58:46 np0005538513.localdomain podman[306839]: 2025-11-28 09:58:46.846033482 +0000 UTC m=+0.083345107 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:58:46 np0005538513.localdomain systemd[1]: tmp-crun.gXDztf.mount: Deactivated successfully.
Nov 28 09:58:46 np0005538513.localdomain podman[306840]: 2025-11-28 09:58:46.905937078 +0000 UTC m=+0.139750126 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:58:46 np0005538513.localdomain podman[306839]: 2025-11-28 09:58:46.914566143 +0000 UTC m=+0.151877718 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 09:58:46 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:58:46 np0005538513.localdomain podman[306840]: 2025-11-28 09:58:46.939868219 +0000 UTC m=+0.173681287 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:58:46 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:58:47 np0005538513.localdomain ceph-mon[292954]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:58:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:58:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:58:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:58:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:58:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:58:49 np0005538513.localdomain ceph-mon[292954]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:49.379 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:58:49 np0005538513.localdomain podman[306882]: 2025-11-28 09:58:49.847572034 +0000 UTC m=+0.084379148 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute)
Nov 28 09:58:49 np0005538513.localdomain podman[306882]: 2025-11-28 09:58:49.860423759 +0000 UTC m=+0.097230933 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:58:49 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:58:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:58:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:58:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:58:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:58:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:58:50.840 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:58:51 np0005538513.localdomain ceph-mon[292954]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:51.174 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:53 np0005538513.localdomain ceph-mon[292954]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:54.429 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:55 np0005538513.localdomain ceph-mon[292954]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:56.198 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:57 np0005538513.localdomain ceph-mon[292954]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:59 np0005538513.localdomain ceph-mon[292954]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:58:59.432 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:58:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:58:59 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:58:59 np0005538513.localdomain podman[306901]: 2025-11-28 09:58:59.852301428 +0000 UTC m=+0.087973880 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:58:59 np0005538513.localdomain podman[306901]: 2025-11-28 09:58:59.863347286 +0000 UTC m=+0.099019758 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:58:59 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:58:59 np0005538513.localdomain podman[306902]: 2025-11-28 09:58:59.957190574 +0000 UTC m=+0.189651137 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:58:59 np0005538513.localdomain podman[306902]: 2025-11-28 09:58:59.994417105 +0000 UTC m=+0.226877678 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:59:00 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:59:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:01.201 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:01 np0005538513.localdomain ceph-mon[292954]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:03 np0005538513.localdomain ceph-mon[292954]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:04.457 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:05 np0005538513.localdomain ceph-mon[292954]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:06.238 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:07 np0005538513.localdomain ceph-mon[292954]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:09 np0005538513.localdomain ceph-mon[292954]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:09.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:59:09 np0005538513.localdomain systemd[1]: tmp-crun.g51wvr.mount: Deactivated successfully.
Nov 28 09:59:09 np0005538513.localdomain podman[306942]: 2025-11-28 09:59:09.858247797 +0000 UTC m=+0.093339824 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:59:09 np0005538513.localdomain podman[306942]: 2025-11-28 09:59:09.898514501 +0000 UTC m=+0.133606538 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:59:09 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:59:10 np0005538513.localdomain podman[238687]: time="2025-11-28T09:59:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:59:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:59:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:59:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:59:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18737 "" "Go-http-client/1.1"
Nov 28 09:59:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:11.241 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:11 np0005538513.localdomain ceph-mon[292954]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:12 np0005538513.localdomain sudo[306962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:59:12 np0005538513.localdomain sudo[306962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:59:12 np0005538513.localdomain sudo[306962]: pam_unix(sudo:session): session closed for user root
Nov 28 09:59:12 np0005538513.localdomain sudo[306980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:59:12 np0005538513.localdomain sudo[306980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:59:13 np0005538513.localdomain sudo[306980]: pam_unix(sudo:session): session closed for user root
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:59:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:59:13 np0005538513.localdomain sudo[307031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:59:13 np0005538513.localdomain sudo[307031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:59:13 np0005538513.localdomain sudo[307031]: pam_unix(sudo:session): session closed for user root
Nov 28 09:59:14 np0005538513.localdomain ceph-mon[292954]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:59:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:59:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:59:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:59:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:14.462 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:59:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:59:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:59:15 np0005538513.localdomain podman[307049]: 2025-11-28 09:59:15.854268477 +0000 UTC m=+0.087861795 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:59:15 np0005538513.localdomain podman[307049]: 2025-11-28 09:59:15.891505589 +0000 UTC m=+0.125098847 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:59:15 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:59:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:16.271 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:16 np0005538513.localdomain ceph-mon[292954]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:59:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:59:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:59:17 np0005538513.localdomain podman[307072]: 2025-11-28 09:59:17.851126282 +0000 UTC m=+0.083170622 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 09:59:17 np0005538513.localdomain podman[307073]: 2025-11-28 09:59:17.935265602 +0000 UTC m=+0.162412191 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 09:59:17 np0005538513.localdomain podman[307073]: 2025-11-28 09:59:17.94365506 +0000 UTC m=+0.170801669 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:17 np0005538513.localdomain podman[307072]: 2025-11-28 09:59:17.951453518 +0000 UTC m=+0.183497848 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:59:17 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:59:17 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:59:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:59:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:59:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:59:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:59:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:59:18 np0005538513.localdomain ceph-mon[292954]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:19.488 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:59:20 np0005538513.localdomain ceph-mon[292954]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:20 np0005538513.localdomain systemd[1]: tmp-crun.A3VpAz.mount: Deactivated successfully.
Nov 28 09:59:20 np0005538513.localdomain podman[307114]: 2025-11-28 09:59:20.85545841 +0000 UTC m=+0.092514798 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 09:59:20 np0005538513.localdomain podman[307114]: 2025-11-28 09:59:20.865397705 +0000 UTC m=+0.102454063 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 28 09:59:20 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:59:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:21.275 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.799368) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962799406, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2315, "num_deletes": 253, "total_data_size": 3904180, "memory_usage": 4047024, "flush_reason": "Manual Compaction"}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962818538, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 3664999, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20568, "largest_seqno": 22882, "table_properties": {"data_size": 3655199, "index_size": 6049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22841, "raw_average_key_size": 21, "raw_value_size": 3634616, "raw_average_value_size": 3451, "num_data_blocks": 261, "num_entries": 1053, "num_filter_entries": 1053, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323814, "oldest_key_time": 1764323814, "file_creation_time": 1764323962, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 19210 microseconds, and 8585 cpu microseconds.
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.818576) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 3664999 bytes OK
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.818598) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820188) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820205) EVENT_LOG_v1 {"time_micros": 1764323962820201, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820222) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3894106, prev total WAL file size 3894106, number of live WAL files 2.
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820985) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(3579KB)], [33(17MB)]
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962821073, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 21675736, "oldest_snapshot_seqno": -1}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12154 keys, 18737772 bytes, temperature: kUnknown
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962905501, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 18737772, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18666947, "index_size": 39364, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30405, "raw_key_size": 324272, "raw_average_key_size": 26, "raw_value_size": 18458636, "raw_average_value_size": 1518, "num_data_blocks": 1508, "num_entries": 12154, "num_filter_entries": 12154, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323962, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.905770) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 18737772 bytes
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.907395) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 256.5 rd, 221.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 17.2 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(11.0) write-amplify(5.1) OK, records in: 12691, records dropped: 537 output_compression: NoCompression
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.907412) EVENT_LOG_v1 {"time_micros": 1764323962907404, "job": 18, "event": "compaction_finished", "compaction_time_micros": 84502, "compaction_time_cpu_micros": 40381, "output_level": 6, "num_output_files": 1, "total_output_size": 18737772, "num_input_records": 12691, "num_output_records": 12154, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962907821, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962908901, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908923) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908926) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:24.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:24 np0005538513.localdomain ceph-mon[292954]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:26.302 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:26 np0005538513.localdomain ceph-mon[292954]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:28 np0005538513.localdomain ceph-mon[292954]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:29.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 09:59:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 09:59:30 np0005538513.localdomain systemd[1]: tmp-crun.i9rMQh.mount: Deactivated successfully.
Nov 28 09:59:30 np0005538513.localdomain ceph-mon[292954]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:30 np0005538513.localdomain podman[307133]: 2025-11-28 09:59:30.901841136 +0000 UTC m=+0.137905939 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:59:30 np0005538513.localdomain podman[307134]: 2025-11-28 09:59:30.862411827 +0000 UTC m=+0.094708025 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:59:30 np0005538513.localdomain podman[307134]: 2025-11-28 09:59:30.946529177 +0000 UTC m=+0.178825345 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:59:30 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 09:59:30 np0005538513.localdomain podman[307133]: 2025-11-28 09:59:30.965689115 +0000 UTC m=+0.201753888 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:59:30 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 09:59:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:31.305 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:32 np0005538513.localdomain ceph-mon[292954]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:34.568 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:34 np0005538513.localdomain ceph-mon[292954]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:35.769 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:35.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:59:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:36.326 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:36.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:36.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:36 np0005538513.localdomain ceph-mon[292954]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:36 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e45: np0005538515.yfkzhl(active, since 91s), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 09:59:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:37 np0005538513.localdomain ceph-mon[292954]: mgrmap e45: np0005538515.yfkzhl(active, since 91s), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 09:59:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:38.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:39 np0005538513.localdomain ceph-mon[292954]: pgmap v49: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.570 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.794 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:59:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:39.794 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:59:40 np0005538513.localdomain podman[238687]: time="2025-11-28T09:59:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:59:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:59:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 09:59:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:09:59:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18744 "" "Go-http-client/1.1"
Nov 28 09:59:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:59:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1564004805' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.290 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.492 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.493 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.710 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.712 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11677MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.713 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.713 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:59:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.791 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.792 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.792 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:59:40 np0005538513.localdomain systemd[1]: tmp-crun.K88qlF.mount: Deactivated successfully.
Nov 28 09:59:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:40.846 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:59:40 np0005538513.localdomain podman[307198]: 2025-11-28 09:59:40.853383216 +0000 UTC m=+0.091518948 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 28 09:59:40 np0005538513.localdomain podman[307198]: 2025-11-28 09:59:40.869913333 +0000 UTC m=+0.108049065 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 28 09:59:40 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 09:59:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:40.999 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:59:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:41.000 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 09:59:41 np0005538513.localdomain ceph-mon[292954]: pgmap v50: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1564004805' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:41.035 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:41.328 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:59:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/986677832' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:41.347 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:59:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:41.354 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:59:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:41.373 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:59:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:41.376 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:59:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:41.376 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:59:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/986677832' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:43 np0005538513.localdomain ceph-mon[292954]: pgmap v51: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3275500434' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.218 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp35q7603z/privsep.sock']
Nov 28 09:59:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.843 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:59:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.728 307245 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:59:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.735 307245 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:59:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.738 307245 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 28 09:59:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.739 307245 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307245
Nov 28 09:59:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3064234323' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/164218660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1537443536' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.387 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpjkovtoe5/privsep.sock']
Nov 28 09:59:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:44.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:45.041 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:59:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.934 307254 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:59:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.939 307254 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:59:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.942 307254 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 28 09:59:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.943 307254 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307254
Nov 28 09:59:45 np0005538513.localdomain ceph-mon[292954]: pgmap v52: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 09:59:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.016 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmptvywk7ck/privsep.sock']
Nov 28 09:59:46 np0005538513.localdomain podman[307262]: 2025-11-28 09:59:46.082951772 +0000 UTC m=+0.067331486 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:59:46 np0005538513.localdomain podman[307262]: 2025-11-28 09:59:46.089955147 +0000 UTC m=+0.074334851 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:59:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 do_prune osdmap full prune enabled
Nov 28 09:59:46 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 09:59:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e93 e93: 6 total, 6 up, 6 in
Nov 28 09:59:46 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e93: 6 total, 6 up, 6 in
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.364 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.377 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.378 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.378 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.459 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.460 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.460 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 09:59:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:46.461 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 09:59:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.624 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:59:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.524 307289 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:59:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.528 307289 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:59:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.532 307289 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 09:59:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.532 307289 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307289
Nov 28 09:59:47 np0005538513.localdomain ceph-mon[292954]: pgmap v53: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:47 np0005538513.localdomain ceph-mon[292954]: osdmap e93: 6 total, 6 up, 6 in
Nov 28 09:59:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:47.608 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 09:59:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:47.622 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:59:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:47.623 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 09:59:48 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:48.017 261084 INFO neutron.agent.linux.ip_lib [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Device tap1686a1d1-ca cannot be used as it has no MAC address
Nov 28 09:59:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:48.086 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:59:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:48 np0005538513.localdomain kernel: device tap1686a1d1-ca entered promiscuous mode
Nov 28 09:59:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:59:48Z|00071|binding|INFO|Claiming lport 1686a1d1-caea-4208-9d74-34f3140388c4 for this chassis.
Nov 28 09:59:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:48.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:59:48Z|00072|binding|INFO|1686a1d1-caea-4208-9d74-34f3140388c4: Claiming unknown
Nov 28 09:59:48 np0005538513.localdomain NetworkManager[5967]: <info>  [1764323988.1003] manager: (tap1686a1d1-ca): new Generic device (/org/freedesktop/NetworkManager/Devices/17)
Nov 28 09:59:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:59:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:59:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   09:59:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:59:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 09:59:48 np0005538513.localdomain systemd-udevd[307304]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 09:59:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 09:59:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:59:48Z|00073|binding|INFO|Setting lport 1686a1d1-caea-4208-9d74-34f3140388c4 ovn-installed in OVS
Nov 28 09:59:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:48.117 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:48.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e93 do_prune osdmap full prune enabled
Nov 28 09:59:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T09:59:48Z|00074|binding|INFO|Setting lport 1686a1d1-caea-4208-9d74-34f3140388c4 up in Southbound
Nov 28 09:59:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:48.121 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-5d25adc8-19ca-4816-87ea-2f93f610a253', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d25adc8-19ca-4816-87ea-2f93f610a253', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c326253de5044a60be18dcfa12e29a2c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97b765ce-e608-49a3-80f7-70d475d900a9, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=1686a1d1-caea-4208-9d74-34f3140388c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:59:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:48.123 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1686a1d1-caea-4208-9d74-34f3140388c4 in datapath 5d25adc8-19ca-4816-87ea-2f93f610a253 bound to our chassis
Nov 28 09:59:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 09:59:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:48.128 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6c531ff3-3ac3-49e1-bdb8-4b1db0bd1f7e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 09:59:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:48.129 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d25adc8-19ca-4816-87ea-2f93f610a253, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 09:59:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:48.130 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4d195c6c-b3a6-4cae-aa2d-76c2ec29afe8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: hostname: np0005538513.localdomain
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:48.148 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 e94: 6 total, 6 up, 6 in
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e94: 6 total, 6 up, 6 in
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device
Nov 28 09:59:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:48.196 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:48.223 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:48 np0005538513.localdomain podman[307307]: 2025-11-28 09:59:48.229418495 +0000 UTC m=+0.104057492 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:59:48 np0005538513.localdomain podman[307306]: 2025-11-28 09:59:48.241288248 +0000 UTC m=+0.113628796 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:59:48 np0005538513.localdomain podman[307307]: 2025-11-28 09:59:48.267416179 +0000 UTC m=+0.142055106 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:48 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 09:59:48 np0005538513.localdomain podman[307306]: 2025-11-28 09:59:48.299248626 +0000 UTC m=+0.171589163 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:59:48 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 09:59:49 np0005538513.localdomain podman[307420]: 
Nov 28 09:59:49 np0005538513.localdomain podman[307420]: 2025-11-28 09:59:49.090106348 +0000 UTC m=+0.080117158 container create cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d25adc8-19ca-4816-87ea-2f93f610a253, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:49 np0005538513.localdomain systemd[1]: Started libpod-conmon-cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd.scope.
Nov 28 09:59:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 09:59:49 np0005538513.localdomain podman[307420]: 2025-11-28 09:59:49.044887981 +0000 UTC m=+0.034898751 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:59:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2c22b3950f04e53676908bff2fdb47d1d2f45894d34c3bc4830014816f6320d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:59:49 np0005538513.localdomain podman[307420]: 2025-11-28 09:59:49.158056111 +0000 UTC m=+0.148066911 container init cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d25adc8-19ca-4816-87ea-2f93f610a253, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:59:49 np0005538513.localdomain podman[307420]: 2025-11-28 09:59:49.166183691 +0000 UTC m=+0.156194451 container start cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d25adc8-19ca-4816-87ea-2f93f610a253, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:59:49 np0005538513.localdomain ceph-mon[292954]: pgmap v55: 177 pgs: 177 active+clean; 125 MiB data, 619 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Nov 28 09:59:49 np0005538513.localdomain ceph-mon[292954]: osdmap e94: 6 total, 6 up, 6 in
Nov 28 09:59:49 np0005538513.localdomain dnsmasq[307439]: started, version 2.85 cachesize 150
Nov 28 09:59:49 np0005538513.localdomain dnsmasq[307439]: DNS service limited to local subnets
Nov 28 09:59:49 np0005538513.localdomain dnsmasq[307439]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 09:59:49 np0005538513.localdomain dnsmasq[307439]: warning: no upstream servers configured
Nov 28 09:59:49 np0005538513.localdomain dnsmasq-dhcp[307439]: DHCP, static leases only on 192.168.199.0, lease time 1d
Nov 28 09:59:49 np0005538513.localdomain dnsmasq[307439]: read /var/lib/neutron/dhcp/5d25adc8-19ca-4816-87ea-2f93f610a253/addn_hosts - 0 addresses
Nov 28 09:59:49 np0005538513.localdomain dnsmasq-dhcp[307439]: read /var/lib/neutron/dhcp/5d25adc8-19ca-4816-87ea-2f93f610a253/host
Nov 28 09:59:49 np0005538513.localdomain dnsmasq-dhcp[307439]: read /var/lib/neutron/dhcp/5d25adc8-19ca-4816-87ea-2f93f610a253/opts
Nov 28 09:59:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:49.605 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:49 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 09:59:49.820 261084 INFO neutron.agent.dhcp.agent [None req-0eda4073-35cd-43fe-8c0c-1ef507f749f8 - - - - - -] DHCP configuration for ports {'fb2fd669-dbf0-4ffc-99a7-fb5973f387f6'} is completed
Nov 28 09:59:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:50.840 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:59:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:59:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:59:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 09:59:51.002 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:59:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:51.012 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:51 np0005538513.localdomain ceph-mon[292954]: pgmap v57: 177 pgs: 177 active+clean; 125 MiB data, 619 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.6 MiB/s wr, 24 op/s
Nov 28 09:59:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:51.397 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 09:59:51 np0005538513.localdomain podman[307440]: 2025-11-28 09:59:51.851783836 +0000 UTC m=+0.081890343 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:59:51 np0005538513.localdomain podman[307440]: 2025-11-28 09:59:51.866353602 +0000 UTC m=+0.096460109 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:51 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 09:59:53 np0005538513.localdomain ceph-mon[292954]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Nov 28 09:59:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:54.641 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:55 np0005538513.localdomain ceph-mon[292954]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Nov 28 09:59:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:56.435 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:57 np0005538513.localdomain ceph-mon[292954]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 4.3 MiB/s wr, 40 op/s
Nov 28 09:59:59 np0005538513.localdomain ceph-mon[292954]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 18 op/s
Nov 28 09:59:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 09:59:59.643 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 09:59:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:00 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.679 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fcf2ef0-da97-4d6b-b949-7f5957166d51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.675893', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00cc0ee8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '708b3d23a8558ce9fd8c5a101dac69d363494e10f163bda1ca95326935519461'}]}, 'timestamp': '2025-11-28 10:00:00.680238', '_unique_id': 'b6957b2a1aa64c6eaabff00c46c5ccda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.683 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.683 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af622ff2-193e-46c0-83c6-d8dadeab421e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.683264', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00cc9bb0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '05abe9605d585255f12cf26c7ae1f04a5437258b0cc329a6764c34a3602d9808'}]}, 'timestamp': '2025-11-28 10:00:00.683770', '_unique_id': 'd7ffcff4df484ee280990dab24729c94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.686 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30af38aa-c6e7-48fe-a84a-5d4ed8e4322c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.686006', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00cd07bc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'e5ed4158c320833ea36316b980798e71c1a855b5238c9ebe24be83298e918e5f'}]}, 'timestamp': '2025-11-28 10:00:00.686530', '_unique_id': '729e9e8119a6401d8885a6f0a681f1b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.715 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.715 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbbbd770-f7d3-4f9b-8a22-813d5ac15300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.688768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d17fea-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '1ad95c6468c92c467dc069736c58cebd413726da091f36ceb77aacdd4c3cb291'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.688768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d193fe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '53a2c4ed7ff22f2171934b94b1cab5e7bd2853be6e98d213dc41ff409a213e27'}]}, 'timestamp': '2025-11-28 10:00:00.716303', '_unique_id': '979b6a58f3f44e79b1ff41bfce8cb2ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.718 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.718 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.719 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0957538-bcc1-4205-8aa0-927eccb35529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.718559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d1fda8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'cdab6bfdd1344717bf9b5804acdb98b49a5be1263f50890f74960426c7673929'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.718559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d21022-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'b45025be4ccea91b6d87418a4f31dcc4b671dddafcb212bc1a541d418e867448'}]}, 'timestamp': '2025-11-28 10:00:00.719477', '_unique_id': 'dcf87a874ec942e9995f8b605a684668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.721 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ff7dc53-8fb7-40a9-9943-5dcad6a6f135', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.721724', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d278f0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '076ff3c2e16fff47d3c894003750fc2f0fb315d2c80eb0d69f127fffad5fa999'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.721724', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d28b24-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '92c62f570e9062ef06607485fe0f65d89240e254b8300373a6b922d89069422f'}]}, 'timestamp': '2025-11-28 10:00:00.722625', '_unique_id': '9482be48f20142df80249cef056e2a10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.724 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5edd321c-6f05-448e-b215-29cb1b0044b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.724849', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d2f4b0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'd98c1ece93ae85d17dbcfe61467586528b271c68d518d63a805d20680ea67ea5'}]}, 'timestamp': '2025-11-28 10:00:00.725362', '_unique_id': '11b832d685254a86b9c7ba724f752f42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.727 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4459693-10d3-4a1d-836f-5d474d0a48a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.727513', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d35b4e-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '5de45355469180aea3de458d6f8da0801f19bb53a9daab0319bbc6216c1ecfce'}]}, 'timestamp': '2025-11-28 10:00:00.727984', '_unique_id': 'c0e3c8a9331c450bb3620cb6bd326fd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.730 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a5484bf-ebb0-48aa-bce1-0a71b49a28be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.730197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d54bca-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '28734dd9774b5457a642c8e87c2494ee3c734056c629281aa3a51ae30728f39c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.730197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d55692-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '54b73825b5903b05a9e6c3119aab07e7c24961fd57f344181d439439e0e24ee2'}]}, 'timestamp': '2025-11-28 10:00:00.740853', '_unique_id': '028ca4c2debc4a668895c8fd78dd7fa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcb4c919-8875-4de6-8669-f3fd30a3600b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.742256', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d597ec-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'fc9a5bccfd5f782a1059cdcc14fcf2ac5dc014400ba7deccb35e689561a31148'}]}, 'timestamp': '2025-11-28 10:00:00.742545', '_unique_id': 'c5c98c11c9574253830436639f624659'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfb4a4e0-c60f-4cc4-b9a2-5be14eab10c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:00:00.743826', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '00d85de2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.93222141, 'message_signature': '0e270cb9e1b677cb9b23501ead7041043e334f0137ba5a7275dac5f6643f413b'}]}, 'timestamp': '2025-11-28 10:00:00.760713', '_unique_id': '090eb78e0a7246d697f45470b28473ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15273d9c-5284-49c8-a2c0-477cb228c576', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.762333', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d8a7f2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '8f19f6dc0e41c8cd3067757600006ae252a26c97652b05421b34685a0c4d783e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.762333', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d8b1e8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'b728341f1fbeb73599aabbf3b435f30b6fdd90e4a4150b7a2e4221245c31025c'}]}, 'timestamp': '2025-11-28 10:00:00.762849', '_unique_id': '7bdc8324860144f58489cda2495912dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cddb6e6-f009-4c9a-985c-584a9a38598c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.764204', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d8f126-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '9df5794e59309d7d5e9ae54fff7ca228df719f84f4bf88662048f218b843d716'}]}, 'timestamp': '2025-11-28 10:00:00.764486', '_unique_id': 'd5d94baea96041cb935fbba9635dab24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b986547f-d25f-4f77-8b07-37d5abe9e0a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.766310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d946a8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': 'a77d918f7587cd0df00a79938165e6e2f9a1a1d2bff12dcb287de095c6e2e188'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.766310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d95670-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': 'b20cb7fce8b6dba1f1a85b53fd4fc721f1feb44162ef30be8a6fc8651ec66517'}]}, 'timestamp': '2025-11-28 10:00:00.767170', '_unique_id': 'f8d1380af4074493b9168f4437c97bc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.769 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 15130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffcee6b2-04b8-4af2-9f1f-c4b4e4aa60fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15130000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:00:00.769336', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '00d9bcd2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.93222141, 'message_signature': '370b3991f8fdcf573bb36b492ef3be1eec26609fc64cc38ba57270d122c332a3'}]}, 'timestamp': '2025-11-28 10:00:00.769777', '_unique_id': '58d4b92144fa411cb5d461ab909ad691'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45d1300a-a5db-4347-b64b-304e344d99e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.771806', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00da1d76-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'c6beceeb62c9d5c6d1d5909ceadad01bd075f14f3b7c949158ed1cddd741f77b'}]}, 'timestamp': '2025-11-28 10:00:00.772289', '_unique_id': 'd5b93b46000f441682835e0386df27d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf02dfc2-b545-42de-b9a6-66cbb4bc70ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.774346', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00da804a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '40bd15e643a7b8748428b3be916bb50c9f606f37b2d19a4f72c01752b7b47747'}]}, 'timestamp': '2025-11-28 10:00:00.774790', '_unique_id': '189d8bc186b14b1eb2c0746c6c7713ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b45daaf-e3a1-4330-9738-9b4e59b3c26e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.776892', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00dae666-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '67409d38b69557410dba1f828d65f3c3549510fa5a3df2f39350b9a022de6546'}]}, 'timestamp': '2025-11-28 10:00:00.777405', '_unique_id': '17456ed79a3d4e98814153b4b6bfe5db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '341b4372-b4c1-47bb-aad7-1b5d14e16e5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.779664', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00db5128-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '39ff792a604c6202ad12f5b31e887655572dfe637dea049a386cb92f4302bb7a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.779664', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00db6348-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '2e878b2b527c5396637327ee22547c0f847ad23018c46a97af7f8afa69e5bc10'}]}, 'timestamp': '2025-11-28 10:00:00.780576', '_unique_id': '4f2eb4cb9585482b93b6341cd9e006e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceph-mon[292954]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.8 MiB/s wr, 16 op/s
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3275c8c5-58c8-4e64-b1ab-9e2f06161c2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.782719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00dbc7fc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '0c2fea60a4ca41b17b7e62e7f9a21fec10a664ac0b9a885370f6cdb25cdb611b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.782719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00dbdaa8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '798d7fcf033419e489e17666414e7707f6962a8e749736ba873007f94a622b8e'}]}, 'timestamp': '2025-11-28 10:00:00.783631', '_unique_id': 'd3429e20496f40e68713897bddcc9ec0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.785 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.785 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c86b5669-f002-4b7c-a039-649199b4359a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.786068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00dc4b00-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'a12d5b7eb0a9d011d1c0ed158139a239e759326ce79eeacf6cc344765d3cba12'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.786068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00dc5ad2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'c4294288779ed8822dbf4e69364855c691b1e6b50f78c36be591c1fa6a0a204b'}]}, 'timestamp': '2025-11-28 10:00:00.786910', '_unique_id': 'ccdb52d9ce8443a5b78b87bc4f3b64e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:00:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:00:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:01.470 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:00:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:00:01 np0005538513.localdomain systemd[1]: tmp-crun.48VnJ6.mount: Deactivated successfully.
Nov 28 10:00:01 np0005538513.localdomain podman[307462]: 2025-11-28 10:00:01.875246819 +0000 UTC m=+0.104908729 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:00:01 np0005538513.localdomain podman[307462]: 2025-11-28 10:00:01.956404117 +0000 UTC m=+0.186066047 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:00:01 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:00:01 np0005538513.localdomain podman[307463]: 2025-11-28 10:00:01.92878322 +0000 UTC m=+0.155120368 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 10:00:02 np0005538513.localdomain podman[307463]: 2025-11-28 10:00:02.012545089 +0000 UTC m=+0.238882237 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 28 10:00:02 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:00:02 np0005538513.localdomain ceph-mon[292954]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.7 MiB/s wr, 15 op/s
Nov 28 10:00:02 np0005538513.localdomain systemd[1]: tmp-crun.MC0ZQS.mount: Deactivated successfully.
Nov 28 10:00:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:04.670 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:04 np0005538513.localdomain ceph-mon[292954]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:06.501 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:06 np0005538513.localdomain ceph-mon[292954]: pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:08 np0005538513.localdomain ceph-mon[292954]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:09.674 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:00:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:00:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:00:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:00:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:00:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19225 "" "Go-http-client/1.1"
Nov 28 10:00:10 np0005538513.localdomain ceph-mon[292954]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:11.532 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:00:11 np0005538513.localdomain podman[307505]: 2025-11-28 10:00:11.834933374 +0000 UTC m=+0.069849723 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 28 10:00:11 np0005538513.localdomain podman[307505]: 2025-11-28 10:00:11.849335696 +0000 UTC m=+0.084252015 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 10:00:11 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:00:12 np0005538513.localdomain ceph-mon[292954]: pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2462883409' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:00:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2462883409' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:00:14 np0005538513.localdomain sudo[307524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:00:14 np0005538513.localdomain sudo[307524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:00:14 np0005538513.localdomain sudo[307524]: pam_unix(sudo:session): session closed for user root
Nov 28 10:00:14 np0005538513.localdomain sudo[307542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:00:14 np0005538513.localdomain sudo[307542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:00:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:14.718 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.793650) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014793680, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 783, "num_deletes": 256, "total_data_size": 828956, "memory_usage": 844576, "flush_reason": "Manual Compaction"}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014799442, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 816319, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22883, "largest_seqno": 23665, "table_properties": {"data_size": 812648, "index_size": 1462, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8362, "raw_average_key_size": 18, "raw_value_size": 805113, "raw_average_value_size": 1821, "num_data_blocks": 65, "num_entries": 442, "num_filter_entries": 442, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323963, "oldest_key_time": 1764323963, "file_creation_time": 1764324014, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 5841 microseconds, and 2280 cpu microseconds.
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.799490) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 816319 bytes OK
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.799507) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.802496) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.802512) EVENT_LOG_v1 {"time_micros": 1764324014802507, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.802530) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 825031, prev total WAL file size 825355, number of live WAL files 2.
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.803907) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373730' seq:72057594037927935, type:22 .. '6C6F676D0034303232' seq:0, type:0; will stop at (end)
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(797KB)], [36(17MB)]
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014803962, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19554091, "oldest_snapshot_seqno": -1}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12064 keys, 19455161 bytes, temperature: kUnknown
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014897737, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 19455161, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19383514, "index_size": 40378, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30213, "raw_key_size": 323368, "raw_average_key_size": 26, "raw_value_size": 19175323, "raw_average_value_size": 1589, "num_data_blocks": 1549, "num_entries": 12064, "num_filter_entries": 12064, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324014, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.898187) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 19455161 bytes
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.900220) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.2 rd, 207.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 17.9 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(47.8) write-amplify(23.8) OK, records in: 12596, records dropped: 532 output_compression: NoCompression
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.900253) EVENT_LOG_v1 {"time_micros": 1764324014900237, "job": 20, "event": "compaction_finished", "compaction_time_micros": 93920, "compaction_time_cpu_micros": 37695, "output_level": 6, "num_output_files": 1, "total_output_size": 19455161, "num_input_records": 12596, "num_output_records": 12064, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014900593, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014903431, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.803853) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903703) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903707) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903712) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538513.localdomain sudo[307542]: pam_unix(sudo:session): session closed for user root
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:00:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:00:15 np0005538513.localdomain sudo[307592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:00:15 np0005538513.localdomain sudo[307592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:00:15 np0005538513.localdomain sudo[307592]: pam_unix(sudo:session): session closed for user root
Nov 28 10:00:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:00:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:00:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:15.910 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:00:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:00:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:16.533 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:00:16 np0005538513.localdomain podman[307610]: 2025-11-28 10:00:16.852407088 +0000 UTC m=+0.084648406 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:00:16 np0005538513.localdomain podman[307610]: 2025-11-28 10:00:16.866450029 +0000 UTC m=+0.098691387 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:00:16 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:00:17 np0005538513.localdomain ceph-mon[292954]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:00:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:00:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:00:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:00:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:00:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:00:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:00:18 np0005538513.localdomain podman[307634]: 2025-11-28 10:00:18.856328039 +0000 UTC m=+0.087387650 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:00:18 np0005538513.localdomain podman[307634]: 2025-11-28 10:00:18.866724378 +0000 UTC m=+0.097783959 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:00:18 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:00:18 np0005538513.localdomain systemd[1]: tmp-crun.TCxsar.mount: Deactivated successfully.
Nov 28 10:00:18 np0005538513.localdomain podman[307633]: 2025-11-28 10:00:18.959292547 +0000 UTC m=+0.193576087 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:00:19 np0005538513.localdomain podman[307633]: 2025-11-28 10:00:19.021009979 +0000 UTC m=+0.255293489 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:00:19 np0005538513.localdomain ceph-mon[292954]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:19 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:00:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:19.471 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:19.722 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:21 np0005538513.localdomain ceph-mon[292954]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:21.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:00:22 np0005538513.localdomain systemd[299031]: Created slice User Background Tasks Slice.
Nov 28 10:00:22 np0005538513.localdomain podman[307676]: 2025-11-28 10:00:22.84871229 +0000 UTC m=+0.086729261 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:00:22 np0005538513.localdomain systemd[299031]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 10:00:22 np0005538513.localdomain systemd[299031]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 10:00:22 np0005538513.localdomain podman[307676]: 2025-11-28 10:00:22.915750024 +0000 UTC m=+0.153766945 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:00:22 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:00:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:22.950 261084 INFO neutron.agent.linux.ip_lib [None req-6cf535a6-0ca9-496a-b979-7b405dfe1f6c - - - - - -] Device tap2b1e8904-1c cannot be used as it has no MAC address
Nov 28 10:00:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:22.970 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:22 np0005538513.localdomain kernel: device tap2b1e8904-1c entered promiscuous mode
Nov 28 10:00:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:22Z|00075|binding|INFO|Claiming lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 for this chassis.
Nov 28 10:00:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:22Z|00076|binding|INFO|2b1e8904-1c88-4828-a7bc-9f34a2930819: Claiming unknown
Nov 28 10:00:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:22.978 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:22 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324022.9823] manager: (tap2b1e8904-1c): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 28 10:00:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:22.990 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2159235bf1c5407eac7a3e3826561913', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=20ff3eab-119a-4740-918d-4005c52a4e27, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=2b1e8904-1c88-4828-a7bc-9f34a2930819) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:00:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:22.992 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1e8904-1c88-4828-a7bc-9f34a2930819 in datapath 0303a35a-aae2-4e58-b0e5-9091112c9857 bound to our chassis
Nov 28 10:00:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:22.995 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 92b0a1b7-94a7-4946-a022-10c4e44505bf IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:00:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:22.995 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0303a35a-aae2-4e58-b0e5-9091112c9857, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:00:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:22.997 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4afd2102-e1c4-4737-980d-d89b8a978b80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:00:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:23Z|00077|binding|INFO|Setting lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 ovn-installed in OVS
Nov 28 10:00:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:23Z|00078|binding|INFO|Setting lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 up in Southbound
Nov 28 10:00:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:23.019 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:23.042 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:23.064 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:23 np0005538513.localdomain ceph-mon[292954]: pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:23.507 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:23 np0005538513.localdomain podman[307761]: 2025-11-28 10:00:23.865182249 +0000 UTC m=+0.091772594 container create c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:00:23 np0005538513.localdomain systemd[1]: Started libpod-conmon-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b.scope.
Nov 28 10:00:23 np0005538513.localdomain systemd[1]: tmp-crun.06UuwW.mount: Deactivated successfully.
Nov 28 10:00:23 np0005538513.localdomain podman[307761]: 2025-11-28 10:00:23.818683544 +0000 UTC m=+0.045273939 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:00:23 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:00:23 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/776ee30c8f4b1d3b8a5504203661447b2ee50a3f6f2aafa660ced48066eed543/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:00:23 np0005538513.localdomain podman[307761]: 2025-11-28 10:00:23.95715074 +0000 UTC m=+0.183741085 container init c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:00:23 np0005538513.localdomain podman[307761]: 2025-11-28 10:00:23.966447545 +0000 UTC m=+0.193037900 container start c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:00:23 np0005538513.localdomain dnsmasq[307779]: started, version 2.85 cachesize 150
Nov 28 10:00:23 np0005538513.localdomain dnsmasq[307779]: DNS service limited to local subnets
Nov 28 10:00:23 np0005538513.localdomain dnsmasq[307779]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:00:23 np0005538513.localdomain dnsmasq[307779]: warning: no upstream servers configured
Nov 28 10:00:23 np0005538513.localdomain dnsmasq-dhcp[307779]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:00:23 np0005538513.localdomain dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 0 addresses
Nov 28 10:00:23 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host
Nov 28 10:00:23 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts
Nov 28 10:00:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:24.167 261084 INFO neutron.agent.dhcp.agent [None req-ea06895c-1efe-4171-8a6c-fc71b43405d2 - - - - - -] DHCP configuration for ports {'37eaaacf-ed90-43ad-bf8b-bb907591b4ec'} is completed
Nov 28 10:00:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:24.343 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:23Z, description=, device_id=76efbe69-508f-4a6c-bc6c-575aca933da7, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd663c7f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd663c790>], id=8f481bfb-0f63-4459-a61b-a544bb537944, ip_allocation=immediate, mac_address=fa:16:3e:a4:82:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:20Z, description=, dns_domain=, id=0303a35a-aae2-4e58-b0e5-9091112c9857, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-667497220-network, port_security_enabled=True, project_id=2159235bf1c5407eac7a3e3826561913, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42725, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=176, status=ACTIVE, subnets=['ce2bee42-bca3-4580-8785-d86292bed448'], tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:21Z, vlan_transparent=None, network_id=0303a35a-aae2-4e58-b0e5-9091112c9857, port_security_enabled=False, project_id=2159235bf1c5407eac7a3e3826561913, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=210, status=DOWN, tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:24Z on network 0303a35a-aae2-4e58-b0e5-9091112c9857
Nov 28 10:00:24 np0005538513.localdomain dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 1 addresses
Nov 28 10:00:24 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host
Nov 28 10:00:24 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts
Nov 28 10:00:24 np0005538513.localdomain podman[307797]: 2025-11-28 10:00:24.550814044 +0000 UTC m=+0.059017601 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:00:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:24.740 261084 INFO neutron.agent.dhcp.agent [None req-d61f26ab-2d2b-40f2-a922-2384a36589e2 - - - - - -] DHCP configuration for ports {'8f481bfb-0f63-4459-a61b-a544bb537944'} is completed
Nov 28 10:00:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:24.745 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:25 np0005538513.localdomain ceph-mon[292954]: pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:25.293 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:23Z, description=, device_id=76efbe69-508f-4a6c-bc6c-575aca933da7, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd662f0a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd662f310>], id=8f481bfb-0f63-4459-a61b-a544bb537944, ip_allocation=immediate, mac_address=fa:16:3e:a4:82:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:20Z, description=, dns_domain=, id=0303a35a-aae2-4e58-b0e5-9091112c9857, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-667497220-network, port_security_enabled=True, project_id=2159235bf1c5407eac7a3e3826561913, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42725, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=176, status=ACTIVE, subnets=['ce2bee42-bca3-4580-8785-d86292bed448'], tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:21Z, vlan_transparent=None, network_id=0303a35a-aae2-4e58-b0e5-9091112c9857, port_security_enabled=False, project_id=2159235bf1c5407eac7a3e3826561913, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=210, status=DOWN, tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:24Z on network 0303a35a-aae2-4e58-b0e5-9091112c9857
Nov 28 10:00:25 np0005538513.localdomain dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 1 addresses
Nov 28 10:00:25 np0005538513.localdomain podman[307836]: 2025-11-28 10:00:25.552313826 +0000 UTC m=+0.061887149 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:00:25 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host
Nov 28 10:00:25 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts
Nov 28 10:00:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:25.701 261084 INFO neutron.agent.linux.ip_lib [None req-d599649c-2a5b-4443-a91a-077aaa3d06fe - - - - - -] Device tap7ac539be-36 cannot be used as it has no MAC address
Nov 28 10:00:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:25.729 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:25 np0005538513.localdomain kernel: device tap7ac539be-36 entered promiscuous mode
Nov 28 10:00:25 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324025.7352] manager: (tap7ac539be-36): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Nov 28 10:00:25 np0005538513.localdomain systemd-udevd[307711]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:00:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:25Z|00079|binding|INFO|Claiming lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b for this chassis.
Nov 28 10:00:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:25Z|00080|binding|INFO|7ac539be-3605-4c5e-bb0c-6fbaaf95259b: Claiming unknown
Nov 28 10:00:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:25.742 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:25.748 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e759105895c542a0bccbe08b81ef5fde', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d8cbd739-26de-4c8a-898b-6e16313588b2, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=7ac539be-3605-4c5e-bb0c-6fbaaf95259b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:00:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:25.753 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ac539be-3605-4c5e-bb0c-6fbaaf95259b in datapath eaf25156-4f94-45e1-8ecf-348de157355a bound to our chassis
Nov 28 10:00:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:25.754 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaf25156-4f94-45e1-8ecf-348de157355a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:00:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:25.755 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[04757c39-4a8b-47f4-8067-24715abc0c96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:00:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:25Z|00081|binding|INFO|Setting lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b ovn-installed in OVS
Nov 28 10:00:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:25.778 261084 INFO neutron.agent.dhcp.agent [None req-03019dd2-66de-4dbb-a795-65dc3cf488ff - - - - - -] DHCP configuration for ports {'8f481bfb-0f63-4459-a61b-a544bb537944'} is completed
Nov 28 10:00:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:25Z|00082|binding|INFO|Setting lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b up in Southbound
Nov 28 10:00:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:25.818 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:25.837 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:25.860 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:26.565 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:26 np0005538513.localdomain podman[307923]: 2025-11-28 10:00:26.936228134 +0000 UTC m=+0.093126867 container create 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:00:26 np0005538513.localdomain systemd[1]: Started libpod-conmon-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481.scope.
Nov 28 10:00:26 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:00:26 np0005538513.localdomain podman[307923]: 2025-11-28 10:00:26.89043779 +0000 UTC m=+0.047336563 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:00:26 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2df9b00178a60559ff4adec6a755d20e525233a80e04a1bc351385bca38e5adf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:00:27 np0005538513.localdomain podman[307923]: 2025-11-28 10:00:27.002461215 +0000 UTC m=+0.159359908 container init 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:00:27 np0005538513.localdomain podman[307923]: 2025-11-28 10:00:27.008692987 +0000 UTC m=+0.165591680 container start 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:00:27 np0005538513.localdomain dnsmasq[307941]: started, version 2.85 cachesize 150
Nov 28 10:00:27 np0005538513.localdomain dnsmasq[307941]: DNS service limited to local subnets
Nov 28 10:00:27 np0005538513.localdomain dnsmasq[307941]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:00:27 np0005538513.localdomain dnsmasq[307941]: warning: no upstream servers configured
Nov 28 10:00:27 np0005538513.localdomain dnsmasq-dhcp[307941]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:00:27 np0005538513.localdomain dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 0 addresses
Nov 28 10:00:27 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host
Nov 28 10:00:27 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts
Nov 28 10:00:27 np0005538513.localdomain ceph-mon[292954]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:27.129 261084 INFO neutron.agent.dhcp.agent [None req-b7f67ea2-1058-43a1-bf19-9b6b0df4439f - - - - - -] DHCP configuration for ports {'2dbf24aa-19c5-4e6e-b9d9-1bcffbe746db'} is completed
Nov 28 10:00:28 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:28.457 2 INFO neutron.agent.securitygroups_rpc [None req-cdabab7f-6f0b-439c-ae1d-dd4a3208cd11 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:00:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:28.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:29 np0005538513.localdomain ceph-mon[292954]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:29.791 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:29.876 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:29Z, description=, device_id=cd2a6086-9326-4cdd-a015-4768e2092068, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd662fb80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd662f490>], id=02232087-1968-4362-9c1a-bb2acd27c4fd, ip_allocation=immediate, mac_address=fa:16:3e:1e:27:3b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:23Z, description=, dns_domain=, id=eaf25156-4f94-45e1-8ecf-348de157355a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1382899154-network, port_security_enabled=True, project_id=e759105895c542a0bccbe08b81ef5fde, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30925, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=209, status=ACTIVE, subnets=['832963b0-c873-40fe-a644-0392c51c7abc'], tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:24Z, vlan_transparent=None, network_id=eaf25156-4f94-45e1-8ecf-348de157355a, port_security_enabled=False, project_id=e759105895c542a0bccbe08b81ef5fde, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=266, status=DOWN, tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:29Z on network eaf25156-4f94-45e1-8ecf-348de157355a
Nov 28 10:00:30 np0005538513.localdomain systemd[1]: tmp-crun.5zwxIR.mount: Deactivated successfully.
Nov 28 10:00:30 np0005538513.localdomain dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 1 addresses
Nov 28 10:00:30 np0005538513.localdomain podman[307959]: 2025-11-28 10:00:30.081305568 +0000 UTC m=+0.073540346 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:00:30 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host
Nov 28 10:00:30 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts
Nov 28 10:00:30 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:30.291 261084 INFO neutron.agent.dhcp.agent [None req-234ca4e1-bb35-4f86-be5b-19c875954a0b - - - - - -] DHCP configuration for ports {'02232087-1968-4362-9c1a-bb2acd27c4fd'} is completed
Nov 28 10:00:31 np0005538513.localdomain ceph-mon[292954]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:31.434 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:29Z, description=, device_id=cd2a6086-9326-4cdd-a015-4768e2092068, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66655b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6665520>], id=02232087-1968-4362-9c1a-bb2acd27c4fd, ip_allocation=immediate, mac_address=fa:16:3e:1e:27:3b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:23Z, description=, dns_domain=, id=eaf25156-4f94-45e1-8ecf-348de157355a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1382899154-network, port_security_enabled=True, project_id=e759105895c542a0bccbe08b81ef5fde, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30925, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=209, status=ACTIVE, subnets=['832963b0-c873-40fe-a644-0392c51c7abc'], tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:24Z, vlan_transparent=None, network_id=eaf25156-4f94-45e1-8ecf-348de157355a, port_security_enabled=False, project_id=e759105895c542a0bccbe08b81ef5fde, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=266, status=DOWN, tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:29Z on network eaf25156-4f94-45e1-8ecf-348de157355a
Nov 28 10:00:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:31.566 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:31 np0005538513.localdomain systemd[1]: tmp-crun.xcSOIe.mount: Deactivated successfully.
Nov 28 10:00:31 np0005538513.localdomain dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 1 addresses
Nov 28 10:00:31 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host
Nov 28 10:00:31 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts
Nov 28 10:00:31 np0005538513.localdomain podman[307997]: 2025-11-28 10:00:31.676661541 +0000 UTC m=+0.067476650 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:00:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:31.947 261084 INFO neutron.agent.dhcp.agent [None req-d9267bb8-5b10-4d33-a5f9-fc4f94d88822 - - - - - -] DHCP configuration for ports {'02232087-1968-4362-9c1a-bb2acd27c4fd'} is completed
Nov 28 10:00:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:00:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:00:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:32.768 279685 DEBUG oslo_concurrency.processutils [None req-52762099-22fc-4981-b6d6-4df819853c4e 8ea6e2aec9474e6594c08987b8c79204 ea61a5236ad2407485482a6f7462d550 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:00:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:32.790 279685 DEBUG oslo_concurrency.processutils [None req-52762099-22fc-4981-b6d6-4df819853c4e 8ea6e2aec9474e6594c08987b8c79204 ea61a5236ad2407485482a6f7462d550 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:00:32 np0005538513.localdomain systemd[1]: tmp-crun.7r4fWd.mount: Deactivated successfully.
Nov 28 10:00:32 np0005538513.localdomain podman[308018]: 2025-11-28 10:00:32.870958284 +0000 UTC m=+0.103367111 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:00:32 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:32.894 2 INFO neutron.agent.securitygroups_rpc [None req-d73a2eae-adce-4d72-91ae-34d59db28d8a c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:00:32 np0005538513.localdomain podman[308017]: 2025-11-28 10:00:32.907618738 +0000 UTC m=+0.141743007 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:00:32 np0005538513.localdomain podman[308017]: 2025-11-28 10:00:32.92138581 +0000 UTC m=+0.155510109 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:00:32 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:00:32 np0005538513.localdomain podman[308018]: 2025-11-28 10:00:32.937220306 +0000 UTC m=+0.169629193 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:00:32 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:00:33 np0005538513.localdomain ceph-mon[292954]: pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:33.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:00:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:33.789 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:00:34 np0005538513.localdomain podman[308076]: 2025-11-28 10:00:34.603358189 +0000 UTC m=+0.121603361 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:00:34 np0005538513.localdomain systemd[1]: tmp-crun.5kjoZO.mount: Deactivated successfully.
Nov 28 10:00:34 np0005538513.localdomain dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 0 addresses
Nov 28 10:00:34 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host
Nov 28 10:00:34 np0005538513.localdomain dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.789 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:34.798 261084 INFO neutron.agent.linux.ip_lib [None req-0f46b05d-c1aa-41b7-8cf8-ed5d8b68a147 - - - - - -] Device tap74cdc895-82 cannot be used as it has no MAC address
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.822 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:34 np0005538513.localdomain kernel: device tap7ac539be-36 left promiscuous mode
Nov 28 10:00:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:34Z|00083|binding|INFO|Releasing lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b from this chassis (sb_readonly=0)
Nov 28 10:00:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:34Z|00084|binding|INFO|Setting lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b down in Southbound
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.833 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e759105895c542a0bccbe08b81ef5fde', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d8cbd739-26de-4c8a-898b-6e16313588b2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=7ac539be-3605-4c5e-bb0c-6fbaaf95259b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.835 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ac539be-3605-4c5e-bb0c-6fbaaf95259b in datapath eaf25156-4f94-45e1-8ecf-348de157355a unbound from our chassis
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.838 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eaf25156-4f94-45e1-8ecf-348de157355a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.839 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f717e42d-5471-4b99-a0a0-7ac189dbda8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.843 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.848 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:34 np0005538513.localdomain kernel: device tap74cdc895-82 entered promiscuous mode
Nov 28 10:00:34 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324034.8563] manager: (tap74cdc895-82): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Nov 28 10:00:34 np0005538513.localdomain systemd-udevd[308110]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:00:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:34Z|00085|binding|INFO|Claiming lport 74cdc895-82eb-45db-a408-03b43d3fc10f for this chassis.
Nov 28 10:00:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:34Z|00086|binding|INFO|74cdc895-82eb-45db-a408-03b43d3fc10f: Claiming unknown
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.863 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.878 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40693e6dadaf448a8cb4caeb6899effc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=16db044e-e796-4ecb-9ada-84075a99aa73, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=74cdc895-82eb-45db-a408-03b43d3fc10f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.880 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 74cdc895-82eb-45db-a408-03b43d3fc10f in datapath 991320b8-b994-4199-922f-5c3428b3e7ba bound to our chassis
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.883 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port e3d22942-ad97-4dc8-965d-3e878722ec78 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.883 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 991320b8-b994-4199-922f-5c3428b3e7ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:00:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:34.884 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4450e3dc-aa82-4c6b-923a-eba5d97ace38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:34Z|00087|binding|INFO|Setting lport 74cdc895-82eb-45db-a408-03b43d3fc10f ovn-installed in OVS
Nov 28 10:00:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:34Z|00088|binding|INFO|Setting lport 74cdc895-82eb-45db-a408-03b43d3fc10f up in Southbound
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.906 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap74cdc895-82: No such device
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.944 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:34.975 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:35 np0005538513.localdomain ceph-mon[292954]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:35.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:00:35 np0005538513.localdomain podman[308181]: 
Nov 28 10:00:35 np0005538513.localdomain podman[308181]: 2025-11-28 10:00:35.984321847 +0000 UTC m=+0.090583210 container create 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:00:36 np0005538513.localdomain systemd[1]: Started libpod-conmon-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3.scope.
Nov 28 10:00:36 np0005538513.localdomain podman[308181]: 2025-11-28 10:00:35.939721069 +0000 UTC m=+0.045982482 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:00:36 np0005538513.localdomain systemd[1]: tmp-crun.nxU2n8.mount: Deactivated successfully.
Nov 28 10:00:36 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:00:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7a176f0a1d7dcae07345c04d77b729e8101196fbb4e6b2ed6c12ecbe6a4c6d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:00:36 np0005538513.localdomain podman[308181]: 2025-11-28 10:00:36.065870767 +0000 UTC m=+0.172132130 container init 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:00:36 np0005538513.localdomain podman[308181]: 2025-11-28 10:00:36.075396259 +0000 UTC m=+0.181657622 container start 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:00:36 np0005538513.localdomain dnsmasq[308200]: started, version 2.85 cachesize 150
Nov 28 10:00:36 np0005538513.localdomain dnsmasq[308200]: DNS service limited to local subnets
Nov 28 10:00:36 np0005538513.localdomain dnsmasq[308200]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:00:36 np0005538513.localdomain dnsmasq[308200]: warning: no upstream servers configured
Nov 28 10:00:36 np0005538513.localdomain dnsmasq-dhcp[308200]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:00:36 np0005538513.localdomain dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 0 addresses
Nov 28 10:00:36 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host
Nov 28 10:00:36 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts
Nov 28 10:00:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:36.195 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:36 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:36.235 261084 INFO neutron.agent.dhcp.agent [None req-2958e5aa-ff2b-4bc3-ba12-f371f4b2838e - - - - - -] DHCP configuration for ports {'a2cf2768-78e2-4b7f-a3d6-298ef981fb5a'} is completed
Nov 28 10:00:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:36.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:36 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:36.688 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:36Z, description=, device_id=5e2bdb5c-9386-4f23-88bb-b64884bb41d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66ec310>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66ec9d0>], id=198dd71f-c3e0-4377-9719-f2a42158bba7, ip_allocation=immediate, mac_address=fa:16:3e:0d:77:d5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:32Z, description=, dns_domain=, id=991320b8-b994-4199-922f-5c3428b3e7ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-474917441-network, port_security_enabled=True, project_id=40693e6dadaf448a8cb4caeb6899effc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43850, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=289, status=ACTIVE, subnets=['ea71b8b0-31a5-4591-a369-92ec92968c41'], tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:33Z, vlan_transparent=None, network_id=991320b8-b994-4199-922f-5c3428b3e7ba, port_security_enabled=False, project_id=40693e6dadaf448a8cb4caeb6899effc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=329, status=DOWN, tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:36Z on network 991320b8-b994-4199-922f-5c3428b3e7ba
Nov 28 10:00:36 np0005538513.localdomain dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 1 addresses
Nov 28 10:00:36 np0005538513.localdomain podman[308218]: 2025-11-28 10:00:36.901949286 +0000 UTC m=+0.060676772 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:00:36 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host
Nov 28 10:00:36 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts
Nov 28 10:00:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:37.097 261084 INFO neutron.agent.dhcp.agent [None req-700b54e0-0430-4124-8e02-376dd1f082ac - - - - - -] DHCP configuration for ports {'198dd71f-c3e0-4377-9719-f2a42158bba7'} is completed
Nov 28 10:00:37 np0005538513.localdomain ceph-mon[292954]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:37 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:37Z|00089|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:00:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:37.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:37.843 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:37.906 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:36Z, description=, device_id=5e2bdb5c-9386-4f23-88bb-b64884bb41d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6eb5df0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6eb5580>], id=198dd71f-c3e0-4377-9719-f2a42158bba7, ip_allocation=immediate, mac_address=fa:16:3e:0d:77:d5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:32Z, description=, dns_domain=, id=991320b8-b994-4199-922f-5c3428b3e7ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-474917441-network, port_security_enabled=True, project_id=40693e6dadaf448a8cb4caeb6899effc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43850, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=289, status=ACTIVE, subnets=['ea71b8b0-31a5-4591-a369-92ec92968c41'], tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:33Z, vlan_transparent=None, network_id=991320b8-b994-4199-922f-5c3428b3e7ba, port_security_enabled=False, project_id=40693e6dadaf448a8cb4caeb6899effc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=329, status=DOWN, tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:36Z on network 991320b8-b994-4199-922f-5c3428b3e7ba
Nov 28 10:00:38 np0005538513.localdomain dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 1 addresses
Nov 28 10:00:38 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host
Nov 28 10:00:38 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts
Nov 28 10:00:38 np0005538513.localdomain podman[308255]: 2025-11-28 10:00:38.128118878 +0000 UTC m=+0.069828113 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:00:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:38.373 261084 INFO neutron.agent.dhcp.agent [None req-02a85b7f-f781-4cc1-8f41-70ded975ff74 - - - - - -] DHCP configuration for ports {'198dd71f-c3e0-4377-9719-f2a42158bba7'} is completed
Nov 28 10:00:38 np0005538513.localdomain dnsmasq[307941]: exiting on receipt of SIGTERM
Nov 28 10:00:38 np0005538513.localdomain podman[308292]: 2025-11-28 10:00:38.478465621 +0000 UTC m=+0.058999130 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:00:38 np0005538513.localdomain systemd[1]: libpod-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481.scope: Deactivated successfully.
Nov 28 10:00:38 np0005538513.localdomain podman[308306]: 2025-11-28 10:00:38.543916078 +0000 UTC m=+0.055209135 container died 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:00:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481-userdata-shm.mount: Deactivated successfully.
Nov 28 10:00:38 np0005538513.localdomain podman[308306]: 2025-11-28 10:00:38.592334822 +0000 UTC m=+0.103627839 container cleanup 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:00:38 np0005538513.localdomain systemd[1]: libpod-conmon-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481.scope: Deactivated successfully.
Nov 28 10:00:38 np0005538513.localdomain podman[308313]: 2025-11-28 10:00:38.618486255 +0000 UTC m=+0.115216764 container remove 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:00:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:38.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:38.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:38.806 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:00:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2df9b00178a60559ff4adec6a755d20e525233a80e04a1bc351385bca38e5adf-merged.mount: Deactivated successfully.
Nov 28 10:00:39 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2deaf25156\x2d4f94\x2d45e1\x2d8ecf\x2d348de157355a.mount: Deactivated successfully.
Nov 28 10:00:39 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:39.188 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:00:39 np0005538513.localdomain ceph-mon[292954]: pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:39.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:39.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:00:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:00:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:00:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159336 "" "Go-http-client/1.1"
Nov 28 10:00:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:00:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20195 "" "Go-http-client/1.1"
Nov 28 10:00:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:40.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:41 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:41.117 2 INFO neutron.agent.securitygroups_rpc [req-a368a1e6-562c-4526-ae3a-0e69b2a15bca req-d778405e-6e70-4ce8-a2a0-d781e7c8b4de 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['f59fc346-b907-4d25-9b54-7ce550f4338f']
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.147 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:41.147 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:00:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:41.149 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:00:41 np0005538513.localdomain ceph-mon[292954]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3575417762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.597 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:00:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:41.797 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:00:42 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:42.140 2 INFO neutron.agent.securitygroups_rpc [req-91a233d2-d128-4e77-ba4a-bffdfb538e2c req-2dda1abe-8246-445d-82ca-7050072f5ab7 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['793a871e-42bd-4871-9764-ed4c16f282ee']
Nov 28 10:00:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:00:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2613041327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:42.250 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:00:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2613041327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:42.319 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:00:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:42.320 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:00:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:42.553 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:00:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:42.555 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11361MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:00:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:42.556 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:00:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:42.556 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:00:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:00:42 np0005538513.localdomain podman[308359]: 2025-11-28 10:00:42.856065581 +0000 UTC m=+0.080948432 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Nov 28 10:00:42 np0005538513.localdomain podman[308359]: 2025-11-28 10:00:42.869798603 +0000 UTC m=+0.094681444 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 10:00:42 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.023 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.023 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.024 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:00:43 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:43.145 2 INFO neutron.agent.securitygroups_rpc [req-9f8c831a-ecf0-4e4e-9595-719ef0ba964a req-7a744279-94cd-4894-9a01-f869bc00b409 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['03578922-528e-499a-8e7e-7a5c262d5e64']
Nov 28 10:00:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:43.151 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.313 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:00:43 np0005538513.localdomain ceph-mon[292954]: pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1753842664' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.775 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.775 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.797 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.834 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:00:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:43.875 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:00:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:00:44 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1539368950' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:44.335 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:00:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:44.341 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:00:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:44.359 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:00:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:44.362 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:00:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:44.362 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:00:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:44.363 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2762372881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1651382477' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1539368950' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:44 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:44.822 2 INFO neutron.agent.securitygroups_rpc [req-cee9d54d-2061-4e18-9dbb-7978cc78c723 req-de7c358c-da89-4568-8167-d43d2e6c2b50 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['0b748e56-a20d-4a74-8688-d245ea875072']
Nov 28 10:00:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:44.867 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:45 np0005538513.localdomain ceph-mon[292954]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2207683770' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:46.158 2 INFO neutron.agent.securitygroups_rpc [req-fb04029d-a55f-44ca-91cb-e1804a48ba9f req-358dd77e-4f38-41f7-8d87-d246c9bdd01f 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']
Nov 28 10:00:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2972538254' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:00:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:46.630 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:46.815 2 INFO neutron.agent.securitygroups_rpc [req-587359a2-3e51-4204-a146-11784139df1a req-865ecdad-325d-46ee-b040-9c64055b894f 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']
Nov 28 10:00:47 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:00:47.103 2 INFO neutron.agent.securitygroups_rpc [req-95a69005-5e60-4e46-b43c-b0ec65f622be req-ea491abe-a44c-4d6d-8b8a-511053fe8f93 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']
Nov 28 10:00:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:47.380 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:47.380 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:00:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:47.381 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:00:47 np0005538513.localdomain ceph-mon[292954]: pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1377881097' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:00:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:47.463 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:00:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:47.463 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:00:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:47.464 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:00:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:47.464 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:00:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:00:47 np0005538513.localdomain podman[308403]: 2025-11-28 10:00:47.85153773 +0000 UTC m=+0.083238574 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:00:47 np0005538513.localdomain podman[308403]: 2025-11-28 10:00:47.888542775 +0000 UTC m=+0.120243639 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:00:47 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:00:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:48.062 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:00:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:48.078 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:00:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:48.079 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:00:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:48.079 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:48.080 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:00:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:00:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:00:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:00:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:00:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:00:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:00:48 np0005538513.localdomain ceph-mon[292954]: pgmap v86: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Nov 28 10:00:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:00:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:00:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:49 np0005538513.localdomain podman[308426]: 2025-11-28 10:00:49.857265366 +0000 UTC m=+0.084881585 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:00:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:49.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:49 np0005538513.localdomain podman[308427]: 2025-11-28 10:00:49.951232217 +0000 UTC m=+0.175395279 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:00:49 np0005538513.localdomain podman[308427]: 2025-11-28 10:00:49.955990803 +0000 UTC m=+0.180153875 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 10:00:49 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:00:50 np0005538513.localdomain podman[308426]: 2025-11-28 10:00:50.004873751 +0000 UTC m=+0.232489970 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:00:50 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:00:50 np0005538513.localdomain dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 0 addresses
Nov 28 10:00:50 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host
Nov 28 10:00:50 np0005538513.localdomain dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts
Nov 28 10:00:50 np0005538513.localdomain podman[308482]: 2025-11-28 10:00:50.294435962 +0000 UTC m=+0.065541462 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:00:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:50Z|00090|binding|INFO|Releasing lport 74cdc895-82eb-45db-a408-03b43d3fc10f from this chassis (sb_readonly=0)
Nov 28 10:00:50 np0005538513.localdomain kernel: device tap74cdc895-82 left promiscuous mode
Nov 28 10:00:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:50Z|00091|binding|INFO|Setting lport 74cdc895-82eb-45db-a408-03b43d3fc10f down in Southbound
Nov 28 10:00:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:50.482 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:50.503 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:50.504 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40693e6dadaf448a8cb4caeb6899effc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=16db044e-e796-4ecb-9ada-84075a99aa73, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=74cdc895-82eb-45db-a408-03b43d3fc10f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:00:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:50.507 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 74cdc895-82eb-45db-a408-03b43d3fc10f in datapath 991320b8-b994-4199-922f-5c3428b3e7ba unbound from our chassis
Nov 28 10:00:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:50.510 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 991320b8-b994-4199-922f-5c3428b3e7ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:00:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:50.512 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[093e3915-8951-4838-9881-5ab3c6a44ba6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:00:50 np0005538513.localdomain ceph-mon[292954]: pgmap v87: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Nov 28 10:00:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:50.840 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:00:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:00:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:00:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:00:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:51.633 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:52.049 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Creating tmpfile /var/lib/nova/instances/tmpkz_uaqrw to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 10:00:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:52.062 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 28 10:00:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:52.095 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:00:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:52.096 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:00:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:52.110 279685 INFO nova.compute.rpcapi [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Nov 28 10:00:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:52.111 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:00:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:52Z|00092|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:00:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:52.487 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:52 np0005538513.localdomain ceph-mon[292954]: pgmap v88: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:53 np0005538513.localdomain dnsmasq[308200]: exiting on receipt of SIGTERM
Nov 28 10:00:53 np0005538513.localdomain podman[308519]: 2025-11-28 10:00:53.013817092 +0000 UTC m=+0.060743923 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:00:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:00:53 np0005538513.localdomain systemd[1]: libpod-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3.scope: Deactivated successfully.
Nov 28 10:00:53 np0005538513.localdomain podman[308533]: 2025-11-28 10:00:53.083911141 +0000 UTC m=+0.059724502 container died 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:00:53 np0005538513.localdomain podman[308540]: 2025-11-28 10:00:53.129607053 +0000 UTC m=+0.088405992 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:00:53 np0005538513.localdomain podman[308533]: 2025-11-28 10:00:53.172169788 +0000 UTC m=+0.147983099 container cleanup 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:00:53 np0005538513.localdomain systemd[1]: libpod-conmon-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3.scope: Deactivated successfully.
Nov 28 10:00:53 np0005538513.localdomain podman[308535]: 2025-11-28 10:00:53.190685746 +0000 UTC m=+0.155783468 container remove 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:00:53 np0005538513.localdomain podman[308540]: 2025-11-28 10:00:53.192263994 +0000 UTC m=+0.151062933 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:00:53 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:00:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:53.408 261084 INFO neutron.agent.dhcp.agent [None req-6782fca0-efd2-40a9-82ea-c9eac2ed2582 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:00:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:53.409 261084 INFO neutron.agent.dhcp.agent [None req-6782fca0-efd2-40a9-82ea-c9eac2ed2582 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:00:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:00:53.661 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:00:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e7a176f0a1d7dcae07345c04d77b729e8101196fbb4e6b2ed6c12ecbe6a4c6d1-merged.mount: Deactivated successfully.
Nov 28 10:00:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3-userdata-shm.mount: Deactivated successfully.
Nov 28 10:00:54 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d991320b8\x2db994\x2d4199\x2d922f\x2d5c3428b3e7ba.mount: Deactivated successfully.
Nov 28 10:00:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:54.069 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:54.533 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 10:00:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:54.576 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:00:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:54.577 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquired lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:00:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:54.577 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:00:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:54 np0005538513.localdomain ceph-mon[292954]: pgmap v89: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:54.911 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.170 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Updating instance_info_cache with network_info: [{"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.217 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Releasing lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.220 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.221 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Creating instance directory: /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.222 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Ensure instance console log exists: /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.222 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.224 279685 DEBUG nova.virt.libvirt.vif [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T10:00:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1337177779',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538514.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-1337177779',id=6,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T10:00:49Z,launched_on='np0005538514.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005538514.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3e4b394501d24dc7954ec5d2f27b8081',ramdisk_id='',reservation_id='r-mfjalp0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1153414438',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1153414438-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T10:00:49Z,user_data=None,user_id='c64867c2bac34a819c0995d0b72ee9a7',uuid=d716674a-ba14-466a-956f-5bca9404174f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.225 279685 DEBUG nova.network.os_vif_util [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Converting VIF {"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.226 279685 DEBUG nova.network.os_vif_util [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.226 279685 DEBUG os_vif [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.227 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.228 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.228 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.233 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4b2e0ba-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.234 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4b2e0ba-de, col_values=(('external_ids', {'iface-id': 'd4b2e0ba-de4a-4cfb-af66-1ed3abdde376', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:a4:65', 'vm-uuid': 'd716674a-ba14-466a-956f-5bca9404174f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.262 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.264 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.271 279685 INFO os_vif [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de')
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.272 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 10:00:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:55.273 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 10:00:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 do_prune osdmap full prune enabled
Nov 28 10:00:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e95 e95: 6 total, 6 up, 6 in
Nov 28 10:00:55 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e95: 6 total, 6 up, 6 in
Nov 28 10:00:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:56.813 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 updated with migration profile {'migrating_to': 'np0005538513.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 10:00:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:56.816 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 10:00:56 np0005538513.localdomain ceph-mon[292954]: pgmap v90: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:56 np0005538513.localdomain ceph-mon[292954]: osdmap e95: 6 total, 6 up, 6 in
Nov 28 10:00:57 np0005538513.localdomain sshd[308582]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:00:57 np0005538513.localdomain sshd[308582]: Accepted publickey for nova from 172.17.0.107 port 39414 ssh2: ECDSA SHA256:i2iq7ecxWJi+/Y2y/tJQSdSgYGpWyMNC9YvRCAzXl2w
Nov 28 10:00:57 np0005538513.localdomain systemd-logind[764]: New session 73 of user nova.
Nov 28 10:00:57 np0005538513.localdomain systemd[1]: Created slice User Slice of UID 42436.
Nov 28 10:00:57 np0005538513.localdomain systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 10:00:57 np0005538513.localdomain systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 10:00:57 np0005538513.localdomain systemd[1]: Starting User Manager for UID 42436...
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by (uid=0)
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Queued start job for default target Main User Target.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Created slice User Application Slice.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Reached target Paths.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Reached target Timers.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Starting D-Bus User Message Bus Socket...
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Starting Create User's Volatile Files and Directories...
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Listening on D-Bus User Message Bus Socket.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Reached target Sockets.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Finished Create User's Volatile Files and Directories.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Reached target Basic System.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Reached target Main User Target.
Nov 28 10:00:57 np0005538513.localdomain systemd[308586]: Startup finished in 169ms.
Nov 28 10:00:57 np0005538513.localdomain systemd[1]: Started User Manager for UID 42436.
Nov 28 10:00:57 np0005538513.localdomain systemd[1]: Started Session 73 of User nova.
Nov 28 10:00:57 np0005538513.localdomain sshd[308582]: pam_unix(sshd:session): session opened for user nova(uid=42436) by (uid=0)
Nov 28 10:00:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e95 do_prune osdmap full prune enabled
Nov 28 10:00:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 e96: 6 total, 6 up, 6 in
Nov 28 10:00:57 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e96: 6 total, 6 up, 6 in
Nov 28 10:00:58 np0005538513.localdomain systemd[1]: Started libvirt secret daemon.
Nov 28 10:00:58 np0005538513.localdomain kernel: device tapd4b2e0ba-de entered promiscuous mode
Nov 28 10:00:58 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324058.1471] manager: (tapd4b2e0ba-de): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Nov 28 10:00:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:58Z|00093|binding|INFO|Claiming lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for this additional chassis.
Nov 28 10:00:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:58Z|00094|binding|INFO|d4b2e0ba-de4a-4cfb-af66-1ed3abdde376: Claiming fa:16:3e:a9:a4:65 10.100.0.10
Nov 28 10:00:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:58Z|00095|binding|INFO|Claiming lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 for this additional chassis.
Nov 28 10:00:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:58Z|00096|binding|INFO|a48cbb27-d55f-41c4-a09f-9bbe3a14fe95: Claiming fa:16:3e:78:62:32 19.80.0.101
Nov 28 10:00:58 np0005538513.localdomain systemd-udevd[308635]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:00:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:58.153 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:58 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324058.1659] device (tapd4b2e0ba-de): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 10:00:58 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324058.1669] device (tapd4b2e0ba-de): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 28 10:00:58 np0005538513.localdomain systemd-machined[83422]: New machine qemu-3-instance-00000006.
Nov 28 10:00:58 np0005538513.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Nov 28 10:00:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:58.194 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:00:58Z|00097|binding|INFO|Setting lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 ovn-installed in OVS
Nov 28 10:00:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:58.197 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:58.479 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324058.4791348, d716674a-ba14-466a-956f-5bca9404174f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:00:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:58.482 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] VM Started (Lifecycle Event)
Nov 28 10:00:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:58.655 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:00:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:58.732 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:00:58 np0005538513.localdomain ceph-mon[292954]: pgmap v92: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 17 KiB/s wr, 103 op/s
Nov 28 10:00:58 np0005538513.localdomain ceph-mon[292954]: osdmap e96: 6 total, 6 up, 6 in
Nov 28 10:00:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:59.060 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324059.0596793, d716674a-ba14-466a-956f-5bca9404174f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:00:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:59.060 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] VM Resumed (Lifecycle Event)
Nov 28 10:00:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:59.079 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:00:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:59.083 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:00:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:59.101 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] During the sync_power process the instance has moved from host np0005538514.localdomain to host np0005538513.localdomain
Nov 28 10:00:59 np0005538513.localdomain sshd[308602]: Received disconnect from 172.17.0.107 port 39414:11: disconnected by user
Nov 28 10:00:59 np0005538513.localdomain sshd[308602]: Disconnected from user nova 172.17.0.107 port 39414
Nov 28 10:00:59 np0005538513.localdomain sshd[308582]: pam_unix(sshd:session): session closed for user nova
Nov 28 10:00:59 np0005538513.localdomain systemd[1]: session-73.scope: Deactivated successfully.
Nov 28 10:00:59 np0005538513.localdomain systemd-logind[764]: Session 73 logged out. Waiting for processes to exit.
Nov 28 10:00:59 np0005538513.localdomain systemd-logind[764]: Removed session 73.
Nov 28 10:00:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:00:59.947 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:00Z|00098|binding|INFO|Claiming lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for this chassis.
Nov 28 10:01:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:00Z|00099|binding|INFO|d4b2e0ba-de4a-4cfb-af66-1ed3abdde376: Claiming fa:16:3e:a9:a4:65 10.100.0.10
Nov 28 10:01:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:00Z|00100|binding|INFO|Claiming lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 for this chassis.
Nov 28 10:01:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:00Z|00101|binding|INFO|a48cbb27-d55f-41c4-a09f-9bbe3a14fe95: Claiming fa:16:3e:78:62:32 19.80.0.101
Nov 28 10:01:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:00Z|00102|binding|INFO|Setting lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 up in Southbound
Nov 28 10:01:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:00Z|00103|binding|INFO|Setting lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 up in Southbound
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.147 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:62:32 19.80.0.101'], port_security=['fa:16:3e:78:62:32 19.80.0.101'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['d4b2e0ba-de4a-4cfb-af66-1ed3abdde376'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-27110142', 'neutron:cidrs': '19.80.0.101/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-27110142', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=c5f09637-840e-43f3-af58-d197b914a787, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a48cbb27-d55f-41c4-a09f-9bbe3a14fe95) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.150 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:a4:65 10.100.0.10'], port_security=['fa:16:3e:a9:a4:65 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1851630912', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd716674a-ba14-466a-956f-5bca9404174f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1851630912', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d220f056-a923-484b-9df7-f648b3edde7c, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.152 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 in datapath 8c8d124f-f6e2-454a-9f65-e2e41a655306 bound to our chassis
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.155 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c8d124f-f6e2-454a-9f65-e2e41a655306
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.164 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b7fbf540-f971-429b-a007-8434119b5ce6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.165 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c8d124f-f1 in ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.167 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c8d124f-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.167 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[95e6f82d-11d7-4b04-b417-e7d6545066b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.169 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[25d37f20-b9e3-49c8-8ca3-9e6f00818a0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.186 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[1a050001-1d4b-463e-ad5d-6421cfe2a851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:00.187 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 req-6fcf345d-e21d-4204-a5a8-b222eb18b3be 9e5033e84dec44f4956046cabe7e22af e2c76e4d27554fd5a4f85cce208b136f - - default default] This port is not SRIOV, skip binding for port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376.
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.210 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[50146e27-96be-4cf7-a5a9-4553e242f39c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.236 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b0ab48-8666-4124-9517-36ca593cd851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324060.2450] manager: (tap8c8d124f-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Nov 28 10:01:00 np0005538513.localdomain systemd-udevd[308639]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.243 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[810a196f-3d57-4443-ab85-11aff5add4ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.265 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.293 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[c443fc6a-746a-40db-b2ba-20c2afd8a454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.297 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[75b32d95-e12f-4c44-93a7-df967fe390bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.299 279685 INFO nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Post operation of migration started
Nov 28 10:01:00 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8c8d124f-f1: link becomes ready
Nov 28 10:01:00 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8c8d124f-f0: link becomes ready
Nov 28 10:01:00 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324060.3234] device (tap8c8d124f-f0): carrier: link connected
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.329 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[7118e377-1d7f-4893-97f9-9207068bec74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.346 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6357979f-e600-4d4a-a3e0-336f6249ff9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c8d124f-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:ad:c1:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189442, 'reachable_time': 41884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308714, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.362 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7927ae6f-8ebd-4623-aef9-3cf0dc0a529c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:c1c7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1189442, 'tstamp': 1189442}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308715, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.376 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ac32bbb1-f6ad-4f8f-90d8-3824fc92a6c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c8d124f-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:ad:c1:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189442, 'reachable_time': 41884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308716, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.413 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec6bc8e-a07e-4bc9-9288-3867fb00f59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.459 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a460732b-b19c-4686-ae1e-a7be73cfe362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.460 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c8d124f-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.461 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.462 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c8d124f-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.464 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:00 np0005538513.localdomain kernel: device tap8c8d124f-f0 entered promiscuous mode
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.466 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.476 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c8d124f-f0, col_values=(('external_ids', {'iface-id': '7166677b-51c2-44e8-9170-e59542d1e9db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:00Z|00104|binding|INFO|Releasing lport 7166677b-51c2-44e8-9170-e59542d1e9db from this chassis (sb_readonly=0)
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.479 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.484 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c8d124f-f6e2-454a-9f65-e2e41a655306.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c8d124f-f6e2-454a-9f65-e2e41a655306.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.485 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.485 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquired lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.486 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.485 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ed53a37c-29d4-43a1-9bc6-7c407dc1a7d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.487 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: global
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     log         /dev/log local0 debug
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     log-tag     haproxy-metadata-proxy-8c8d124f-f6e2-454a-9f65-e2e41a655306
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     user        root
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     group       root
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     maxconn     1024
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     pidfile     /var/lib/neutron/external/pids/8c8d124f-f6e2-454a-9f65-e2e41a655306.pid.haproxy
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     daemon
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: defaults
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     log global
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     mode http
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     option httplog
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     option dontlognull
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     option http-server-close
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     option forwardfor
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     retries                 3
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-request    30s
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout connect         30s
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout client          32s
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout server          32s
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-keep-alive 30s
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: listen listener
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     bind 169.254.169.254:80
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:     http-request add-header X-OVN-Network-ID 8c8d124f-f6e2-454a-9f65-e2e41a655306
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 10:01:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:00.488 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'env', 'PROCESS_TAG=haproxy-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8c8d124f-f6e2-454a-9f65-e2e41a655306.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 10:01:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:00.495 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:00 np0005538513.localdomain podman[308748]: 
Nov 28 10:01:00 np0005538513.localdomain ceph-mon[292954]: pgmap v94: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 2.2 KiB/s wr, 59 op/s
Nov 28 10:01:00 np0005538513.localdomain podman[308748]: 2025-11-28 10:01:00.973048137 +0000 UTC m=+0.112264654 container create 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:01:01 np0005538513.localdomain systemd[1]: Started libpod-conmon-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1.scope.
Nov 28 10:01:01 np0005538513.localdomain podman[308748]: 2025-11-28 10:01:00.925579861 +0000 UTC m=+0.064796418 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 10:01:01 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:01 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696acd32f1fb59bc8c1e81a6c4eb9b6106d56a44647c38d6c3a5de7e8de7e41a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:01 np0005538513.localdomain podman[308748]: 2025-11-28 10:01:01.060152768 +0000 UTC m=+0.199369255 container init 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:01:01 np0005538513.localdomain podman[308748]: 2025-11-28 10:01:01.070618578 +0000 UTC m=+0.209835045 container start 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:01:01 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE]   (308766) : New worker (308768) forked
Nov 28 10:01:01 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE]   (308766) : Loading success.
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.113 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 in datapath b1cd9c9c-949c-46cf-bb45-dc659f651fc3 unbound from our chassis
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.117 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1cd9c9c-949c-46cf-bb45-dc659f651fc3
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.126 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[31c61d94-2447-49ac-9bcb-1b8fbe584432]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.127 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1cd9c9c-91 in ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.129 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1cd9c9c-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.129 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5e55d1ad-0f7f-4d83-a3d5-076393f0f379]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.131 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[090d44b8-ff67-4f6c-8a65-29f329b328a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.141 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[84547a6a-6e54-4fc9-9c08-98165faa1134]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.151 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bd10e679-473a-43ea-a93d-31640388e14d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.177 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Updating instance_info_cache with network_info: [{"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:01 np0005538513.localdomain systemd[1]: tmp-crun.epAAtR.mount: Deactivated successfully.
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.189 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1ed1f1-6218-4ef0-b835-31817f138dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324061.2004] manager: (tapb1cd9c9c-90): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.198 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8500c4-ea4b-4a44-a6d7-d24d741867a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain systemd-udevd[308700]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.206 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Releasing lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.233 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.234 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.235 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.237 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[3fce0ee6-0961-45cd-b0c0-3154f46d79f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.245 279685 INFO nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 10:01:01 np0005538513.localdomain virtqemud[201490]: Domain id=3 name='instance-00000006' uuid=d716674a-ba14-466a-956f-5bca9404174f is tainted: custom-monitor
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.246 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[1f84b275-18da-43ef-a612-b315a08bb44f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapb1cd9c9c-91: link becomes ready
Nov 28 10:01:01 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapb1cd9c9c-90: link becomes ready
Nov 28 10:01:01 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324061.2763] device (tapb1cd9c9c-90): carrier: link connected
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.282 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[cd157c62-d8b3-4683-9cfe-047db9413887]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.299 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f8fc4178-7f78-4f44-a638-92974003268f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1cd9c9c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:10:2e:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189537, 'reachable_time': 27294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308787, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.328 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2d08e6d1-1749-49c7-b909-e23864444300]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:2e50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1189537, 'tstamp': 1189537}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308788, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.347 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae6a962-e4ae-4c36-b12d-339fe09d79c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1cd9c9c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:10:2e:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189537, 'reachable_time': 27294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308789, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.377 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[10013937-f8ef-4344-b842-5de96a45f25e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain CROND[308795]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 10:01:01 np0005538513.localdomain run-parts[308798]: (/etc/cron.hourly) starting 0anacron
Nov 28 10:01:01 np0005538513.localdomain run-parts[308806]: (/etc/cron.hourly) finished 0anacron
Nov 28 10:01:01 np0005538513.localdomain CROND[308794]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.443 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e2fe69-92cb-4e5b-90da-240b931dce83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.445 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1cd9c9c-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.446 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.447 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1cd9c9c-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:01 np0005538513.localdomain kernel: device tapb1cd9c9c-90 entered promiscuous mode
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.451 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.459 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1cd9c9c-90, col_values=(('external_ids', {'iface-id': 'bbc6954a-495e-4b48-9eb4-0e6cdcdb602f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:01Z|00105|binding|INFO|Releasing lport bbc6954a-495e-4b48-9eb4-0e6cdcdb602f from this chassis (sb_readonly=0)
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.461 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.463 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:01.468 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.467 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.470 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c98d1de9-6e4e-4636-a48c-84442d25feae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.471 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: global
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     log         /dev/log local0 debug
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     log-tag     haproxy-metadata-proxy-b1cd9c9c-949c-46cf-bb45-dc659f651fc3
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     user        root
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     group       root
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     maxconn     1024
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     pidfile     /var/lib/neutron/external/pids/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.pid.haproxy
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     daemon
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: defaults
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     log global
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     mode http
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     option httplog
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     option dontlognull
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     option http-server-close
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     option forwardfor
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     retries                 3
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-request    30s
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout connect         30s
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout client          32s
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout server          32s
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-keep-alive 30s
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: listen listener
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     bind 169.254.169.254:80
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:     http-request add-header X-OVN-Network-ID b1cd9c9c-949c-46cf-bb45-dc659f651fc3
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 10:01:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:01.474 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'env', 'PROCESS_TAG=haproxy-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 10:01:01 np0005538513.localdomain podman[308832]: 
Nov 28 10:01:01 np0005538513.localdomain podman[308832]: 2025-11-28 10:01:01.911837945 +0000 UTC m=+0.082748219 container create 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:01:01 np0005538513.localdomain podman[308832]: 2025-11-28 10:01:01.867718412 +0000 UTC m=+0.038628716 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 10:01:01 np0005538513.localdomain systemd[1]: Started libpod-conmon-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45.scope.
Nov 28 10:01:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1af4d90b8638c3920ed542c915db741d94ff67735a1d7fb294f7e55b2f0fb4c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:02 np0005538513.localdomain podman[308832]: 2025-11-28 10:01:02.02679528 +0000 UTC m=+0.197705534 container init 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:01:02 np0005538513.localdomain podman[308832]: 2025-11-28 10:01:02.037088585 +0000 UTC m=+0.207998849 container start 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:01:02 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE]   (308850) : New worker (308852) forked
Nov 28 10:01:02 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE]   (308850) : Loading success.
Nov 28 10:01:02 np0005538513.localdomain systemd[1]: tmp-crun.op1aZ0.mount: Deactivated successfully.
Nov 28 10:01:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:02.256 279685 INFO nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 10:01:02 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 28 10:01:02 np0005538513.localdomain ceph-mon[292954]: pgmap v95: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 4.0 KiB/s wr, 99 op/s
Nov 28 10:01:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3260841447' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:03.261 279685 INFO nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 10:01:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:03.268 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:03.294 279685 DEBUG nova.objects.instance [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 10:01:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:01:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:01:03 np0005538513.localdomain podman[308861]: 2025-11-28 10:01:03.866160914 +0000 UTC m=+0.102296707 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:01:03 np0005538513.localdomain podman[308862]: 2025-11-28 10:01:03.911571578 +0000 UTC m=+0.143653437 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd)
Nov 28 10:01:03 np0005538513.localdomain podman[308862]: 2025-11-28 10:01:03.921715808 +0000 UTC m=+0.153797647 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 28 10:01:03 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:01:03 np0005538513.localdomain podman[308861]: 2025-11-28 10:01:03.975854358 +0000 UTC m=+0.211990151 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:01:03 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:a4:65 10.100.0.10
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:a4:65 10.100.0.10
Nov 28 10:01:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:04 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3322246960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 do_prune osdmap full prune enabled
Nov 28 10:01:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 e97: 6 total, 6 up, 6 in
Nov 28 10:01:04 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e97: 6 total, 6 up, 6 in
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.884 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.905 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.905 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.906 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.906 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.906 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.907 279685 INFO nova.compute.manager [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Terminating instance
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.908 279685 DEBUG nova.compute.manager [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:04 np0005538513.localdomain kernel: device tapd4b2e0ba-de left promiscuous mode
Nov 28 10:01:04 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324064.9731] device (tapd4b2e0ba-de): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00106|binding|INFO|Releasing lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 from this chassis (sb_readonly=0)
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00107|binding|INFO|Setting lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 down in Southbound
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00108|binding|INFO|Releasing lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 from this chassis (sb_readonly=0)
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00109|binding|INFO|Setting lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 down in Southbound
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00110|binding|INFO|Removing iface tapd4b2e0ba-de ovn-installed in OVS
Nov 28 10:01:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:04.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00111|binding|INFO|Releasing lport bbc6954a-495e-4b48-9eb4-0e6cdcdb602f from this chassis (sb_readonly=0)
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00112|binding|INFO|Releasing lport 7166677b-51c2-44e8-9170-e59542d1e9db from this chassis (sb_readonly=0)
Nov 28 10:01:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:04Z|00113|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:04.997 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:62:32 19.80.0.101'], port_security=['fa:16:3e:78:62:32 19.80.0.101'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['d4b2e0ba-de4a-4cfb-af66-1ed3abdde376'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-27110142', 'neutron:cidrs': '19.80.0.101/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-27110142', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=c5f09637-840e-43f3-af58-d197b914a787, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a48cbb27-d55f-41c4-a09f-9bbe3a14fe95) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:04.998 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:a4:65 10.100.0.10'], port_security=['fa:16:3e:a9:a4:65 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1851630912', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd716674a-ba14-466a-956f-5bca9404174f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1851630912', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d220f056-a923-484b-9df7-f648b3edde7c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:04.999 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 in datapath 8c8d124f-f6e2-454a-9f65-e2e41a655306 unbound from our chassis
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.001 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c8d124f-f6e2-454a-9f65-e2e41a655306, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.001 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[55741819-ec56-4af7-9cf0-ab9db6d0390a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.002 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 namespace which is not needed anymore
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.004 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.013 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 28 10:01:05 np0005538513.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 4.409s CPU time.
Nov 28 10:01:05 np0005538513.localdomain systemd-machined[83422]: Machine qemu-3-instance-00000006 terminated.
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.147 279685 INFO nova.virt.libvirt.driver [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Instance destroyed successfully.
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.149 279685 DEBUG nova.objects.instance [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lazy-loading 'resources' on Instance uuid d716674a-ba14-466a-956f-5bca9404174f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.165 279685 DEBUG nova.virt.libvirt.vif [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T10:00:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1337177779',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-1337177779',id=6,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T10:00:49Z,launched_on='np0005538514.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e4b394501d24dc7954ec5d2f27b8081',ramdisk_id='',reservation_id='r-mfjalp0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1153414438',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1153414438-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T10:01:03Z,user_data=None,user_id='c64867c2bac34a819c0995d0b72ee9a7',uuid=d716674a-ba14-466a-956f-5bca9404174f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.165 279685 DEBUG nova.network.os_vif_util [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Converting VIF {"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.166 279685 DEBUG nova.network.os_vif_util [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.167 279685 DEBUG os_vif [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.170 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.170 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4b2e0ba-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.172 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE]   (308766) : haproxy version is 2.8.14-c23fe91
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE]   (308766) : path to executable is /usr/sbin/haproxy
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [WARNING]  (308766) : Exiting Master process...
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [WARNING]  (308766) : Exiting Master process...
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.182 279685 INFO os_vif [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de')
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [ALERT]    (308766) : Current worker (308768) exited with code 143 (Terminated)
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [WARNING]  (308766) : All workers exited. Exiting... (0)
Nov 28 10:01:05 np0005538513.localdomain systemd[1]: libpod-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1.scope: Deactivated successfully.
Nov 28 10:01:05 np0005538513.localdomain podman[308930]: 2025-11-28 10:01:05.194914862 +0000 UTC m=+0.093758727 container died 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:01:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:05 np0005538513.localdomain podman[308930]: 2025-11-28 10:01:05.230323597 +0000 UTC m=+0.129167432 container cleanup 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:01:05 np0005538513.localdomain podman[308962]: 2025-11-28 10:01:05.261480432 +0000 UTC m=+0.060851246 container cleanup 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:01:05 np0005538513.localdomain systemd[1]: libpod-conmon-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1.scope: Deactivated successfully.
Nov 28 10:01:05 np0005538513.localdomain ceph-mon[292954]: pgmap v96: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 4.0 KiB/s wr, 99 op/s
Nov 28 10:01:05 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3322246960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:05 np0005538513.localdomain ceph-mon[292954]: osdmap e97: 6 total, 6 up, 6 in
Nov 28 10:01:05 np0005538513.localdomain podman[308983]: 2025-11-28 10:01:05.304968526 +0000 UTC m=+0.056488193 container remove 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.310 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5b2f0a-e3f0-4a97-8223-9320447b312f]: (4, ('Fri Nov 28 10:01:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 (0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1)\n0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1\nFri Nov 28 10:01:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 (0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1)\n0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.312 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd55339-c470-485e-a964-13718b0b468a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.313 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c8d124f-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.316 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain kernel: device tap8c8d124f-f0 left promiscuous mode
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.324 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.327 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[cc8db5d5-27d1-47f9-982e-43704ec69e7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.346 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4117bfaa-2041-4563-a70a-a743a9af7e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.347 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[da793837-3d6e-4547-a2ff-6897c4e37db8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.356 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[431035ce-eb41-4561-971a-c58779c85699]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189433, 'reachable_time': 29335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309006, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.358 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.359 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[c5102584-41b5-4160-9fc5-4364d2d35b57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.359 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 in datapath b1cd9c9c-949c-46cf-bb45-dc659f651fc3 unbound from our chassis
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.361 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1cd9c9c-949c-46cf-bb45-dc659f651fc3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.362 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbf1ddd-6397-4e13-8954-934cea4eb50a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.363 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 namespace which is not needed anymore
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE]   (308850) : haproxy version is 2.8.14-c23fe91
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE]   (308850) : path to executable is /usr/sbin/haproxy
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [WARNING]  (308850) : Exiting Master process...
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [WARNING]  (308850) : Exiting Master process...
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [ALERT]    (308850) : Current worker (308852) exited with code 143 (Terminated)
Nov 28 10:01:05 np0005538513.localdomain neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [WARNING]  (308850) : All workers exited. Exiting... (0)
Nov 28 10:01:05 np0005538513.localdomain systemd[1]: libpod-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45.scope: Deactivated successfully.
Nov 28 10:01:05 np0005538513.localdomain podman[309024]: 2025-11-28 10:01:05.532855284 +0000 UTC m=+0.079622732 container died 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:01:05 np0005538513.localdomain podman[309024]: 2025-11-28 10:01:05.571558211 +0000 UTC m=+0.118325579 container cleanup 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:01:05 np0005538513.localdomain podman[309036]: 2025-11-28 10:01:05.606789922 +0000 UTC m=+0.070644648 container cleanup 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:01:05 np0005538513.localdomain systemd[1]: libpod-conmon-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45.scope: Deactivated successfully.
Nov 28 10:01:05 np0005538513.localdomain podman[309051]: 2025-11-28 10:01:05.650227774 +0000 UTC m=+0.063803198 container remove 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.654 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6900e654-282a-434a-8331-e31e9ce465cd]: (4, ('Fri Nov 28 10:01:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 (3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45)\n3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45\nFri Nov 28 10:01:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 (3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45)\n3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.656 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[45df0646-e119-4187-81a9-af756ff16ba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.657 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1cd9c9c-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.659 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain kernel: device tapb1cd9c9c-90 left promiscuous mode
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.665 279685 DEBUG nova.compute.manager [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received event network-vif-unplugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.665 279685 DEBUG oslo_concurrency.lockutils [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.666 279685 DEBUG oslo_concurrency.lockutils [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.666 279685 DEBUG oslo_concurrency.lockutils [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.667 279685 DEBUG nova.compute.manager [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] No waiting events found dispatching network-vif-unplugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.668 279685 DEBUG nova.compute.manager [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received event network-vif-unplugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 10:01:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:05.669 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.670 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd45e0d-2ba3-4747-a921-5bac20cb4ebc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.684 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6040aab6-17d1-438e-96d2-a3290b821780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.685 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c99ea8-6d17-4d33-b181-1e5c5d66db6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.699 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b24c2e00-46e3-4405-93f3-1cddacaff20f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189527, 'reachable_time': 31604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309072, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.701 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 10:01:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:05.701 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[b8745d41-3d10-49f7-8034-1bd9728047a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1af4d90b8638c3920ed542c915db741d94ff67735a1d7fb294f7e55b2f0fb4c9-merged.mount: Deactivated successfully.
Nov 28 10:01:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:06 np0005538513.localdomain systemd[1]: run-netns-ovnmeta\x2db1cd9c9c\x2d949c\x2d46cf\x2dbb45\x2ddc659f651fc3.mount: Deactivated successfully.
Nov 28 10:01:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-696acd32f1fb59bc8c1e81a6c4eb9b6106d56a44647c38d6c3a5de7e8de7e41a-merged.mount: Deactivated successfully.
Nov 28 10:01:06 np0005538513.localdomain systemd[1]: run-netns-ovnmeta\x2d8c8d124f\x2df6e2\x2d454a\x2d9f65\x2de2e41a655306.mount: Deactivated successfully.
Nov 28 10:01:06 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/593006997' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:06.586 279685 INFO nova.virt.libvirt.driver [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Deleting instance files /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f_del
Nov 28 10:01:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:06.587 279685 INFO nova.virt.libvirt.driver [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Deletion of /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f_del complete
Nov 28 10:01:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:06.676 279685 INFO nova.compute.manager [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Took 1.77 seconds to destroy the instance on the hypervisor.
Nov 28 10:01:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:06.677 279685 DEBUG oslo.service.loopingcall [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 10:01:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:06.678 279685 DEBUG nova.compute.manager [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 10:01:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:06.678 279685 DEBUG nova.network.neutron [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 10:01:07 np0005538513.localdomain ceph-mon[292954]: pgmap v98: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 1.7 KiB/s wr, 39 op/s
Nov 28 10:01:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:07.779 279685 DEBUG nova.compute.manager [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received event network-vif-plugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:07.779 279685 DEBUG oslo_concurrency.lockutils [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:07.780 279685 DEBUG oslo_concurrency.lockutils [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:07.780 279685 DEBUG oslo_concurrency.lockutils [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:07.781 279685 DEBUG nova.compute.manager [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] No waiting events found dispatching network-vif-plugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:07.781 279685 WARNING nova.compute.manager [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received unexpected event network-vif-plugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for instance with vm_state active and task_state deleting.
Nov 28 10:01:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2931902013' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3475554474' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:09 np0005538513.localdomain ceph-mon[292954]: pgmap v99: 177 pgs: 177 active+clean; 175 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 2.5 MiB/s rd, 3.8 MiB/s wr, 166 op/s
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: Stopping User Manager for UID 42436...
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Activating special unit Exit the Session...
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped target Main User Target.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped target Basic System.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped target Paths.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped target Sockets.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped target Timers.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Closed D-Bus User Message Bus Socket.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Stopped Create User's Volatile Files and Directories.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Removed slice User Application Slice.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Reached target Shutdown.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Finished Exit the Session.
Nov 28 10:01:09 np0005538513.localdomain systemd[308586]: Reached target Exit the Session.
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: Stopped User Manager for UID 42436.
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 10:01:09 np0005538513.localdomain systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 10:01:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:09.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:01:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:01:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:01:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:01:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:01:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19730 "" "Go-http-client/1.1"
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.194 279685 DEBUG nova.network.neutron [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.236 279685 INFO nova.compute.manager [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Took 3.56 seconds to deallocate network for instance.
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.297 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.297 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.300 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.528 279685 INFO nova.scheduler.client.report [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Deleted allocations for instance d716674a-ba14-466a-956f-5bca9404174f
Nov 28 10:01:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:10.590 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:11.298 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:11 np0005538513.localdomain ceph-mon[292954]: pgmap v100: 177 pgs: 177 active+clean; 175 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.7 MiB/s wr, 162 op/s
Nov 28 10:01:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:11.677 2 INFO neutron.agent.securitygroups_rpc [None req-64ca5811-bbe2-4768-b546-3f3c65a295fa c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:01:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:13.328 2 INFO neutron.agent.securitygroups_rpc [None req-0bbc429e-2462-4e1e-9b65-ff3c254999c8 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:13 np0005538513.localdomain ceph-mon[292954]: pgmap v101: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 4.7 MiB/s wr, 174 op/s
Nov 28 10:01:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1924445682' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:01:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1924445682' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:01:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:13.667 2 INFO neutron.agent.securitygroups_rpc [None req-8218c41f-1821-4284-8a71-4b98eaf9d107 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:01:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:01:13 np0005538513.localdomain podman[309075]: 2025-11-28 10:01:13.841064259 +0000 UTC m=+0.078561000 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Nov 28 10:01:13 np0005538513.localdomain podman[309075]: 2025-11-28 10:01:13.853698067 +0000 UTC m=+0.091194858 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, version=9.6, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Nov 28 10:01:13 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:01:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:14.405 261084 INFO neutron.agent.linux.ip_lib [None req-2cee969e-79d6-443f-b01f-72d840f98265 - - - - - -] Device tap26794c2c-f6 cannot be used as it has no MAC address
Nov 28 10:01:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:14.468 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:14 np0005538513.localdomain kernel: device tap26794c2c-f6 entered promiscuous mode
Nov 28 10:01:14 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324074.4775] manager: (tap26794c2c-f6): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Nov 28 10:01:14 np0005538513.localdomain systemd-udevd[309108]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:14Z|00114|binding|INFO|Claiming lport 26794c2c-f636-431d-8d25-cfbabb72ae33 for this chassis.
Nov 28 10:01:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:14Z|00115|binding|INFO|26794c2c-f636-431d-8d25-cfbabb72ae33: Claiming unknown
Nov 28 10:01:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:14.481 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:14.495 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6dce7e95fa0443beb41563da37907095', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8b0f643-a0b0-4f6d-b3fc-848bb5b1dd8d, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=26794c2c-f636-431d-8d25-cfbabb72ae33) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:14.500 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 26794c2c-f636-431d-8d25-cfbabb72ae33 in datapath b42f9a09-f299-4469-8a2a-b6b8c70a7aed bound to our chassis
Nov 28 10:01:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:14.502 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b42f9a09-f299-4469-8a2a-b6b8c70a7aed or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:01:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:14.503 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1a689ffc-cbce-4612-ae8a-a1a395f6b8b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:14Z|00116|binding|INFO|Setting lport 26794c2c-f636-431d-8d25-cfbabb72ae33 ovn-installed in OVS
Nov 28 10:01:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:14Z|00117|binding|INFO|Setting lport 26794c2c-f636-431d-8d25-cfbabb72ae33 up in Southbound
Nov 28 10:01:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:14.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:14.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap26794c2c-f6: No such device
Nov 28 10:01:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:14.557 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:14.582 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:14.983 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:15.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:15 np0005538513.localdomain sudo[309167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:01:15 np0005538513.localdomain sudo[309167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:01:15 np0005538513.localdomain sudo[309167]: pam_unix(sudo:session): session closed for user root
Nov 28 10:01:15 np0005538513.localdomain sudo[309192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:01:15 np0005538513.localdomain sudo[309192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:01:15 np0005538513.localdomain podman[309205]: 
Nov 28 10:01:15 np0005538513.localdomain podman[309205]: 2025-11-28 10:01:15.426002191 +0000 UTC m=+0.085283276 container create 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:01:15 np0005538513.localdomain ceph-mon[292954]: pgmap v102: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 4.7 MiB/s wr, 174 op/s
Nov 28 10:01:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242.scope.
Nov 28 10:01:15 np0005538513.localdomain podman[309205]: 2025-11-28 10:01:15.38749644 +0000 UTC m=+0.046777575 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:01:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d5f4174ee8c8f2aa1c8d3fbcaed9ef7f054f6f66b168db2dbfe4a5c6fb91850/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:15 np0005538513.localdomain podman[309205]: 2025-11-28 10:01:15.505263712 +0000 UTC m=+0.164544797 container init 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:01:15 np0005538513.localdomain podman[309205]: 2025-11-28 10:01:15.515549767 +0000 UTC m=+0.174830852 container start 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:01:15 np0005538513.localdomain dnsmasq[309233]: started, version 2.85 cachesize 150
Nov 28 10:01:15 np0005538513.localdomain dnsmasq[309233]: DNS service limited to local subnets
Nov 28 10:01:15 np0005538513.localdomain dnsmasq[309233]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:01:15 np0005538513.localdomain dnsmasq[309233]: warning: no upstream servers configured
Nov 28 10:01:15 np0005538513.localdomain dnsmasq-dhcp[309233]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:01:15 np0005538513.localdomain dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 0 addresses
Nov 28 10:01:15 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host
Nov 28 10:01:15 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts
Nov 28 10:01:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:15.702 261084 INFO neutron.agent.dhcp.agent [None req-9108ab0a-0bfa-4ba6-bdf0-e16d29ad2d9f - - - - - -] DHCP configuration for ports {'94f53e87-a0f5-4c4e-ba83-ece1220ca48d'} is completed
Nov 28 10:01:16 np0005538513.localdomain sudo[309192]: pam_unix(sudo:session): session closed for user root
Nov 28 10:01:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:01:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:01:16 np0005538513.localdomain sudo[309265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:01:16 np0005538513.localdomain sudo[309265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:01:16 np0005538513.localdomain sudo[309265]: pam_unix(sudo:session): session closed for user root
Nov 28 10:01:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:01:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:01:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:01:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:01:17 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:17.036 2 INFO neutron.agent.securitygroups_rpc [None req-2679fbf8-01c0-46c1-b86d-7a154868a163 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:17.176 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:17 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:17Z|00118|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:01:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:17.358 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:17 np0005538513.localdomain ceph-mon[292954]: pgmap v103: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 4.4 MiB/s wr, 161 op/s
Nov 28 10:01:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:18.087 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:17Z, description=, device_id=350c5687-2c97-42e6-96bf-0b6c681cec37, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66a2940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66a2d00>], id=9b55c433-acbb-4821-80ef-e5fdd0471217, ip_allocation=immediate, mac_address=fa:16:3e:a0:00:c1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:12Z, description=, dns_domain=, id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1727831009-network, port_security_enabled=True, project_id=6dce7e95fa0443beb41563da37907095, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, status=ACTIVE, subnets=['0396b46f-a28c-409c-94f1-33c424a08c25'], tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, updated_at=2025-11-28T10:01:13Z, vlan_transparent=None, network_id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, port_security_enabled=False, project_id=6dce7e95fa0443beb41563da37907095, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=590, status=DOWN, tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, updated_at=2025-11-28T10:01:17Z on network b42f9a09-f299-4469-8a2a-b6b8c70a7aed
Nov 28 10:01:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:01:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:01:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:01:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:01:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:01:18 np0005538513.localdomain podman[309298]: 2025-11-28 10:01:18.337823492 +0000 UTC m=+0.055615417 container kill 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:01:18 np0005538513.localdomain dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 1 addresses
Nov 28 10:01:18 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host
Nov 28 10:01:18 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts
Nov 28 10:01:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:01:18 np0005538513.localdomain podman[309312]: 2025-11-28 10:01:18.452859339 +0000 UTC m=+0.082546731 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:01:18 np0005538513.localdomain podman[309312]: 2025-11-28 10:01:18.460977508 +0000 UTC m=+0.090664890 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:01:18 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:01:18 np0005538513.localdomain ceph-mon[292954]: pgmap v104: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 3.9 MiB/s wr, 194 op/s
Nov 28 10:01:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:18.559 261084 INFO neutron.agent.dhcp.agent [None req-b5036dfa-d387-4221-b087-919466830ae4 - - - - - -] DHCP configuration for ports {'9b55c433-acbb-4821-80ef-e5fdd0471217'} is completed
Nov 28 10:01:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:18.984 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:17Z, description=, device_id=350c5687-2c97-42e6-96bf-0b6c681cec37, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6eed790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6eed9d0>], id=9b55c433-acbb-4821-80ef-e5fdd0471217, ip_allocation=immediate, mac_address=fa:16:3e:a0:00:c1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:12Z, description=, dns_domain=, id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1727831009-network, port_security_enabled=True, project_id=6dce7e95fa0443beb41563da37907095, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, status=ACTIVE, subnets=['0396b46f-a28c-409c-94f1-33c424a08c25'], tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, updated_at=2025-11-28T10:01:13Z, vlan_transparent=None, network_id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, port_security_enabled=False, project_id=6dce7e95fa0443beb41563da37907095, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=590, status=DOWN, tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, updated_at=2025-11-28T10:01:17Z on network b42f9a09-f299-4469-8a2a-b6b8c70a7aed
Nov 28 10:01:19 np0005538513.localdomain dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 1 addresses
Nov 28 10:01:19 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host
Nov 28 10:01:19 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts
Nov 28 10:01:19 np0005538513.localdomain podman[309359]: 2025-11-28 10:01:19.217188618 +0000 UTC m=+0.062561739 container kill 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:19 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:19.437 261084 INFO neutron.agent.dhcp.agent [None req-4fdaed7f-fb3f-496e-89dc-71411dca00f2 - - - - - -] DHCP configuration for ports {'9b55c433-acbb-4821-80ef-e5fdd0471217'} is completed
Nov 28 10:01:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:19.987 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.037 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.038 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.059 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.130 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.131 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.137 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.137 279685 INFO nova.compute.claims [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Claim successful on node np0005538513.localdomain
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.143 279685 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764324065.1428251, d716674a-ba14-466a-956f-5bca9404174f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.143 279685 INFO nova.compute.manager [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] VM Stopped (Lifecycle Event)
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.177 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.183 279685 DEBUG nova.compute.manager [None req-605a5be0-9c41-4684-8485-0a4e64da372b - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.293 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:20 np0005538513.localdomain dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 0 addresses
Nov 28 10:01:20 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host
Nov 28 10:01:20 np0005538513.localdomain dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts
Nov 28 10:01:20 np0005538513.localdomain podman[309416]: 2025-11-28 10:01:20.546101289 +0000 UTC m=+0.058135613 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:01:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:01:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:01:20 np0005538513.localdomain podman[309429]: 2025-11-28 10:01:20.658516597 +0000 UTC m=+0.091553049 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:01:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:01:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/88113730' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.759 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:20 np0005538513.localdomain systemd[1]: tmp-crun.fqo8mg.mount: Deactivated successfully.
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.778 279685 DEBUG nova.compute.provider_tree [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:01:20 np0005538513.localdomain podman[309429]: 2025-11-28 10:01:20.782308183 +0000 UTC m=+0.215344665 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:20 np0005538513.localdomain kernel: device tap2b1e8904-1c left promiscuous mode
Nov 28 10:01:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:20Z|00119|binding|INFO|Releasing lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 from this chassis (sb_readonly=0)
Nov 28 10:01:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:20Z|00120|binding|INFO|Setting lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 down in Southbound
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.783 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:20.790 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2159235bf1c5407eac7a3e3826561913', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=20ff3eab-119a-4740-918d-4005c52a4e27, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=2b1e8904-1c88-4828-a7bc-9f34a2930819) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.791 279685 DEBUG nova.scheduler.client.report [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:01:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:20.791 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1e8904-1c88-4828-a7bc-9f34a2930819 in datapath 0303a35a-aae2-4e58-b0e5-9091112c9857 unbound from our chassis
Nov 28 10:01:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:20.793 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0303a35a-aae2-4e58-b0e5-9091112c9857, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:20.794 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[da90546a-3f4f-4c65-93b4-3602a757f5cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:20 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.814 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.815 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 10:01:20 np0005538513.localdomain podman[309430]: 2025-11-28 10:01:20.783688905 +0000 UTC m=+0.209709872 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 10:01:20 np0005538513.localdomain ceph-mon[292954]: pgmap v105: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 873 KiB/s wr, 86 op/s
Nov 28 10:01:20 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/88113730' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:01:20 np0005538513.localdomain podman[309430]: 2025-11-28 10:01:20.86766256 +0000 UTC m=+0.293683547 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.871 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.872 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 10:01:20 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.893 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.910 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.990 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.991 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 10:01:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:20.992 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating image(s)
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.030 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.069 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.110 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.116 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.117 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.187 279685 DEBUG nova.virt.libvirt.imagebackend [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Image locations are: [{'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/85968a96-5a0e-43a4-9c04-3954f640a7ed/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/85968a96-5a0e-43a4-9c04-3954f640a7ed/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.258 279685 WARNING oslo_policy.policy [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.258 279685 WARNING oslo_policy.policy [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 28 10:01:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:21.262 279685 DEBUG nova.policy [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '318114281cb649bc9eeed12ecdc7273f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '310745a04bd441169ff77f55ccf6bd7b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.074 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.162 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.163 279685 DEBUG nova.virt.images [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] 85968a96-5a0e-43a4-9c04-3954f640a7ed was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.165 279685 DEBUG nova.privsep.utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.165 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.470 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted" returned: 0 in 0.305s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.473 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.516 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.517 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:22 np0005538513.localdomain dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 0 addresses
Nov 28 10:01:22 np0005538513.localdomain podman[309561]: 2025-11-28 10:01:22.520627829 +0000 UTC m=+0.044272178 container kill 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:01:22 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host
Nov 28 10:01:22 np0005538513.localdomain dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.544 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.549 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:22Z|00121|binding|INFO|Releasing lport 26794c2c-f636-431d-8d25-cfbabb72ae33 from this chassis (sb_readonly=0)
Nov 28 10:01:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:22Z|00122|binding|INFO|Setting lport 26794c2c-f636-431d-8d25-cfbabb72ae33 down in Southbound
Nov 28 10:01:22 np0005538513.localdomain kernel: device tap26794c2c-f6 left promiscuous mode
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.662 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.688 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:22.690 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:22.744 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6dce7e95fa0443beb41563da37907095', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8b0f643-a0b0-4f6d-b3fc-848bb5b1dd8d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=26794c2c-f636-431d-8d25-cfbabb72ae33) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:22.747 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 26794c2c-f636-431d-8d25-cfbabb72ae33 in datapath b42f9a09-f299-4469-8a2a-b6b8c70a7aed unbound from our chassis
Nov 28 10:01:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:22.751 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:22.752 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ac9ff6-2a4f-4c7c-bd52-cdc8a4a99b30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:22 np0005538513.localdomain ceph-mon[292954]: pgmap v106: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 873 KiB/s wr, 93 op/s
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.017 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.110 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] resizing rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.182 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Successfully updated port: 62b8533f-b250-4475-80c2-28c4543536b5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.214 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.214 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.215 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.225 279685 DEBUG nova.compute.manager [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-changed-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.226 279685 DEBUG nova.compute.manager [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing instance network info cache due to event network-changed-62b8533f-b250-4475-80c2-28c4543536b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.226 279685 DEBUG oslo_concurrency.lockutils [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.320 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.332 279685 DEBUG nova.objects.instance [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lazy-loading 'migration_context' on Instance uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.361 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.361 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Ensure instance console log exists: /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.362 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.363 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.363 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.680 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.712 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Releasing lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.712 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance network_info: |[{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.713 279685 DEBUG oslo_concurrency.lockutils [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.714 279685 DEBUG nova.network.neutron [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.719 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start _get_guest_xml network_info=[{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T09:59:44Z,direct_url=<?>,disk_format='qcow2',id=85968a96-5a0e-43a4-9c04-3954f640a7ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dda653c53224db086060962b0702694',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T09:59:46Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'image_id': '85968a96-5a0e-43a4-9c04-3954f640a7ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.725 279685 WARNING nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.728 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.729 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.737 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.738 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.739 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.739 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T09:59:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98f289d4-5c06-4ab5-9089-7b580870d676',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T09:59:44Z,direct_url=<?>,disk_format='qcow2',id=85968a96-5a0e-43a4-9c04-3954f640a7ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dda653c53224db086060962b0702694',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T09:59:46Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.740 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.740 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.741 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.741 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.742 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.742 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.743 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.743 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.744 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.744 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 10:01:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:23.749 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:01:23 np0005538513.localdomain podman[309697]: 2025-11-28 10:01:23.868399099 +0000 UTC m=+0.099451690 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:01:23 np0005538513.localdomain podman[309697]: 2025-11-28 10:01:23.913798262 +0000 UTC m=+0.144850823 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:01:23 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/85736152' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.223 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.255 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.260 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:24 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:24Z|00123|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.463 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.476 279685 DEBUG nova.network.neutron [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updated VIF entry in instance network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.477 279685 DEBUG nova.network.neutron [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.507 279685 DEBUG oslo_concurrency.lockutils [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Releasing lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2373481162' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.704 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.706 279685 DEBUG nova.virt.libvirt.vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T10:01:20Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.707 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.708 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.710 279685 DEBUG nova.objects.instance [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lazy-loading 'pci_devices' on Instance uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.728 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] End _get_guest_xml xml=<domain type="kvm">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <uuid>c06e2ffc-a8af-41b6-ab88-680ef1f6fe50</uuid>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <name>instance-00000008</name>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <memory>131072</memory>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <vcpu>1</vcpu>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <metadata>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <nova:name>tempest-LiveMigrationTest-server-915340611</nova:name>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <nova:creationTime>2025-11-28 10:01:23</nova:creationTime>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <nova:flavor name="m1.nano">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:memory>128</nova:memory>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:disk>1</nova:disk>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:swap>0</nova:swap>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:vcpus>1</nova:vcpus>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       </nova:flavor>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <nova:owner>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:user uuid="318114281cb649bc9eeed12ecdc7273f">tempest-LiveMigrationTest-480152442-project-member</nova:user>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:project uuid="310745a04bd441169ff77f55ccf6bd7b">tempest-LiveMigrationTest-480152442</nova:project>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       </nova:owner>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <nova:root type="image" uuid="85968a96-5a0e-43a4-9c04-3954f640a7ed"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <nova:ports>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <nova:port uuid="62b8533f-b250-4475-80c2-28c4543536b5">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         </nova:port>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       </nova:ports>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </nova:instance>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   </metadata>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <sysinfo type="smbios">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <system>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <entry name="manufacturer">RDO</entry>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <entry name="product">OpenStack Compute</entry>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <entry name="serial">c06e2ffc-a8af-41b6-ab88-680ef1f6fe50</entry>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <entry name="uuid">c06e2ffc-a8af-41b6-ab88-680ef1f6fe50</entry>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <entry name="family">Virtual Machine</entry>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </system>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   </sysinfo>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <os>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <boot dev="hd"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <smbios mode="sysinfo"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   </os>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <features>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <acpi/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <apic/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <vmcoreinfo/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   </features>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <clock offset="utc">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <timer name="hpet" present="no"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   </clock>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <cpu mode="host-model" match="exact">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   </cpu>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   <devices>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <disk type="network" device="disk">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <driver type="raw" cache="none"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <source protocol="rbd" name="vms/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       </source>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <auth username="openstack">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       </auth>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <target dev="vda" bus="virtio"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <disk type="network" device="cdrom">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <driver type="raw" cache="none"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <source protocol="rbd" name="vms/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       </source>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <auth username="openstack">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       </auth>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <target dev="sda" bus="sata"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <interface type="ethernet">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <mac address="fa:16:3e:58:68:3c"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <model type="virtio"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <mtu size="1442"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <target dev="tap62b8533f-b2"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </interface>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <serial type="pty">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <log file="/var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/console.log" append="off"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </serial>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <video>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <model type="virtio"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </video>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <input type="tablet" bus="usb"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <rng model="virtio">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <backend model="random">/dev/urandom</backend>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </rng>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <controller type="usb" index="0"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     <memballoon model="virtio">
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:       <stats period="10"/>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:     </memballoon>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:   </devices>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: </domain>
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.729 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Preparing to wait for external event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.729 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.729 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.730 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.731 279685 DEBUG nova.virt.libvirt.vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-
member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T10:01:20Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.731 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.732 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.732 279685 DEBUG os_vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.733 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.734 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.738 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.738 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62b8533f-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.739 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap62b8533f-b2, col_values=(('external_ids', {'iface-id': '62b8533f-b250-4475-80c2-28c4543536b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:68:3c', 'vm-uuid': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.742 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.751 279685 INFO os_vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2')
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.809 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.809 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.810 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] No VIF found with MAC fa:16:3e:58:68:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.811 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Using config drive
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.842 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: pgmap v107: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 56 op/s
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/85736152' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2373481162' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:24.988 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.143 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating config drive at /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.150 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps02jy7nw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.167 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.278 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps02jy7nw" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.318 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.323 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:25 np0005538513.localdomain dnsmasq[307779]: exiting on receipt of SIGTERM
Nov 28 10:01:25 np0005538513.localdomain podman[309833]: 2025-11-28 10:01:25.387078201 +0000 UTC m=+0.067862513 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:01:25 np0005538513.localdomain systemd[1]: libpod-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b.scope: Deactivated successfully.
Nov 28 10:01:25 np0005538513.localdomain podman[309867]: 2025-11-28 10:01:25.46762315 +0000 UTC m=+0.054472291 container died c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:01:25 np0005538513.localdomain systemd[1]: tmp-crun.EAJ36a.mount: Deactivated successfully.
Nov 28 10:01:25 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:25 np0005538513.localdomain podman[309867]: 2025-11-28 10:01:25.539149864 +0000 UTC m=+0.125998965 container remove c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.569 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.570 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deleting local config drive /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config because it was imported into RBD.
Nov 28 10:01:25 np0005538513.localdomain systemd[1]: libpod-conmon-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b.scope: Deactivated successfully.
Nov 28 10:01:25 np0005538513.localdomain kernel: device tap62b8533f-b2 entered promiscuous mode
Nov 28 10:01:25 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324085.6169] manager: (tap62b8533f-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Nov 28 10:01:25 np0005538513.localdomain systemd-udevd[309902]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.651 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.656 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00124|binding|INFO|Claiming lport 62b8533f-b250-4475-80c2-28c4543536b5 for this chassis.
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00125|binding|INFO|62b8533f-b250-4475-80c2-28c4543536b5: Claiming fa:16:3e:58:68:3c 10.100.0.12
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00126|binding|INFO|Claiming lport fc82099a-3702-4952-add7-ba3d39b895a0 for this chassis.
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00127|binding|INFO|fc82099a-3702-4952-add7-ba3d39b895a0: Claiming fa:16:3e:41:3c:a8 19.80.0.139
Nov 28 10:01:25 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324085.6661] device (tap62b8533f-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 10:01:25 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324085.6667] device (tap62b8533f-b2): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.667 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:3c:a8 19.80.0.139'], port_security=['fa:16:3e:41:3c:a8 19.80.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['62b8533f-b250-4475-80c2-28c4543536b5'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-957922340', 'neutron:cidrs': '19.80.0.139/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-957922340', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=96ffb618-d617-4e8c-a498-acb365ae5313, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fc82099a-3702-4952-add7-ba3d39b895a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:25 np0005538513.localdomain systemd-machined[83422]: New machine qemu-4-instance-00000008.
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.671 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:68:3c 10.100.0.12'], port_security=['fa:16:3e:58:68:3c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-191355626', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-191355626', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b393f93f-1891-43a2-aa26-a4cab2642f74, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=62b8533f-b250-4475-80c2-28c4543536b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.673 158130 INFO neutron.agent.ovn.metadata.agent [-] Port fc82099a-3702-4952-add7-ba3d39b895a0 in datapath 492ef1de-4a68-49e4-b736-13cdb2eb7b59 bound to our chassis
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.677 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 492ef1de-4a68-49e4-b736-13cdb2eb7b59
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00128|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.689 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a55569bd-8ec3-44d1-9c08-5ddf44691ecc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.690 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap492ef1de-41 in ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 10:01:25 np0005538513.localdomain systemd[1]: Started Virtual Machine qemu-4-instance-00000008.
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.693 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap492ef1de-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.693 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6e6fcf51-879f-4a4f-bfbd-e5b2505def49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.695 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[34acae76-551f-4402-ab63-28680f008912]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.706 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[58d6bd2e-05c8-49da-bd79-db1ada65b8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.717 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.730 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5abf38d3-fe9e-4350-bf47-a8dd78cf660e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00129|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 ovn-installed in OVS
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00130|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 up in Southbound
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00131|binding|INFO|Setting lport fc82099a-3702-4952-add7-ba3d39b895a0 up in Southbound
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.759 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[d53b3cc8-2927-4eff-89f4-9075c669e3e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.770 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c63d471d-6034-4550-8f41-a05f0137f5b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324085.7717] manager: (tap492ef1de-40): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.796 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[7212105e-b42d-4553-ba8d-27fd77df2bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.804 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[3006f1a9-2778-48dc-8635-dc042172dfe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap492ef1de-41: link becomes ready
Nov 28 10:01:25 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap492ef1de-40: link becomes ready
Nov 28 10:01:25 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324085.8307] device (tap492ef1de-40): carrier: link connected
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.842 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4388d8-6cea-43d9-bc16-e80f1d2623dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.860 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7de348-9fca-4c9d-ac16-d718d97cd64c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap492ef1de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e3:7c:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1191993, 'reachable_time': 15378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309959, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.874 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1360fb-98a0-465d-84c7-782e4f66e91c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:7c76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1191993, 'tstamp': 1191993}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309967, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:25.890 261084 INFO neutron.agent.dhcp.agent [None req-63ded401-cc9b-44ea-988e-ce6bf04017c3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:01:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:25.891 261084 INFO neutron.agent.dhcp.agent [None req-63ded401-cc9b-44ea-988e-ce6bf04017c3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.889 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[62bb0251-596a-41d5-ac74-7a84ad98a81d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap492ef1de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e3:7c:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1191993, 'reachable_time': 15378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309976, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.918 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fc766d5c-f077-4c92-810f-5a7f0f9b2105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.974 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a4529775-7859-48d2-b846-7a4f73caa5a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.976 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap492ef1de-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.976 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.977 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap492ef1de-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain kernel: device tap492ef1de-40 entered promiscuous mode
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.982 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap492ef1de-40, col_values=(('external_ids', {'iface-id': '6838a8cb-20d7-44c7-aad3-e7f442484bd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.983 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:25Z|00132|binding|INFO|Releasing lport 6838a8cb-20d7-44c7-aad3-e7f442484bd5 from this chassis (sb_readonly=0)
Nov 28 10:01:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:25.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.995 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.997 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f68c6aa4-ecbf-4434-b2cd-7f533711339b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.998 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: global
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     log         /dev/log local0 debug
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     log-tag     haproxy-metadata-proxy-492ef1de-4a68-49e4-b736-13cdb2eb7b59
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     user        root
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     group       root
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     maxconn     1024
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     pidfile     /var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     daemon
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: defaults
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     log global
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     mode http
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     option httplog
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     option dontlognull
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     option http-server-close
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     option forwardfor
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     retries                 3
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-request    30s
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout connect         30s
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout client          32s
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout server          32s
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-keep-alive 30s
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: listen listener
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     bind 169.254.169.254:80
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:     http-request add-header X-OVN-Network-ID 492ef1de-4a68-49e4-b736-13cdb2eb7b59
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 10:01:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:25.999 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'env', 'PROCESS_TAG=haproxy-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/492ef1de-4a68-49e4-b736-13cdb2eb7b59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.052 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324086.0514994, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.052 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Started (Lifecycle Event)
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.079 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.087 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324086.0516672, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.087 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Paused (Lifecycle Event)
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.109 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.113 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.143 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:01:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-776ee30c8f4b1d3b8a5504203661447b2ee50a3f6f2aafa660ced48066eed543-merged.mount: Deactivated successfully.
Nov 28 10:01:26 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d0303a35a\x2daae2\x2d4e58\x2db0e5\x2d9091112c9857.mount: Deactivated successfully.
Nov 28 10:01:26 np0005538513.localdomain podman[310018]: 
Nov 28 10:01:26 np0005538513.localdomain podman[310018]: 2025-11-28 10:01:26.434968655 +0000 UTC m=+0.095482339 container create ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:26 np0005538513.localdomain systemd[1]: Started libpod-conmon-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84.scope.
Nov 28 10:01:26 np0005538513.localdomain podman[310018]: 2025-11-28 10:01:26.386059385 +0000 UTC m=+0.046573119 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 10:01:26 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:26.489 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:01:26 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:26 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c783556db9b2f3943c08dc4e97d79185a998a122f3fd4b4982860e000e73c01f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:26 np0005538513.localdomain podman[310018]: 2025-11-28 10:01:26.51245729 +0000 UTC m=+0.172970984 container init ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:01:26 np0005538513.localdomain podman[310018]: 2025-11-28 10:01:26.521639963 +0000 UTC m=+0.182153647 container start ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:26 np0005538513.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE]   (310036) : New worker (310038) forked
Nov 28 10:01:26 np0005538513.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE]   (310036) : Loading success.
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.583 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 62b8533f-b250-4475-80c2-28c4543536b5 in datapath ad2d8cf7-987d-4804-acbd-9b3e248dc8cd unbound from our chassis
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.587 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2d8cf7-987d-4804-acbd-9b3e248dc8cd
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.598 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5314b3d9-4408-4641-84ea-c6114d5a98ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.599 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2d8cf7-91 in ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.601 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2d8cf7-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.601 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb9f30f-0ba7-47da-9d3a-ebf9dc870d6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.603 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f0dd62a3-5ace-4738-bc52-01b683d436e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.613 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[b7301a2a-b922-48b8-9bde-7d3fa86df545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.626 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d0126d58-74ef-4a76-8b5e-db6d73769f70]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.654 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0e96ed-55c7-43af-98ba-65ddf49055d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain systemd-udevd[309929]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.661 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5274ff-f0c8-4d02-bf88-327c42682adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324086.6629] manager: (tapad2d8cf7-90): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.700 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[c4eb4c5f-24c8-4e60-9961-053f92378d47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.704 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6b197a-a339-410a-9db8-40ddde0cdf59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324086.7296] device (tapad2d8cf7-90): carrier: link connected
Nov 28 10:01:26 np0005538513.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapad2d8cf7-90: link becomes ready
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.736 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc712f1-e71e-4cc1-9b23-6fffc67a1ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.753 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5122f276-b575-4e2d-9aca-47d455547d90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2d8cf7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:14:78:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192083, 'reachable_time': 38285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310082, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.772 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[17a3e11f-1b62-40d0-8554-b959b5f2a6be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:785b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1192083, 'tstamp': 1192083}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 310083, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain dnsmasq[309233]: exiting on receipt of SIGTERM
Nov 28 10:01:26 np0005538513.localdomain podman[310069]: 2025-11-28 10:01:26.772780383 +0000 UTC m=+0.064143258 container kill 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:26 np0005538513.localdomain systemd[1]: libpod-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242.scope: Deactivated successfully.
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.788 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3179db9c-f424-442f-b747-2576f9bb2e3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2d8cf7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:14:78:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192083, 'reachable_time': 38285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 310085, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.817 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f41d555c-d38e-4683-8b71-d1beb6b8c1f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain podman[310086]: 2025-11-28 10:01:26.847862216 +0000 UTC m=+0.059493625 container died 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.874 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a458e751-30ce-4f1d-b41b-05ef7b2f8a8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.876 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2d8cf7-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.876 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.877 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2d8cf7-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:26 np0005538513.localdomain kernel: device tapad2d8cf7-90 entered promiscuous mode
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.884 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2d8cf7-90, col_values=(('external_ids', {'iface-id': 'acd4bbc3-c7c4-47d8-b58b-29abee48b714'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.884 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:26Z|00133|binding|INFO|Releasing lport acd4bbc3-c7c4-47d8-b58b-29abee48b714 from this chassis (sb_readonly=0)
Nov 28 10:01:26 np0005538513.localdomain podman[310086]: 2025-11-28 10:01:26.891620438 +0000 UTC m=+0.103251807 container cleanup 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:01:26 np0005538513.localdomain systemd[1]: libpod-conmon-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242.scope: Deactivated successfully.
Nov 28 10:01:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:26.897 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.900 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.901 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e7295a9a-cc9c-405e-bf3b-44dc69139025]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.904 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: global
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     log         /dev/log local0 debug
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     log-tag     haproxy-metadata-proxy-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     user        root
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     group       root
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     maxconn     1024
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     pidfile     /var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     daemon
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: defaults
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     log global
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     mode http
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     option httplog
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     option dontlognull
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     option http-server-close
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     option forwardfor
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     retries                 3
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-request    30s
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout connect         30s
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout client          32s
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout server          32s
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     timeout http-keep-alive 30s
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: listen listener
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     bind 169.254.169.254:80
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:     http-request add-header X-OVN-Network-ID ad2d8cf7-987d-4804-acbd-9b3e248dc8cd
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 10:01:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:26.905 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'env', 'PROCESS_TAG=haproxy-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 10:01:26 np0005538513.localdomain ceph-mon[292954]: pgmap v108: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 56 op/s
Nov 28 10:01:26 np0005538513.localdomain podman[310088]: 2025-11-28 10:01:26.93666853 +0000 UTC m=+0.139666414 container remove 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:01:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:27.157 261084 INFO neutron.agent.dhcp.agent [None req-01e144de-e5a2-4ccb-86d0-225b19311ea5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:01:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:27.159 261084 INFO neutron.agent.dhcp.agent [None req-01e144de-e5a2-4ccb-86d0-225b19311ea5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:01:27 np0005538513.localdomain podman[310146]: 
Nov 28 10:01:27 np0005538513.localdomain podman[310146]: 2025-11-28 10:01:27.352445939 +0000 UTC m=+0.094525190 container create 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:01:27 np0005538513.localdomain systemd[1]: Started libpod-conmon-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a.scope.
Nov 28 10:01:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1d5f4174ee8c8f2aa1c8d3fbcaed9ef7f054f6f66b168db2dbfe4a5c6fb91850-merged.mount: Deactivated successfully.
Nov 28 10:01:27 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:27 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2db42f9a09\x2df299\x2d4469\x2d8a2a\x2db6b8c70a7aed.mount: Deactivated successfully.
Nov 28 10:01:27 np0005538513.localdomain podman[310146]: 2025-11-28 10:01:27.308352457 +0000 UTC m=+0.050431748 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 10:01:27 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:27 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa67beacec21ebd5b294a9ebc5d587ffcfb9867f9ba8eaa2c201ba3d1c08b89b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:27 np0005538513.localdomain podman[310146]: 2025-11-28 10:01:27.425171399 +0000 UTC m=+0.167250610 container init 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:01:27 np0005538513.localdomain podman[310146]: 2025-11-28 10:01:27.43434656 +0000 UTC m=+0.176425811 container start 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:01:27 np0005538513.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE]   (310165) : New worker (310167) forked
Nov 28 10:01:27 np0005538513.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE]   (310165) : Loading success.
Nov 28 10:01:28 np0005538513.localdomain ceph-mon[292954]: pgmap v109: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.9 MiB/s wr, 154 op/s
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.054 279685 DEBUG nova.compute.manager [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.055 279685 DEBUG oslo_concurrency.lockutils [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.056 279685 DEBUG oslo_concurrency.lockutils [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.056 279685 DEBUG oslo_concurrency.lockutils [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.057 279685 DEBUG nova.compute.manager [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Processing event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.059 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.063 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324089.063349, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.064 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Resumed (Lifecycle Event)
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.068 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.073 279685 INFO nova.virt.libvirt.driver [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance spawned successfully.
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.073 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.390 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.395 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.466 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.467 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.468 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.468 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.469 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.470 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.529 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.584 279685 INFO nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 8.59 seconds to spawn the instance on the hypervisor.
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.585 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.718 279685 INFO nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 9.61 seconds to build instance.
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.788 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.809 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.859230) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089859320, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1068, "num_deletes": 251, "total_data_size": 891118, "memory_usage": 909088, "flush_reason": "Manual Compaction"}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089867528, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 599688, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23666, "largest_seqno": 24733, "table_properties": {"data_size": 595874, "index_size": 1477, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10741, "raw_average_key_size": 21, "raw_value_size": 587279, "raw_average_value_size": 1149, "num_data_blocks": 66, "num_entries": 511, "num_filter_entries": 511, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324014, "oldest_key_time": 1764324014, "file_creation_time": 1764324089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 8379 microseconds, and 3319 cpu microseconds.
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.867609) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 599688 bytes OK
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.867647) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.869743) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.869766) EVENT_LOG_v1 {"time_micros": 1764324089869759, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.869790) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 886127, prev total WAL file size 886451, number of live WAL files 2.
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.870709) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373630' seq:72057594037927935, type:22 .. '6D6772737461740034303132' seq:0, type:0; will stop at (end)
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(585KB)], [39(18MB)]
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089870762, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 20054849, "oldest_snapshot_seqno": -1}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12083 keys, 18080755 bytes, temperature: kUnknown
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089971265, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 18080755, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18012837, "index_size": 36649, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30213, "raw_key_size": 324104, "raw_average_key_size": 26, "raw_value_size": 17808190, "raw_average_value_size": 1473, "num_data_blocks": 1394, "num_entries": 12083, "num_filter_entries": 12083, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.973429) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 18080755 bytes
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.975165) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 199.3 rd, 179.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 18.6 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(63.6) write-amplify(30.2) OK, records in: 12575, records dropped: 492 output_compression: NoCompression
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.975186) EVENT_LOG_v1 {"time_micros": 1764324089975178, "job": 22, "event": "compaction_finished", "compaction_time_micros": 100604, "compaction_time_cpu_micros": 32451, "output_level": 6, "num_output_files": 1, "total_output_size": 18080755, "num_input_records": 12575, "num_output_records": 12083, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089975338, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089976866, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.870608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:29 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:29.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:31 np0005538513.localdomain ceph-mon[292954]: pgmap v110: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 3.9 MiB/s wr, 105 op/s
Nov 28 10:01:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:31.316 279685 DEBUG nova.compute.manager [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:31.317 279685 DEBUG oslo_concurrency.lockutils [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:31.317 279685 DEBUG oslo_concurrency.lockutils [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:31.318 279685 DEBUG oslo_concurrency.lockutils [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:31.318 279685 DEBUG nova.compute.manager [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:31.319 279685 WARNING nova.compute.manager [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state None.
Nov 28 10:01:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:31.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:33 np0005538513.localdomain ceph-mon[292954]: pgmap v111: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.9 MiB/s wr, 158 op/s
Nov 28 10:01:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:33.825 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Check if temp file /var/lib/nova/instances/tmpx5ac6ig2 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 10:01:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:33.826 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 10:01:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:34.822 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:01:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:01:34 np0005538513.localdomain podman[310177]: 2025-11-28 10:01:34.935299881 +0000 UTC m=+0.067633825 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:01:34 np0005538513.localdomain podman[310177]: 2025-11-28 10:01:34.944977097 +0000 UTC m=+0.077311041 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 10:01:34 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:01:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:34.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:35 np0005538513.localdomain podman[310176]: 2025-11-28 10:01:35.027184009 +0000 UTC m=+0.154721235 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:01:35 np0005538513.localdomain podman[310176]: 2025-11-28 10:01:35.063319687 +0000 UTC m=+0.190856933 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:01:35 np0005538513.localdomain ceph-mon[292954]: pgmap v112: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.9 MiB/s wr, 151 op/s
Nov 28 10:01:35 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:01:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:35.784 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:35.785 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:35.785 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:01:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:36.959 279685 DEBUG nova.compute.manager [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:36.959 279685 DEBUG oslo_concurrency.lockutils [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:36.959 279685 DEBUG oslo_concurrency.lockutils [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:36.960 279685 DEBUG oslo_concurrency.lockutils [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:36.960 279685 DEBUG nova.compute.manager [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:36.960 279685 DEBUG nova.compute.manager [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 10:01:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 do_prune osdmap full prune enabled
Nov 28 10:01:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e98 e98: 6 total, 6 up, 6 in
Nov 28 10:01:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e98: 6 total, 6 up, 6 in
Nov 28 10:01:37 np0005538513.localdomain ceph-mon[292954]: pgmap v113: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.9 MiB/s wr, 151 op/s
Nov 28 10:01:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:37.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:38 np0005538513.localdomain ceph-mon[292954]: osdmap e98: 6 total, 6 up, 6 in
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.602 279685 INFO nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 4.21 seconds for pre_live_migration on destination host np0005538515.localdomain.
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.603 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.634 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(62fb7f70-bf44-4fcf-8c08-e096ee66cd99),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.638 279685 DEBUG nova.objects.instance [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lazy-loading 'migration_context' on Instance uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.640 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.642 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.642 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.655 279685 DEBUG nova.virt.libvirt.vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T10:01:29Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T10:01:29Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.656 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.657 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.658 279685 DEBUG nova.virt.libvirt.migration [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]:   <mac address="fa:16:3e:58:68:3c"/>
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]:   <model type="virtio"/>
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]:   <mtu size="1442"/>
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]:   <target dev="tap62b8533f-b2"/>
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: </interface>
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 10:01:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:38.660 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.146 279685 DEBUG nova.virt.libvirt.migration [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.148 279685 INFO nova.virt.libvirt.migration [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 10:01:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e98 do_prune osdmap full prune enabled
Nov 28 10:01:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e99 e99: 6 total, 6 up, 6 in
Nov 28 10:01:39 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e99: 6 total, 6 up, 6 in
Nov 28 10:01:39 np0005538513.localdomain ceph-mon[292954]: pgmap v115: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 32 KiB/s wr, 87 op/s
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.859 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.859 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.859 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.860 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.860 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.860 279685 WARNING nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-changed-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing instance network info cache due to event network-changed-62b8533f-b250-4475-80c2-28c4543536b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.862 279685 DEBUG nova.network.neutron [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 10:01:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.863 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:39.998 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:01:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:01:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:01:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158062 "" "Go-http-client/1.1"
Nov 28 10:01:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:01:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20212 "" "Go-http-client/1.1"
Nov 28 10:01:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e99 do_prune osdmap full prune enabled
Nov 28 10:01:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e100 e100: 6 total, 6 up, 6 in
Nov 28 10:01:40 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e100: 6 total, 6 up, 6 in
Nov 28 10:01:40 np0005538513.localdomain ceph-mon[292954]: osdmap e99: 6 total, 6 up, 6 in
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.280 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324100.2802534, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.281 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Paused (Lifecycle Event)
Nov 28 10:01:40 np0005538513.localdomain kernel: device tap62b8533f-b2 left promiscuous mode
Nov 28 10:01:40 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324100.4345] device (tap62b8533f-b2): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.451 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:40Z|00134|binding|INFO|Releasing lport 62b8533f-b250-4475-80c2-28c4543536b5 from this chassis (sb_readonly=0)
Nov 28 10:01:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:40Z|00135|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 down in Southbound
Nov 28 10:01:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:40Z|00136|binding|INFO|Releasing lport fc82099a-3702-4952-add7-ba3d39b895a0 from this chassis (sb_readonly=0)
Nov 28 10:01:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:40Z|00137|binding|INFO|Setting lport fc82099a-3702-4952-add7-ba3d39b895a0 down in Southbound
Nov 28 10:01:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:40Z|00138|binding|INFO|Removing iface tap62b8533f-b2 ovn-installed in OVS
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.456 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.469 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:40 np0005538513.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 28 10:01:40 np0005538513.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 11.866s CPU time.
Nov 28 10:01:40 np0005538513.localdomain systemd-machined[83422]: Machine qemu-4-instance-00000008 terminated.
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.540 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:40 np0005538513.localdomain virtqemud[201490]: cannot parse process status data
Nov 28 10:01:40 np0005538513.localdomain virtqemud[201490]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk: No such file or directory
Nov 28 10:01:40 np0005538513.localdomain virtqemud[201490]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk: No such file or directory
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.630 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.631 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 10:01:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:40.631 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 10:01:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:41Z|00139|binding|INFO|Releasing lport 6838a8cb-20d7-44c7-aad3-e7f442484bd5 from this chassis (sb_readonly=0)
Nov 28 10:01:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:41Z|00140|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:01:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:41Z|00141|binding|INFO|Releasing lport acd4bbc3-c7c4-47d8-b58b-29abee48b714 from this chassis (sb_readonly=0)
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.104 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:3c:a8 19.80.0.139'], port_security=['fa:16:3e:41:3c:a8 19.80.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['62b8533f-b250-4475-80c2-28c4543536b5'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-957922340', 'neutron:cidrs': '19.80.0.139/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-957922340', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=96ffb618-d617-4e8c-a498-acb365ae5313, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fc82099a-3702-4952-add7-ba3d39b895a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.107 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:68:3c 10.100.0.12'], port_security=['fa:16:3e:58:68:3c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain,np0005538515.localdomain', 'activation-strategy': 'rarp', 'additional-chassis-activated': '62c03cad-89c1-4fd7-973b-8f2a608c71f1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-191355626', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-191355626', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b393f93f-1891-43a2-aa26-a4cab2642f74, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=62b8533f-b250-4475-80c2-28c4543536b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.109 158130 INFO neutron.agent.ovn.metadata.agent [-] Port fc82099a-3702-4952-add7-ba3d39b895a0 in datapath 492ef1de-4a68-49e4-b736-13cdb2eb7b59 unbound from our chassis
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.112 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 492ef1de-4a68-49e4-b736-13cdb2eb7b59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.113 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[cd0a85ab-b0bc-4f89-a58d-bf314de847c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.114 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 namespace which is not needed anymore
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.151 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.170 279685 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 10:01:41 np0005538513.localdomain ceph-mon[292954]: pgmap v117: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 518 KiB/s rd, 23 KiB/s wr, 28 op/s
Nov 28 10:01:41 np0005538513.localdomain ceph-mon[292954]: osdmap e100: 6 total, 6 up, 6 in
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE]   (310036) : haproxy version is 2.8.14-c23fe91
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE]   (310036) : path to executable is /usr/sbin/haproxy
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [WARNING]  (310036) : Exiting Master process...
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [ALERT]    (310036) : Current worker (310038) exited with code 143 (Terminated)
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [WARNING]  (310036) : All workers exited. Exiting... (0)
Nov 28 10:01:41 np0005538513.localdomain systemd[1]: libpod-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84.scope: Deactivated successfully.
Nov 28 10:01:41 np0005538513.localdomain podman[310256]: 2025-11-28 10:01:41.328864742 +0000 UTC m=+0.080643514 container died ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:01:41 np0005538513.localdomain podman[310256]: 2025-11-28 10:01:41.37447547 +0000 UTC m=+0.126254242 container cleanup ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:01:41 np0005538513.localdomain podman[310270]: 2025-11-28 10:01:41.406516833 +0000 UTC m=+0.066904423 container cleanup ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:01:41 np0005538513.localdomain systemd[1]: libpod-conmon-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84.scope: Deactivated successfully.
Nov 28 10:01:41 np0005538513.localdomain podman[310284]: 2025-11-28 10:01:41.476717086 +0000 UTC m=+0.079337784 container remove ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.481 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a905a408-3a39-42c9-a5e1-144d9c989d4c]: (4, ('Fri Nov 28 10:01:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 (ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84)\nea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84\nFri Nov 28 10:01:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 (ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84)\nea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.484 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6a54f5bd-7940-4219-93e2-18f9d8429bcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.486 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap492ef1de-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:41 np0005538513.localdomain kernel: device tap492ef1de-40 left promiscuous mode
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.505 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.509 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8a54b3-b31c-4d50-8454-03e6644c6bd9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.521 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[07f4f61b-6ebf-4cae-ba75-1ddbc8ba7733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.523 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e48753c8-cf17-4a1c-8849-ccac5c016f00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.536 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3b20980e-dfd4-4a66-9f6c-d5d90903c704]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1191985, 'reachable_time': 34462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310307, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.538 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.538 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[e143b6e3-229f-4c90-b936-91f519f1f2ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.540 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 62b8533f-b250-4475-80c2-28c4543536b5 in datapath ad2d8cf7-987d-4804-acbd-9b3e248dc8cd unbound from our chassis
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.542 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.543 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[86ae7403-81ed-438d-bfea-edd54df14eac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.544 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd namespace which is not needed anymore
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.674 279685 DEBUG nova.virt.libvirt.guest [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.675 279685 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration operation has completed
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.677 279685 INFO nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] _post_live_migration() is started..
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE]   (310165) : haproxy version is 2.8.14-c23fe91
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE]   (310165) : path to executable is /usr/sbin/haproxy
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [WARNING]  (310165) : Exiting Master process...
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [ALERT]    (310165) : Current worker (310167) exited with code 143 (Terminated)
Nov 28 10:01:41 np0005538513.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [WARNING]  (310165) : All workers exited. Exiting... (0)
Nov 28 10:01:41 np0005538513.localdomain systemd[1]: libpod-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a.scope: Deactivated successfully.
Nov 28 10:01:41 np0005538513.localdomain podman[310325]: 2025-11-28 10:01:41.735760359 +0000 UTC m=+0.078354084 container died 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:41 np0005538513.localdomain podman[310325]: 2025-11-28 10:01:41.777737326 +0000 UTC m=+0.120331021 container cleanup 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:01:41 np0005538513.localdomain podman[310339]: 2025-11-28 10:01:41.80817986 +0000 UTC m=+0.069880564 container cleanup 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:41 np0005538513.localdomain systemd[1]: libpod-conmon-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a.scope: Deactivated successfully.
Nov 28 10:01:41 np0005538513.localdomain podman[310354]: 2025-11-28 10:01:41.871552594 +0000 UTC m=+0.073673550 container remove 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.876 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[23c33e84-7605-4370-a13d-acbce6066084]: (4, ('Fri Nov 28 10:01:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd (96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a)\n96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a\nFri Nov 28 10:01:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd (96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a)\n96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.878 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6fd427-ac2d-4505-93c3-b0b66b4fa54b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.879 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2d8cf7-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.882 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:41 np0005538513.localdomain kernel: device tapad2d8cf7-90 left promiscuous mode
Nov 28 10:01:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:41.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.902 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[98e8c24c-16e6-494c-a02e-27763f191546]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.915 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[61c3b1fc-8513-4496-994e-f53512de474a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.916 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b33ac3f5-e491-4973-aa11-eaf6cbf8dc85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.932 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[262ba370-bf46-4d53-b903-ec6f96cbb1a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192075, 'reachable_time': 43246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310377, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.935 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 10:01:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:41.935 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[55f54df8-7e95-432b-babd-ce20cd6749ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fa67beacec21ebd5b294a9ebc5d587ffcfb9867f9ba8eaa2c201ba3d1c08b89b-merged.mount: Deactivated successfully.
Nov 28 10:01:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:42 np0005538513.localdomain systemd[1]: run-netns-ovnmeta\x2dad2d8cf7\x2d987d\x2d4804\x2dacbd\x2d9b3e248dc8cd.mount: Deactivated successfully.
Nov 28 10:01:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c783556db9b2f3943c08dc4e97d79185a998a122f3fd4b4982860e000e73c01f-merged.mount: Deactivated successfully.
Nov 28 10:01:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:42 np0005538513.localdomain systemd[1]: run-netns-ovnmeta\x2d492ef1de\x2d4a68\x2d49e4\x2db736\x2d13cdb2eb7b59.mount: Deactivated successfully.
Nov 28 10:01:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:42.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:42.881 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:42.882 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:42.882 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:42.883 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:01:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:42.883 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:43.072 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.072 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:43.074 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:01:43 np0005538513.localdomain ceph-mon[292954]: pgmap v119: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.5 MiB/s rd, 7.8 MiB/s wr, 210 op/s
Nov 28 10:01:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4067164618' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.348 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.431 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.431 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.667 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.670 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11236MB free_disk=41.63758850097656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.670 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.671 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.760 279685 INFO nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating resource usage from migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.799 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1152MB phys_disk=41GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:01:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:43.862 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.129 279685 DEBUG nova.compute.manager [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.130 279685 DEBUG oslo_concurrency.lockutils [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.131 279685 DEBUG oslo_concurrency.lockutils [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.132 279685 DEBUG oslo_concurrency.lockutils [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.132 279685 DEBUG nova.compute.manager [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.133 279685 DEBUG nova.compute.manager [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.248 279685 DEBUG nova.network.neutron [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updated VIF entry in instance network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.249 279685 DEBUG nova.network.neutron [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "np0005538515.localdomain"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4067164618' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2900998411' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.274 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Releasing lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:44 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:44Z|00142|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1891951526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.342 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.380 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.382 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.419 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.507 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.508 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.822 279685 DEBUG nova.network.neutron [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Activated binding for port 62b8533f-b250-4475-80c2-28c4543536b5 and host np0005538515.localdomain migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.823 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.824 279685 DEBUG nova.virt.libvirt.vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T10:01:29Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T10:01:33Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.824 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.830 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.830 279685 DEBUG os_vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 10:01:44 np0005538513.localdomain podman[310423]: 2025-11-28 10:01:44.831707718 +0000 UTC m=+0.071083621 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.833 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.834 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62b8533f-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.843 279685 INFO os_vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2')
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.843 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.843 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.843 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.844 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.844 279685 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deleting instance files /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_del
Nov 28 10:01:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:44.844 279685 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deletion of /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_del complete
Nov 28 10:01:44 np0005538513.localdomain podman[310423]: 2025-11-28 10:01:44.849566496 +0000 UTC m=+0.088942399 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e100 do_prune osdmap full prune enabled
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e101 e101: 6 total, 6 up, 6 in
Nov 28 10:01:44 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:01:44 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e101: 6 total, 6 up, 6 in
Nov 28 10:01:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:45.004 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:45.024 279685 DEBUG nova.compute.manager [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:45.025 279685 DEBUG oslo_concurrency.lockutils [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:45.026 279685 DEBUG oslo_concurrency.lockutils [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:45.026 279685 DEBUG oslo_concurrency.lockutils [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:45.027 279685 DEBUG nova.compute.manager [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:45.027 279685 WARNING nova.compute.manager [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.
Nov 28 10:01:45 np0005538513.localdomain ceph-mon[292954]: pgmap v120: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.2 MiB/s rd, 7.1 MiB/s wr, 157 op/s
Nov 28 10:01:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1891951526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/701571942' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:45 np0005538513.localdomain ceph-mon[292954]: osdmap e101: 6 total, 6 up, 6 in
Nov 28 10:01:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1054391208' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/947106529' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:46.509 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:46.510 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:01:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:46.510 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:01:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:46.593 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:46.594 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:46.595 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:01:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:46.596 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.075 279685 DEBUG nova.compute.manager [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.076 279685 DEBUG oslo_concurrency.lockutils [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.076 279685 DEBUG oslo_concurrency.lockutils [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.077 279685 DEBUG oslo_concurrency.lockutils [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.077 279685 DEBUG nova.compute.manager [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.078 279685 WARNING nova.compute.manager [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.200 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.224 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:47.224 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:01:47 np0005538513.localdomain ceph-mon[292954]: pgmap v122: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.3 MiB/s rd, 7.2 MiB/s wr, 159 op/s
Nov 28 10:01:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2242950946' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3836043779' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:01:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:01:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:01:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:01:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:01:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:01:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1206230784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:48.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:01:48 np0005538513.localdomain podman[310444]: 2025-11-28 10:01:48.845612646 +0000 UTC m=+0.084692068 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:01:48 np0005538513.localdomain podman[310444]: 2025-11-28 10:01:48.855056706 +0000 UTC m=+0.094136128 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:01:48 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:01:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:49.211 279685 DEBUG nova.compute.manager [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 10:01:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:49.212 279685 DEBUG oslo_concurrency.lockutils [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:49.212 279685 DEBUG oslo_concurrency.lockutils [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:49.213 279685 DEBUG oslo_concurrency.lockutils [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:49.213 279685 DEBUG nova.compute.manager [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 10:01:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:49.213 279685 WARNING nova.compute.manager [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.
Nov 28 10:01:49 np0005538513.localdomain ceph-mon[292954]: pgmap v123: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 6.4 MiB/s rd, 9.0 MiB/s wr, 264 op/s
Nov 28 10:01:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3121132784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:49.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.005 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.482 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.520 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.521 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.521 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.540 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.541 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.542 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.542 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.543 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3559263610' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:50.997 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.052 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.053 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:51.076 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.217 279685 WARNING nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.218 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11214MB free_disk=41.700828552246094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.219 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.219 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.264 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Migration for instance c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.284 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.330 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.331 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.331 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.332 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:01:51 np0005538513.localdomain ceph-mon[292954]: pgmap v124: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 5.4 MiB/s rd, 7.6 MiB/s wr, 224 op/s
Nov 28 10:01:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3559263610' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1415036771' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.400 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:01:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:01:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1128492412' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:51 np0005538513.localdomain podman[310508]: 2025-11-28 10:01:51.857012142 +0000 UTC m=+0.091646630 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.858 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.868 279685 DEBUG nova.compute.provider_tree [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.900 279685 DEBUG nova.scheduler.client.report [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:01:51 np0005538513.localdomain podman[310509]: 2025-11-28 10:01:51.916724804 +0000 UTC m=+0.148556976 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:51 np0005538513.localdomain podman[310508]: 2025-11-28 10:01:51.925352699 +0000 UTC m=+0.159987127 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 28 10:01:51 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.946 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.946 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:51 np0005538513.localdomain podman[310509]: 2025-11-28 10:01:51.949505379 +0000 UTC m=+0.181337541 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.955 279685 INFO nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migrating instance to np0005538515.localdomain finished successfully.
Nov 28 10:01:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:51.958 261084 INFO neutron.agent.linux.ip_lib [None req-0fb38773-599a-4dc4-8224-09918295a826 - - - - - -] Device tap516917c4-99 cannot be used as it has no MAC address
Nov 28 10:01:51 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:51 np0005538513.localdomain kernel: device tap516917c4-99 entered promiscuous mode
Nov 28 10:01:51 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324111.9889] manager: (tap516917c4-99): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Nov 28 10:01:51 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:51Z|00143|binding|INFO|Claiming lport 516917c4-995e-4297-af25-c4f8499fcc7d for this chassis.
Nov 28 10:01:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:51.989 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:51 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:51Z|00144|binding|INFO|516917c4-995e-4297-af25-c4f8499fcc7d: Claiming unknown
Nov 28 10:01:51 np0005538513.localdomain systemd-udevd[310558]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:52.006 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f9b84b894e641c4bee3ebcd1409ad9f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4106ac0-e782-4268-8bb4-37fc3096f0bc, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=516917c4-995e-4297-af25-c4f8499fcc7d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:52.012 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 516917c4-995e-4297-af25-c4f8499fcc7d in datapath b1696f4c-80ce-491f-ad1c-cc7f5b6700ba bound to our chassis
Nov 28 10:01:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:52.013 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:01:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:52.013 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f81a9356-1f1a-49f6-a305-e3c46a738cc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:52Z|00145|binding|INFO|Setting lport 516917c4-995e-4297-af25-c4f8499fcc7d ovn-installed in OVS
Nov 28 10:01:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:52Z|00146|binding|INFO|Setting lport 516917c4-995e-4297-af25-c4f8499fcc7d up in Southbound
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:52.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:52.052 279685 INFO nova.scheduler.client.report [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Deleted allocation for migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99
Nov 28 10:01:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:52.052 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:52.062 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap516917c4-99: No such device
Nov 28 10:01:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:52.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2304873447' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1128492412' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:52 np0005538513.localdomain podman[310629]: 
Nov 28 10:01:52 np0005538513.localdomain podman[310629]: 2025-11-28 10:01:52.988080458 +0000 UTC m=+0.104603049 container create 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:53 np0005538513.localdomain podman[310629]: 2025-11-28 10:01:52.936929099 +0000 UTC m=+0.053451720 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:01:53 np0005538513.localdomain systemd[1]: Started libpod-conmon-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd.scope.
Nov 28 10:01:53 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:53 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e03c2285c42fffa1cd27962b49feefea5575696a1d40702567a3737442c3ea1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:53 np0005538513.localdomain podman[310629]: 2025-11-28 10:01:53.074470847 +0000 UTC m=+0.190993428 container init 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:01:53 np0005538513.localdomain podman[310629]: 2025-11-28 10:01:53.081240614 +0000 UTC m=+0.197763195 container start 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:01:53 np0005538513.localdomain dnsmasq[310647]: started, version 2.85 cachesize 150
Nov 28 10:01:53 np0005538513.localdomain dnsmasq[310647]: DNS service limited to local subnets
Nov 28 10:01:53 np0005538513.localdomain dnsmasq[310647]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:01:53 np0005538513.localdomain dnsmasq[310647]: warning: no upstream servers configured
Nov 28 10:01:53 np0005538513.localdomain dnsmasq-dhcp[310647]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:01:53 np0005538513.localdomain dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 0 addresses
Nov 28 10:01:53 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host
Nov 28 10:01:53 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 10:01:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:53.260 261084 INFO neutron.agent.dhcp.agent [None req-a0457f08-cd61-4e70-9710-d84cc62f8162 - - - - - -] DHCP configuration for ports {'d866b7da-b4ec-4c1e-9c58-e58b19fd6a55'} is completed
Nov 28 10:01:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e101 do_prune osdmap full prune enabled
Nov 28 10:01:53 np0005538513.localdomain ceph-mon[292954]: pgmap v125: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 7.2 MiB/s wr, 200 op/s
Nov 28 10:01:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 e102: 6 total, 6 up, 6 in
Nov 28 10:01:53 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e102: 6 total, 6 up, 6 in
Nov 28 10:01:54 np0005538513.localdomain ceph-mon[292954]: osdmap e102: 6 total, 6 up, 6 in
Nov 28 10:01:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:54.736 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:01:54 np0005538513.localdomain podman[310648]: 2025-11-28 10:01:54.82111265 +0000 UTC m=+0.057260827 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:01:54 np0005538513.localdomain podman[310648]: 2025-11-28 10:01:54.833428348 +0000 UTC m=+0.069576575 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:01:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:54.837 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:54 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:01:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:55.008 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:55.366 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:55 np0005538513.localdomain ceph-mon[292954]: pgmap v127: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.8 MiB/s rd, 8.2 MiB/s wr, 228 op/s
Nov 28 10:01:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:55.618 279685 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764324100.6160715, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:55.619 279685 INFO nova.compute.manager [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Stopped (Lifecycle Event)
Nov 28 10:01:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:55.643 279685 DEBUG nova.compute.manager [None req-5639471a-f41a-4d05-8dbe-5f39c4734ce6 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:56.079 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:55Z, description=, device_id=ff13b2c3-ffbb-486b-ba3a-fa0f2960342d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66a2e20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66a2af0>], id=38f26b93-3884-4247-b638-2104f92bdcaf, ip_allocation=immediate, mac_address=fa:16:3e:58:21:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:49Z, description=, dns_domain=, id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-101229426-network, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62044, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=730, status=ACTIVE, subnets=['bac26e98-4c61-47a1-b281-0a4613971f3f'], tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:50Z, vlan_transparent=None, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=False, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=767, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:55Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba
Nov 28 10:01:56 np0005538513.localdomain dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 1 addresses
Nov 28 10:01:56 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host
Nov 28 10:01:56 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 10:01:56 np0005538513.localdomain podman[310685]: 2025-11-28 10:01:56.33814094 +0000 UTC m=+0.071692350 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:01:56 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:56.412 2 INFO neutron.agent.securitygroups_rpc [req-49cc58ff-e4e8-45be-b0c0-595b2c881c34 req-2439bbec-210c-4eb9-989c-4cbc137e5d8d 76caaf04f9e5427ca10e0bb020dbffa2 6fec370fed684ed6ba04de00336f61ee - - default default] Security group rule updated ['4f7b9341-d4bb-4bbc-a8bf-917ce0b68881']
Nov 28 10:01:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:56.531 261084 INFO neutron.agent.dhcp.agent [None req-abacf628-896f-4f36-ae8c-95ee3fcf07c0 - - - - - -] DHCP configuration for ports {'38f26b93-3884-4247-b638-2104f92bdcaf'} is completed
Nov 28 10:01:56 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:56.954 2 INFO neutron.agent.securitygroups_rpc [req-d6a9207a-32bb-417d-a1a8-33a725f0d00f req-76509a4e-6eff-4420-ad31-f2903ff65806 76caaf04f9e5427ca10e0bb020dbffa2 6fec370fed684ed6ba04de00336f61ee - - default default] Security group rule updated ['4f7b9341-d4bb-4bbc-a8bf-917ce0b68881']
Nov 28 10:01:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:57.143 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:55Z, description=, device_id=ff13b2c3-ffbb-486b-ba3a-fa0f2960342d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ee0b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66531c0>], id=38f26b93-3884-4247-b638-2104f92bdcaf, ip_allocation=immediate, mac_address=fa:16:3e:58:21:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:49Z, description=, dns_domain=, id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-101229426-network, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62044, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=730, status=ACTIVE, subnets=['bac26e98-4c61-47a1-b281-0a4613971f3f'], tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:50Z, vlan_transparent=None, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=False, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=767, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:55Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba
Nov 28 10:01:57 np0005538513.localdomain dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 1 addresses
Nov 28 10:01:57 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host
Nov 28 10:01:57 np0005538513.localdomain podman[310721]: 2025-11-28 10:01:57.360302714 +0000 UTC m=+0.070829192 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:01:57 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 10:01:57 np0005538513.localdomain ceph-mon[292954]: pgmap v128: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 7.2 MiB/s wr, 200 op/s
Nov 28 10:01:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:57.593 261084 INFO neutron.agent.dhcp.agent [None req-511587f2-b9ab-455a-89d5-73247e9b2efc - - - - - -] DHCP configuration for ports {'38f26b93-3884-4247-b638-2104f92bdcaf'} is completed
Nov 28 10:01:57 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:57.738 2 INFO neutron.agent.securitygroups_rpc [None req-e0232781-0774-46d7-9ff8-6308f0f3831b 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:58 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:58.125 261084 INFO neutron.agent.linux.ip_lib [None req-11028aa0-241c-4cba-b68c-25c21bc3bb21 - - - - - -] Device tap54867331-d2 cannot be used as it has no MAC address
Nov 28 10:01:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:58.181 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:58 np0005538513.localdomain kernel: device tap54867331-d2 entered promiscuous mode
Nov 28 10:01:58 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324118.1897] manager: (tap54867331-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Nov 28 10:01:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:58Z|00147|binding|INFO|Claiming lport 54867331-d2d2-4007-8751-6825f0370005 for this chassis.
Nov 28 10:01:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:58Z|00148|binding|INFO|54867331-d2d2-4007-8751-6825f0370005: Claiming unknown
Nov 28 10:01:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:58.190 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:58 np0005538513.localdomain systemd-udevd[310751]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:58.203 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9d5d0d5dc28445f854288051977b3d1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46ecb4cc-6f9f-41cb-ba67-522f7eda61f5, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=54867331-d2d2-4007-8751-6825f0370005) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:58.205 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 54867331-d2d2-4007-8751-6825f0370005 in datapath 291dc1ac-5414-4421-8e5e-126d810812c9 bound to our chassis
Nov 28 10:01:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:58.207 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 291dc1ac-5414-4421-8e5e-126d810812c9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:01:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:01:58.208 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[68a719c5-0bf9-42f1-9756-0a6bf202d3b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:58Z|00149|binding|INFO|Setting lport 54867331-d2d2-4007-8751-6825f0370005 ovn-installed in OVS
Nov 28 10:01:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:01:58Z|00150|binding|INFO|Setting lport 54867331-d2d2-4007-8751-6825f0370005 up in Southbound
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:58.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 10:01:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:58.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:58.289 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:58 np0005538513.localdomain ceph-mon[292954]: pgmap v129: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 241 op/s
Nov 28 10:01:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 10:01:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 10:01:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 10:01:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 10:01:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 10:01:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 10:01:59 np0005538513.localdomain podman[310822]: 2025-11-28 10:01:59.139832024 +0000 UTC m=+0.089997971 container create ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:01:59 np0005538513.localdomain systemd[1]: Started libpod-conmon-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381.scope.
Nov 28 10:01:59 np0005538513.localdomain podman[310822]: 2025-11-28 10:01:59.09795526 +0000 UTC m=+0.048121257 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:01:59 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:59 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e086de59da47ef628d8f8ee81d1bcf4e96528abb7fd65b1cc4ec3d44ec3ea5b1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:59 np0005538513.localdomain podman[310822]: 2025-11-28 10:01:59.218395753 +0000 UTC m=+0.168561690 container init ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:59 np0005538513.localdomain podman[310822]: 2025-11-28 10:01:59.228105592 +0000 UTC m=+0.178271529 container start ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:01:59 np0005538513.localdomain dnsmasq[310840]: started, version 2.85 cachesize 150
Nov 28 10:01:59 np0005538513.localdomain dnsmasq[310840]: DNS service limited to local subnets
Nov 28 10:01:59 np0005538513.localdomain dnsmasq[310840]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:01:59 np0005538513.localdomain dnsmasq[310840]: warning: no upstream servers configured
Nov 28 10:01:59 np0005538513.localdomain dnsmasq-dhcp[310840]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:01:59 np0005538513.localdomain dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 0 addresses
Nov 28 10:01:59 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host
Nov 28 10:01:59 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts
Nov 28 10:01:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:01:59.352 261084 INFO neutron.agent.dhcp.agent [None req-a3adf305-0a5b-4653-82d7-cee7b7a2063d - - - - - -] DHCP configuration for ports {'b5a6badf-b758-4c4f-b162-a463f94ddb2e'} is completed
Nov 28 10:01:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:01:59.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:59 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:01:59.857 2 INFO neutron.agent.securitygroups_rpc [None req-34f90ada-ae7e-4d6e-90c9-94029146836e 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 do_prune osdmap full prune enabled
Nov 28 10:01:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e103 e103: 6 total, 6 up, 6 in
Nov 28 10:01:59 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e103: 6 total, 6 up, 6 in
Nov 28 10:02:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:00.010 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.682 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a6fefa1-5605-4469-acf2-a10536ee15c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.677589', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '48531d60-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'ed0b8275d3b9de0f4cfb71eefc8d699f4fe5b7a46334681c67dc041ed07fd16d'}]}, 'timestamp': '2025-11-28 10:02:00.683552', '_unique_id': 'ce27b9d1b2d5488fa93e03e94e29bbcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.686 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.699 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.700 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '850f14f7-0321-4e9e-a395-5d73e66157c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.686713', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4855a9fe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': 'a31a3b2a28d6925eeb19d5b8a60498456ee900c24abc19ce444a50eb9fec1bf3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.686713', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4855c2a4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '023f22cd9346db0800a8dce97952ff063acabcfadf3607daa4c494463d1af78b'}]}, 'timestamp': '2025-11-28 10:02:00.700890', '_unique_id': 'bdd444813d794476b533f59ae6e816ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.703 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.703 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.734 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce5b4000-2618-44b5-978f-8ccf11a9ea5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.703795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '485ae798-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '4ed2c1710d03b58f78fd3af1022f94114abe929b8a9d021addd2de1e606471ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.703795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '485b0106-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '8a441c4feeef2f1ec7f9856af711b967e24cd7ac4d001ecba32f3bd8b99e26c1'}]}, 'timestamp': '2025-11-28 10:02:00.735270', '_unique_id': '8aeca1093fcf4f91a1f1590c2c53b78c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.737 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69264395-95e8-4f92-aae2-e58e6eb5b6e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:02:00.737980', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '485dee34-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.925610406, 'message_signature': '996191eb67ea52d85432ac08196c0ac4ce64b5ac87559586a51bd15b78a90957'}]}, 'timestamp': '2025-11-28 10:02:00.754410', '_unique_id': '65c5679ac41a45d685406ef6cca41952'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.756 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eae21a89-472b-4903-af50-95bb8514d6df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.756780', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '485e5e64-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '64ca1c18b4b967d3098ac7eff89af00088f873caf6290e4a2e9c9dbcc5626acc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.756780', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '485e700c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '9c19714837de4b89c8264f64f95c5cc543b59526c7d9d76a70ca47db42b92cdd'}]}, 'timestamp': '2025-11-28 10:02:00.757655', '_unique_id': '578f3b15746648a7b956e20cc796a112'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc2e27a0-dc5b-4955-bbca-206a737f3bb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.759924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '485edaa6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '1bb2fd7c6b2b989c22488dec2e5c0f02f3092b6c3aad4b6bb25bad81b13909dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.759924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '485eeb54-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '00544548394dd27c67e0f1a399ba94306e5b3dc4f3b2d6cc0c7f6dce53ffacfa'}]}, 'timestamp': '2025-11-28 10:02:00.760812', '_unique_id': '14428ccd932b4e7cabe44ea582127024'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc4f37cb-ea4a-4ca2-b760-20eeb28de1dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.762922', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '485f5026-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'ea1bb469332eac19f16040247826fa82e0cdacfeb531d5191d5e014c67de4c84'}]}, 'timestamp': '2025-11-28 10:02:00.763435', '_unique_id': '06147871029d4d30ad928ba040add18e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '923e3cfa-18d9-4c18-a645-82e9204c6cfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.765518', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '485fb3b8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '378dc273f4e7634ccfa973409214e2ba23e43ea10baf69367cbebda6b28a1655'}]}, 'timestamp': '2025-11-28 10:02:00.765970', '_unique_id': '4a104847a46d4372be37d8cb2ad0ade6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a833ccf2-fe7b-4074-beb0-dd2a0ba07a97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.768068', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '48601786-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'edee6ffb91e1497cdaf6d5884c746c505b7cbf0f75c9ce706c2fa4eab4c883b7'}]}, 'timestamp': '2025-11-28 10:02:00.768551', '_unique_id': '4565083ce1904707883a2451fb31abdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8fee6ff-ffc8-4044-9c11-b67dde5a18d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.770609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48607a46-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': 'be9987b1301a77906dfc1d4d4b6b652d0fa6d216c6428031b2f35ec6b31fbafe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.770609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48608b8a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '29a1a7c1346ee0553297d55a5f7aef2d9958df23da1eae3a0c2a6f90273fb4d5'}]}, 'timestamp': '2025-11-28 10:02:00.771468', '_unique_id': 'fcf7db19477440d08c9e95e5277cf484'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12500514-64de-4a70-a57d-4ab72b5b3e0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.773607', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4860ef94-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'e66ce4b549e9e754799c15eeb9b9b9c820b4781116630e6e69ea490c9d4c5670'}]}, 'timestamp': '2025-11-28 10:02:00.774088', '_unique_id': 'c10008e8862f4b92a02ce57d0f97333c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af9f3725-9ab0-4d76-ad42-1636ec547845', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.776143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48615344-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': 'efd1534974420a710e84d24d57b753963e1172e6b44d59dafd4208295ef262c4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.776143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48616302-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': 'e5c1117bfad0b121efe60bf62201d2bb8b7996d978a3ab35a0ec520b36389124'}]}, 'timestamp': '2025-11-28 10:02:00.776981', '_unique_id': '685ec28cb69543fcb9546fdfcf01dcbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64be9e07-33f1-4b2d-ac9c-10b0eeb6973d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.779107', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4861c6b2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '23794143cb9be5e477e59eaea27e328b628b7b197a552456f75239baf0c08abc'}]}, 'timestamp': '2025-11-28 10:02:00.779561', '_unique_id': '2b8fea4bf6114392b33b59642c22c2ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cf1600c-325e-4e21-85cf-7678831d8ad0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.781637', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4862292c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '91e0b2c1fa6d0e40b37fb083b7dfbdda8ee69fcf4971e2e4bdd79c8f0aa20671'}]}, 'timestamp': '2025-11-28 10:02:00.782118', '_unique_id': '88a292825e1d4d9ea9827ea367b8c66a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6a2d9f2-d22b-4bef-bf80-857f730d07b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.784172', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48628d22-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '433d20ee13b167b567f0c178338c2611c14ab7fe085f2d015666aafe45d1e25d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.784172', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48629ce0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '5478a5d9b8e2c383318551a5423b4a393663f06400229807ef6b15486712c39e'}]}, 'timestamp': '2025-11-28 10:02:00.785045', '_unique_id': '51216bc176f84401b2b62a95c61ca4fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.786 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.787 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1d97f35-11f5-4bfa-a3e7-9f56facbfb53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.787284', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '48630612-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '7d68f65c7bb12e79559855b1d9186e008adcf0f180a52407cead0784ec3f8c0d'}]}, 'timestamp': '2025-11-28 10:02:00.787739', '_unique_id': '4c87b5bebceb4ba59a8e7fda64727f30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.789 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '100fd371-2f7c-4f24-a970-b01e98a5955e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.789823', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '486369fe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '77ad94ddd83dd072ac94f45a7ef0ab820df81c5b513b6cc04f4f49542c0a2130'}]}, 'timestamp': '2025-11-28 10:02:00.790300', '_unique_id': 'a003b889ff124fb99fdcf1843f830599'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 15730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e244453-4044-4467-ae2c-fc059b9c4263', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15730000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:02:00.792376', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4863cc82-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.925610406, 'message_signature': '1254a75d90f1eb619a4ae58ccd7bd56dde4454d3f9f8e7c765f70b6dd922a8a3'}]}, 'timestamp': '2025-11-28 10:02:00.792805', '_unique_id': '74a420ead7ba489dabfe5007f99200a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86caf319-d10e-4b98-8352-506082c4a869', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.794352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4864169c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '07827867b9c37b8da78c0c771ec0990274c277f3538a90acd5467e1daf590bc9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.794352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48642074-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '555fdfa734c6ebcd3ee8f0465af0998b6a78ecc9f467f471b1c5fde27d2853ff'}]}, 'timestamp': '2025-11-28 10:02:00.794863', '_unique_id': '65418ba3c5394c57a0d65daa92af1c07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15c3754d-069d-41f3-880b-961922c30441', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.796167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48645e2c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '7fc2155ca9654eafab178b28d9776fdffd69932b730d6facb16577b5043b61c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.796167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '486467fa-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': 'b74500b808f06bb6ad38581905715726861ec08f40d7ed44529319328dfaa679'}]}, 'timestamp': '2025-11-28 10:02:00.796695', '_unique_id': '2e8445f210784649aad56aa3f06dadfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40622aa4-440c-4ade-a4c1-2326bc5d4a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.797977', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4864a5c6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'd5d528b35fbb3bbf05b5803a80441de58a62c4532a21da7ce4335b582c4f08ea'}]}, 'timestamp': '2025-11-28 10:02:00.798298', '_unique_id': '58fc1e9104cb4af49925933a570afd28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:00 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:00.865 2 INFO neutron.agent.securitygroups_rpc [None req-42bd8e77-bdc1-4bfe-abe6-7d585fdf99bb 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group rule updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:00 np0005538513.localdomain ceph-mon[292954]: pgmap v130: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 241 op/s
Nov 28 10:02:00 np0005538513.localdomain ceph-mon[292954]: osdmap e103: 6 total, 6 up, 6 in
Nov 28 10:02:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:01.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:01.497 2 INFO neutron.agent.securitygroups_rpc [None req-3458faa2-903e-46ff-96c1-5776090af93b 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group rule updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:02.707 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:02Z, description=, device_id=8d6dcd20-92ab-47ad-ac9d-52244fd1b9b4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ee20a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ee2c70>], id=f270f680-72b3-4958-a0e3-4e2fbae9a975, ip_allocation=immediate, mac_address=fa:16:3e:eb:a0:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:56Z, description=, dns_domain=, id=291dc1ac-5414-4421-8e5e-126d810812c9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-1342624790-network, port_security_enabled=True, project_id=b9d5d0d5dc28445f854288051977b3d1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39438, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=777, status=ACTIVE, subnets=['72948917-b3da-47be-87d8-60087f12ee07'], tags=[], tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:01:57Z, vlan_transparent=None, network_id=291dc1ac-5414-4421-8e5e-126d810812c9, port_security_enabled=False, project_id=b9d5d0d5dc28445f854288051977b3d1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=832, status=DOWN, tags=[], tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:02:02Z on network 291dc1ac-5414-4421-8e5e-126d810812c9
Nov 28 10:02:02 np0005538513.localdomain ceph-mon[292954]: pgmap v132: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 22 KiB/s wr, 181 op/s
Nov 28 10:02:02 np0005538513.localdomain systemd[1]: tmp-crun.ux8WEm.mount: Deactivated successfully.
Nov 28 10:02:02 np0005538513.localdomain dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 1 addresses
Nov 28 10:02:02 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host
Nov 28 10:02:02 np0005538513.localdomain podman[310859]: 2025-11-28 10:02:02.936057007 +0000 UTC m=+0.068516911 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:02 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts
Nov 28 10:02:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:03.104 261084 INFO neutron.agent.dhcp.agent [None req-377252fe-49a6-4699-ab57-168d7f3b0adb - - - - - -] DHCP configuration for ports {'f270f680-72b3-4958-a0e3-4e2fbae9a975'} is completed
Nov 28 10:02:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:04Z|00151|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:04.808 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:04.842 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:05.012 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:05 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:05.082 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:02Z, description=, device_id=8d6dcd20-92ab-47ad-ac9d-52244fd1b9b4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6715340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66498b0>], id=f270f680-72b3-4958-a0e3-4e2fbae9a975, ip_allocation=immediate, mac_address=fa:16:3e:eb:a0:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:56Z, description=, dns_domain=, id=291dc1ac-5414-4421-8e5e-126d810812c9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-1342624790-network, port_security_enabled=True, project_id=b9d5d0d5dc28445f854288051977b3d1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39438, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=777, status=ACTIVE, subnets=['72948917-b3da-47be-87d8-60087f12ee07'], tags=[], tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:01:57Z, vlan_transparent=None, network_id=291dc1ac-5414-4421-8e5e-126d810812c9, port_security_enabled=False, project_id=b9d5d0d5dc28445f854288051977b3d1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=832, status=DOWN, tags=[], tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:02:02Z on network 291dc1ac-5414-4421-8e5e-126d810812c9
Nov 28 10:02:05 np0005538513.localdomain dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 1 addresses
Nov 28 10:02:05 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host
Nov 28 10:02:05 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts
Nov 28 10:02:05 np0005538513.localdomain podman[310896]: 2025-11-28 10:02:05.334159217 +0000 UTC m=+0.067952994 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:02:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:02:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:02:05 np0005538513.localdomain podman[310908]: 2025-11-28 10:02:05.47609731 +0000 UTC m=+0.111895112 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:02:05 np0005538513.localdomain podman[310909]: 2025-11-28 10:02:05.491976338 +0000 UTC m=+0.120388673 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 10:02:05 np0005538513.localdomain podman[310909]: 2025-11-28 10:02:05.50968099 +0000 UTC m=+0.138093325 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:02:05 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:02:05 np0005538513.localdomain podman[310908]: 2025-11-28 10:02:05.560108866 +0000 UTC m=+0.195906658 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:02:05 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:05.565 261084 INFO neutron.agent.dhcp.agent [None req-dc780ffc-b30d-4974-8415-da373f47dbb8 - - - - - -] DHCP configuration for ports {'f270f680-72b3-4958-a0e3-4e2fbae9a975'} is completed
Nov 28 10:02:05 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:02:05 np0005538513.localdomain ceph-mon[292954]: pgmap v133: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 149 op/s
Nov 28 10:02:06 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:06Z|00152|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:06.276 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:06 np0005538513.localdomain systemd[1]: tmp-crun.GC5j3p.mount: Deactivated successfully.
Nov 28 10:02:06 np0005538513.localdomain ceph-mon[292954]: pgmap v134: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 149 op/s
Nov 28 10:02:07 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/743986143' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:08 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:08.256 2 INFO neutron.agent.securitygroups_rpc [req-6bffedb9-405b-4a40-9982-68d686e88a5f req-5df2fd06-5333-4972-81c1-a0ccb5870973 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group member updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:08.320 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:07Z, description=, device_id=bbf7ad79-0406-4158-8a09-075ba873c1fd, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6eed970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd67c32b0>], id=72569922-3c02-4d13-b171-27f6f957e54c, ip_allocation=immediate, mac_address=fa:16:3e:23:23:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:49Z, description=, dns_domain=, id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-101229426-network, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62044, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=730, status=ACTIVE, subnets=['bac26e98-4c61-47a1-b281-0a4613971f3f'], tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:50Z, vlan_transparent=None, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6deb8732-9203-448a-b0a5-cf6a0375d009'], standard_attr_id=864, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:02:07Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba
Nov 28 10:02:08 np0005538513.localdomain dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 2 addresses
Nov 28 10:02:08 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host
Nov 28 10:02:08 np0005538513.localdomain podman[310972]: 2025-11-28 10:02:08.543169313 +0000 UTC m=+0.067901933 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:02:08 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 10:02:08 np0005538513.localdomain ceph-mon[292954]: pgmap v135: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 630 KiB/s rd, 35 KiB/s wr, 53 op/s
Nov 28 10:02:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:08.883 261084 INFO neutron.agent.dhcp.agent [None req-c3d929fd-8152-4f5c-8f84-cd745f2000df - - - - - -] DHCP configuration for ports {'72569922-3c02-4d13-b171-27f6f957e54c'} is completed
Nov 28 10:02:09 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:09.271 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005538514.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:07Z, description=, device_id=bbf7ad79-0406-4158-8a09-075ba873c1fd, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6697a60>], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66974f0>], id=72569922-3c02-4d13-b171-27f6f957e54c, ip_allocation=immediate, mac_address=fa:16:3e:23:23:a2, name=, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['6deb8732-9203-448a-b0a5-cf6a0375d009'], standard_attr_id=864, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:02:08Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba
Nov 28 10:02:09 np0005538513.localdomain dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 2 addresses
Nov 28 10:02:09 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host
Nov 28 10:02:09 np0005538513.localdomain podman[311009]: 2025-11-28 10:02:09.500754368 +0000 UTC m=+0.061086304 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:02:09 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 10:02:09 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:09.715 261084 INFO neutron.agent.dhcp.agent [None req-a706aba6-e5b7-4b45-acd4-353416be1706 - - - - - -] DHCP configuration for ports {'72569922-3c02-4d13-b171-27f6f957e54c'} is completed
Nov 28 10:02:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e103 do_prune osdmap full prune enabled
Nov 28 10:02:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e104 e104: 6 total, 6 up, 6 in
Nov 28 10:02:09 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e104: 6 total, 6 up, 6 in
Nov 28 10:02:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:09.869 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:10.016 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:02:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:02:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:02:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159336 "" "Go-http-client/1.1"
Nov 28 10:02:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:02:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20212 "" "Go-http-client/1.1"
Nov 28 10:02:10 np0005538513.localdomain ceph-mon[292954]: pgmap v136: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 630 KiB/s rd, 35 KiB/s wr, 53 op/s
Nov 28 10:02:10 np0005538513.localdomain ceph-mon[292954]: osdmap e104: 6 total, 6 up, 6 in
Nov 28 10:02:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e104 do_prune osdmap full prune enabled
Nov 28 10:02:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e105 e105: 6 total, 6 up, 6 in
Nov 28 10:02:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e105: 6 total, 6 up, 6 in
Nov 28 10:02:10 np0005538513.localdomain dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 0 addresses
Nov 28 10:02:10 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host
Nov 28 10:02:10 np0005538513.localdomain dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts
Nov 28 10:02:10 np0005538513.localdomain podman[311049]: 2025-11-28 10:02:10.979232451 +0000 UTC m=+0.063592392 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:11 np0005538513.localdomain kernel: device tap54867331-d2 left promiscuous mode
Nov 28 10:02:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:11.187 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:11 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:11Z|00153|binding|INFO|Releasing lport 54867331-d2d2-4007-8751-6825f0370005 from this chassis (sb_readonly=0)
Nov 28 10:02:11 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:11Z|00154|binding|INFO|Setting lport 54867331-d2d2-4007-8751-6825f0370005 down in Southbound
Nov 28 10:02:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:11.204 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9d5d0d5dc28445f854288051977b3d1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46ecb4cc-6f9f-41cb-ba67-522f7eda61f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=54867331-d2d2-4007-8751-6825f0370005) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:11.206 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:11.210 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 54867331-d2d2-4007-8751-6825f0370005 in datapath 291dc1ac-5414-4421-8e5e-126d810812c9 unbound from our chassis
Nov 28 10:02:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:11.215 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 291dc1ac-5414-4421-8e5e-126d810812c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:02:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:11.216 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5db0e8f2-4ce7-4faf-9a0d-3ad76f444a59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:11 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:11Z|00155|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:11.891 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e105 do_prune osdmap full prune enabled
Nov 28 10:02:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e106 e106: 6 total, 6 up, 6 in
Nov 28 10:02:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e106: 6 total, 6 up, 6 in
Nov 28 10:02:11 np0005538513.localdomain ceph-mon[292954]: osdmap e105: 6 total, 6 up, 6 in
Nov 28 10:02:11 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3484021559' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:11 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1882949366' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:12 np0005538513.localdomain ceph-mon[292954]: pgmap v139: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 6.7 MiB/s rd, 8.5 MiB/s wr, 217 op/s
Nov 28 10:02:12 np0005538513.localdomain ceph-mon[292954]: osdmap e106: 6 total, 6 up, 6 in
Nov 28 10:02:13 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:13Z|00156|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:13.531 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3925667673' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:02:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3925667673' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:02:14 np0005538513.localdomain dnsmasq[310840]: exiting on receipt of SIGTERM
Nov 28 10:02:14 np0005538513.localdomain systemd[1]: tmp-crun.LLf7Wk.mount: Deactivated successfully.
Nov 28 10:02:14 np0005538513.localdomain podman[311090]: 2025-11-28 10:02:14.088291998 +0000 UTC m=+0.074838866 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:02:14 np0005538513.localdomain systemd[1]: libpod-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381.scope: Deactivated successfully.
Nov 28 10:02:14 np0005538513.localdomain podman[311102]: 2025-11-28 10:02:14.158096168 +0000 UTC m=+0.056306804 container died ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:02:14 np0005538513.localdomain podman[311102]: 2025-11-28 10:02:14.195296394 +0000 UTC m=+0.093506960 container cleanup ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:14 np0005538513.localdomain systemd[1]: libpod-conmon-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381.scope: Deactivated successfully.
Nov 28 10:02:14 np0005538513.localdomain podman[311108]: 2025-11-28 10:02:14.221472305 +0000 UTC m=+0.106948076 container remove ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:02:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:14.473 261084 INFO neutron.agent.dhcp.agent [None req-350b1190-06e0-4de2-b6f3-8032ed6b7c95 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:14.515 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:14.835 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e106 do_prune osdmap full prune enabled
Nov 28 10:02:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:14.904 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e107 e107: 6 total, 6 up, 6 in
Nov 28 10:02:14 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e107: 6 total, 6 up, 6 in
Nov 28 10:02:14 np0005538513.localdomain ceph-mon[292954]: pgmap v141: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 11 MiB/s wr, 200 op/s
Nov 28 10:02:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/915675261' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:14 np0005538513.localdomain ceph-mon[292954]: osdmap e107: 6 total, 6 up, 6 in
Nov 28 10:02:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e086de59da47ef628d8f8ee81d1bcf4e96528abb7fd65b1cc4ec3d44ec3ea5b1-merged.mount: Deactivated successfully.
Nov 28 10:02:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381-userdata-shm.mount: Deactivated successfully.
Nov 28 10:02:15 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d291dc1ac\x2d5414\x2d4421\x2d8e5e\x2d126d810812c9.mount: Deactivated successfully.
Nov 28 10:02:15 np0005538513.localdomain podman[311132]: 2025-11-28 10:02:15.115703748 +0000 UTC m=+0.103037254 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:02:15 np0005538513.localdomain podman[311132]: 2025-11-28 10:02:15.129241256 +0000 UTC m=+0.116574782 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm)
Nov 28 10:02:15 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.186 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.187 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.188 279685 INFO nova.compute.manager [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Unshelving
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.296 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.297 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.302 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.316 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.327 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.328 279685 INFO nova.compute.claims [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Claim successful on node np0005538513.localdomain
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.446 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:15.733 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:02:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2145567881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.973 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:15.983 279685 DEBUG nova.compute.provider_tree [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:02:15 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2145567881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.005 279685 DEBUG nova.scheduler.client.report [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.023925) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136024088, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 887, "num_deletes": 254, "total_data_size": 689676, "memory_usage": 705848, "flush_reason": "Manual Compaction"}
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.027 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136032934, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 674308, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24734, "largest_seqno": 25620, "table_properties": {"data_size": 670184, "index_size": 1787, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10053, "raw_average_key_size": 20, "raw_value_size": 661649, "raw_average_value_size": 1358, "num_data_blocks": 78, "num_entries": 487, "num_filter_entries": 487, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324089, "oldest_key_time": 1764324089, "file_creation_time": 1764324136, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 9114 microseconds, and 4803 cpu microseconds.
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.033004) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 674308 bytes OK
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.033087) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035342) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035366) EVENT_LOG_v1 {"time_micros": 1764324136035358, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035399) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 685289, prev total WAL file size 685289, number of live WAL files 2.
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.036100) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(658KB)], [42(17MB)]
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136036168, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 18755063, "oldest_snapshot_seqno": -1}
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.066 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.067 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.067 279685 DEBUG nova.network.neutron [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.115 279685 DEBUG nova.network.neutron [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12045 keys, 16120567 bytes, temperature: kUnknown
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136128247, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 16120567, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16054212, "index_size": 35150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 323922, "raw_average_key_size": 26, "raw_value_size": 15851387, "raw_average_value_size": 1316, "num_data_blocks": 1327, "num_entries": 12045, "num_filter_entries": 12045, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324136, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.128783) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 16120567 bytes
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.130879) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.5 rd, 174.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 17.2 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(51.7) write-amplify(23.9) OK, records in: 12570, records dropped: 525 output_compression: NoCompression
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.130911) EVENT_LOG_v1 {"time_micros": 1764324136130898, "job": 24, "event": "compaction_finished", "compaction_time_micros": 92182, "compaction_time_cpu_micros": 41068, "output_level": 6, "num_output_files": 1, "total_output_size": 16120567, "num_input_records": 12570, "num_output_records": 12045, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136131146, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136132931, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.283 279685 DEBUG nova.network.neutron [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.298 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.300 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.301 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating image(s)
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.342 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.348 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.398 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.441 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.447 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "e38b87c2132e46140fd3315a4a63ceb8d23e3d74" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.448 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "e38b87c2132e46140fd3315a4a63ceb8d23e3d74" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.505 279685 DEBUG nova.virt.libvirt.imagebackend [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image locations are: [{'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/a2def208-be38-4da4-a3f2-d5c5045455ca/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/a2def208-be38-4da4-a3f2-d5c5045455ca/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.596 279685 DEBUG nova.virt.libvirt.imagebackend [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Selected location: {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/a2def208-be38-4da4-a3f2-d5c5045455ca/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.597 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] cloning images/a2def208-be38-4da4-a3f2-d5c5045455ca@snap to None/7292509e-f294-4159-96e5-22d4712df2a0_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 28 10:02:16 np0005538513.localdomain sudo[311249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:02:16 np0005538513.localdomain sudo[311249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:02:16 np0005538513.localdomain sudo[311249]: pam_unix(sudo:session): session closed for user root
Nov 28 10:02:16 np0005538513.localdomain sudo[311309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:02:16 np0005538513.localdomain sudo[311309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:02:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:16.824 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "e38b87c2132e46140fd3315a4a63ceb8d23e3d74" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:17 np0005538513.localdomain ceph-mon[292954]: pgmap v143: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 12 MiB/s wr, 204 op/s
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.101 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'migration_context' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.203 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] flattening vms/7292509e-f294-4159-96e5-22d4712df2a0_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 28 10:02:17 np0005538513.localdomain sudo[311309]: pam_unix(sudo:session): session closed for user root
Nov 28 10:02:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:02:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:02:17 np0005538513.localdomain sudo[311455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:02:17 np0005538513.localdomain sudo[311455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:02:17 np0005538513.localdomain sudo[311455]: pam_unix(sudo:session): session closed for user root
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.948 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Image rbd:vms/7292509e-f294-4159-96e5-22d4712df2a0_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.948 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.949 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Ensure instance console log exists: /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.949 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.950 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.950 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.953 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-11-28T10:01:56Z,direct_url=<?>,disk_format='raw',id=a2def208-be38-4da4-a3f2-d5c5045455ca,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-650509197-shelved',owner='a30386ba68ee46f4a1bac43cf415f3a4',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-11-28T10:02:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'image_id': '85968a96-5a0e-43a4-9c04-3954f640a7ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.960 279685 WARNING nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.963 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.964 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.966 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.967 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.967 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.968 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T09:59:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98f289d4-5c06-4ab5-9089-7b580870d676',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-11-28T10:01:56Z,direct_url=<?>,disk_format='raw',id=a2def208-be38-4da4-a3f2-d5c5045455ca,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-650509197-shelved',owner='a30386ba68ee46f4a1bac43cf415f3a4',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-11-28T10:02:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.968 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.969 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.969 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.970 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.970 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.970 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.971 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.971 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.972 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.972 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.973 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:17.998 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:02:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:02:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:02:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:02:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:02:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:02:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:02:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:02:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3129734523' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:18.521 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:18.566 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:02:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:18.573 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3253647983' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.036 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.039 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e107 do_prune osdmap full prune enabled
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: pgmap v144: 177 pgs: 177 active+clean; 273 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 10 MiB/s rd, 10 MiB/s wr, 387 op/s
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3129734523' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3253647983' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.063 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] End _get_guest_xml xml=<domain type="kvm">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <uuid>7292509e-f294-4159-96e5-22d4712df2a0</uuid>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <name>instance-00000007</name>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <memory>131072</memory>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <vcpu>1</vcpu>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <metadata>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-650509197</nova:name>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <nova:creationTime>2025-11-28 10:02:17</nova:creationTime>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <nova:flavor name="m1.nano">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <nova:memory>128</nova:memory>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <nova:disk>1</nova:disk>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <nova:swap>0</nova:swap>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <nova:vcpus>1</nova:vcpus>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       </nova:flavor>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <nova:owner>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <nova:user uuid="28578129c91d407a92af609ba8bac430">tempest-UnshelveToHostMultiNodesTest-426973173-project-member</nova:user>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <nova:project uuid="a30386ba68ee46f4a1bac43cf415f3a4">tempest-UnshelveToHostMultiNodesTest-426973173</nova:project>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       </nova:owner>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <nova:root type="image" uuid="a2def208-be38-4da4-a3f2-d5c5045455ca"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <nova:ports/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </nova:instance>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   </metadata>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <sysinfo type="smbios">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <system>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <entry name="manufacturer">RDO</entry>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <entry name="product">OpenStack Compute</entry>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <entry name="serial">7292509e-f294-4159-96e5-22d4712df2a0</entry>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <entry name="uuid">7292509e-f294-4159-96e5-22d4712df2a0</entry>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <entry name="family">Virtual Machine</entry>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </system>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   </sysinfo>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <os>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <boot dev="hd"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <smbios mode="sysinfo"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   </os>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <features>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <acpi/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <apic/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <vmcoreinfo/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   </features>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <clock offset="utc">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <timer name="hpet" present="no"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   </clock>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <cpu mode="host-model" match="exact">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   </cpu>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   <devices>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <disk type="network" device="disk">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <driver type="raw" cache="none"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <source protocol="rbd" name="vms/7292509e-f294-4159-96e5-22d4712df2a0_disk">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       </source>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <auth username="openstack">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       </auth>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <target dev="vda" bus="virtio"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <disk type="network" device="cdrom">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <driver type="raw" cache="none"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <source protocol="rbd" name="vms/7292509e-f294-4159-96e5-22d4712df2a0_disk.config">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       </source>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <auth username="openstack">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       </auth>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <target dev="sda" bus="sata"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </disk>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <serial type="pty">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <log file="/var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/console.log" append="off"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </serial>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <video>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <model type="virtio"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </video>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <input type="tablet" bus="usb"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <input type="keyboard" bus="usb"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <rng model="virtio">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <backend model="random">/dev/urandom</backend>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </rng>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <controller type="usb" index="0"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     <memballoon model="virtio">
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:       <stats period="10"/>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:     </memballoon>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:   </devices>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: </domain>
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e108 e108: 6 total, 6 up, 6 in
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e108: 6 total, 6 up, 6 in
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.134 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.135 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.136 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Using config drive
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.177 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.210 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.258 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'keypairs' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.324 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating config drive at /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.333 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwx8dh7be execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.467 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwx8dh7be" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.514 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.519 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.741 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.742 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting local config drive /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config because it was imported into RBD.
Nov 28 10:02:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:02:19 np0005538513.localdomain systemd-machined[83422]: New machine qemu-5-instance-00000007.
Nov 28 10:02:19 np0005538513.localdomain systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Nov 28 10:02:19 np0005538513.localdomain podman[311593]: 2025-11-28 10:02:19.864095777 +0000 UTC m=+0.097645859 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:02:19 np0005538513.localdomain podman[311593]: 2025-11-28 10:02:19.879498488 +0000 UTC m=+0.113048600 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:02:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:19 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:02:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:19.941 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.027 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e108 do_prune osdmap full prune enabled
Nov 28 10:02:20 np0005538513.localdomain ceph-mon[292954]: osdmap e108: 6 total, 6 up, 6 in
Nov 28 10:02:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e109 e109: 6 total, 6 up, 6 in
Nov 28 10:02:20 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e109: 6 total, 6 up, 6 in
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.213 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324140.2126472, 7292509e-f294-4159-96e5-22d4712df2a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.213 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Resumed (Lifecycle Event)
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.217 279685 DEBUG nova.compute.manager [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.218 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.224 279685 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance spawned successfully.
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.238 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.242 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.270 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.271 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event <LifecycleEvent: 1764324140.2198365, 7292509e-f294-4159-96e5-22d4712df2a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.271 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Started (Lifecycle Event)
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.292 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.297 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:02:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:20.318 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:02:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:20.442 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:02:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:02:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e109 do_prune osdmap full prune enabled
Nov 28 10:02:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e110 e110: 6 total, 6 up, 6 in
Nov 28 10:02:21 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e110: 6 total, 6 up, 6 in
Nov 28 10:02:21 np0005538513.localdomain ceph-mon[292954]: pgmap v146: 177 pgs: 177 active+clean; 273 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 3.0 MiB/s rd, 25 KiB/s wr, 182 op/s
Nov 28 10:02:21 np0005538513.localdomain ceph-mon[292954]: osdmap e109: 6 total, 6 up, 6 in
Nov 28 10:02:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:02:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:21.731 279685 DEBUG nova.compute.manager [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:21.812 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 6.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:21.908 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e110 do_prune osdmap full prune enabled
Nov 28 10:02:22 np0005538513.localdomain ceph-mon[292954]: osdmap e110: 6 total, 6 up, 6 in
Nov 28 10:02:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e111 e111: 6 total, 6 up, 6 in
Nov 28 10:02:22 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e111: 6 total, 6 up, 6 in
Nov 28 10:02:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:02:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:02:22 np0005538513.localdomain podman[311673]: 2025-11-28 10:02:22.847169824 +0000 UTC m=+0.072381715 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Nov 28 10:02:22 np0005538513.localdomain systemd[1]: tmp-crun.wVsfFi.mount: Deactivated successfully.
Nov 28 10:02:22 np0005538513.localdomain podman[311674]: 2025-11-28 10:02:22.956394454 +0000 UTC m=+0.182396177 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:02:22 np0005538513.localdomain podman[311673]: 2025-11-28 10:02:22.990662116 +0000 UTC m=+0.215874057 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:02:23 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:02:23 np0005538513.localdomain podman[311674]: 2025-11-28 10:02:23.043450188 +0000 UTC m=+0.269451961 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:23 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:02:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e111 do_prune osdmap full prune enabled
Nov 28 10:02:23 np0005538513.localdomain ceph-mon[292954]: pgmap v149: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 13 MiB/s rd, 7.8 MiB/s wr, 572 op/s
Nov 28 10:02:23 np0005538513.localdomain ceph-mon[292954]: osdmap e111: 6 total, 6 up, 6 in
Nov 28 10:02:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e112 e112: 6 total, 6 up, 6 in
Nov 28 10:02:23 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e112: 6 total, 6 up, 6 in
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.203 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.204 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.204 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.205 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.205 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.208 279685 INFO nova.compute.manager [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Terminating instance
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.209 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.210 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.211 279685 DEBUG nova.network.neutron [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.433 279685 DEBUG nova.network.neutron [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.607 279685 DEBUG nova.network.neutron [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.641 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.642 279685 DEBUG nova.compute.manager [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 10:02:23 np0005538513.localdomain systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 28 10:02:23 np0005538513.localdomain systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 4.092s CPU time.
Nov 28 10:02:23 np0005538513.localdomain systemd-machined[83422]: Machine qemu-5-instance-00000007 terminated.
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.866 279685 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.
Nov 28 10:02:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:23.867 279685 DEBUG nova.objects.instance [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'resources' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e112 do_prune osdmap full prune enabled
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: osdmap e112: 6 total, 6 up, 6 in
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e113 e113: 6 total, 6 up, 6 in
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e113: 6 total, 6 up, 6 in
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.878 279685 INFO nova.virt.libvirt.driver [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting instance files /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.879 279685 INFO nova.virt.libvirt.driver [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deletion of /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del complete
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e113 do_prune osdmap full prune enabled
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e114 e114: 6 total, 6 up, 6 in
Nov 28 10:02:24 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e114: 6 total, 6 up, 6 in
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.932 279685 INFO nova.compute.manager [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Took 1.29 seconds to destroy the instance on the hypervisor.
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.933 279685 DEBUG oslo.service.loopingcall [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.933 279685 DEBUG nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.933 279685 DEBUG nova.network.neutron [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.969 279685 DEBUG nova.network.neutron [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.978 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:24.985 279685 DEBUG nova.network.neutron [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.002 279685 INFO nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Took 0.07 seconds to deallocate network for instance.
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.028 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.052 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.053 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.174 279685 DEBUG oslo_concurrency.processutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:25 np0005538513.localdomain ceph-mon[292954]: pgmap v152: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 13 MiB/s rd, 12 MiB/s wr, 504 op/s
Nov 28 10:02:25 np0005538513.localdomain ceph-mon[292954]: osdmap e113: 6 total, 6 up, 6 in
Nov 28 10:02:25 np0005538513.localdomain ceph-mon[292954]: osdmap e114: 6 total, 6 up, 6 in
Nov 28 10:02:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:02:25 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2043476067' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.628 279685 DEBUG oslo_concurrency.processutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.636 279685 DEBUG nova.compute.provider_tree [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.652 279685 DEBUG nova.scheduler.client.report [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.674 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.710 279685 INFO nova.scheduler.client.report [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Deleted allocations for instance 7292509e-f294-4159-96e5-22d4712df2a0
Nov 28 10:02:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:02:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:25.787 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:25 np0005538513.localdomain podman[311759]: 2025-11-28 10:02:25.853878828 +0000 UTC m=+0.092943745 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:02:25 np0005538513.localdomain podman[311759]: 2025-11-28 10:02:25.889808877 +0000 UTC m=+0.128873794 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 10:02:25 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:02:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e114 do_prune osdmap full prune enabled
Nov 28 10:02:26 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2043476067' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e115 e115: 6 total, 6 up, 6 in
Nov 28 10:02:26 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e115: 6 total, 6 up, 6 in
Nov 28 10:02:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:27.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e115 do_prune osdmap full prune enabled
Nov 28 10:02:27 np0005538513.localdomain ceph-mon[292954]: pgmap v155: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:02:27 np0005538513.localdomain ceph-mon[292954]: osdmap e115: 6 total, 6 up, 6 in
Nov 28 10:02:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e116 e116: 6 total, 6 up, 6 in
Nov 28 10:02:27 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e116: 6 total, 6 up, 6 in
Nov 28 10:02:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e116 do_prune osdmap full prune enabled
Nov 28 10:02:28 np0005538513.localdomain ceph-mon[292954]: osdmap e116: 6 total, 6 up, 6 in
Nov 28 10:02:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e117 e117: 6 total, 6 up, 6 in
Nov 28 10:02:28 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e117: 6 total, 6 up, 6 in
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e117 do_prune osdmap full prune enabled
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: pgmap v158: 177 pgs: 177 active+clean; 225 MiB data, 886 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 6.4 MiB/s wr, 555 op/s
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: osdmap e117: 6 total, 6 up, 6 in
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e118 e118: 6 total, 6 up, 6 in
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e118: 6 total, 6 up, 6 in
Nov 28 10:02:29 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:29.863 2 INFO neutron.agent.securitygroups_rpc [None req-163713b6-af4d-4d16-9097-b3cd54a25f68 078cec78b66d44acb2dcf304e572f2cf dba68040958c4e4c89f84cd27a771cd2 - - default default] Security group member updated ['a0bf5ab5-c355-48ac-a40e-9473d4858766']
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e118 do_prune osdmap full prune enabled
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e119 e119: 6 total, 6 up, 6 in
Nov 28 10:02:29 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e119: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:30.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:30.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:30 np0005538513.localdomain ceph-mon[292954]: osdmap e118: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538513.localdomain ceph-mon[292954]: osdmap e119: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:30.382 2 INFO neutron.agent.securitygroups_rpc [None req-59eaff10-1680-4aeb-97dc-49cab4063acc 078cec78b66d44acb2dcf304e572f2cf dba68040958c4e4c89f84cd27a771cd2 - - default default] Security group member updated ['a0bf5ab5-c355-48ac-a40e-9473d4858766']
Nov 28 10:02:30 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:30.411 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:30.415 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e119 do_prune osdmap full prune enabled
Nov 28 10:02:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e120 e120: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e120: 6 total, 6 up, 6 in
Nov 28 10:02:31 np0005538513.localdomain ceph-mon[292954]: pgmap v161: 177 pgs: 177 active+clean; 225 MiB data, 886 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 6.4 MiB/s wr, 555 op/s
Nov 28 10:02:31 np0005538513.localdomain ceph-mon[292954]: osdmap e120: 6 total, 6 up, 6 in
Nov 28 10:02:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e120 do_prune osdmap full prune enabled
Nov 28 10:02:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e121 e121: 6 total, 6 up, 6 in
Nov 28 10:02:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e121: 6 total, 6 up, 6 in
Nov 28 10:02:33 np0005538513.localdomain ceph-mon[292954]: pgmap v164: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 203 KiB/s rd, 51 KiB/s wr, 281 op/s
Nov 28 10:02:33 np0005538513.localdomain ceph-mon[292954]: osdmap e121: 6 total, 6 up, 6 in
Nov 28 10:02:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:33.965 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:34.690 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e121 do_prune osdmap full prune enabled
Nov 28 10:02:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e122 e122: 6 total, 6 up, 6 in
Nov 28 10:02:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e122: 6 total, 6 up, 6 in
Nov 28 10:02:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:35.063 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:35.067 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:35 np0005538513.localdomain ceph-mon[292954]: pgmap v166: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 188 KiB/s rd, 48 KiB/s wr, 260 op/s
Nov 28 10:02:35 np0005538513.localdomain ceph-mon[292954]: osdmap e122: 6 total, 6 up, 6 in
Nov 28 10:02:35 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:35Z|00157|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:35.538 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:02:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:02:35 np0005538513.localdomain podman[311778]: 2025-11-28 10:02:35.858519463 +0000 UTC m=+0.092535552 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:02:35 np0005538513.localdomain podman[311778]: 2025-11-28 10:02:35.893835376 +0000 UTC m=+0.127851485 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:02:35 np0005538513.localdomain podman[311779]: 2025-11-28 10:02:35.907326393 +0000 UTC m=+0.137271115 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:02:35 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:02:35 np0005538513.localdomain podman[311779]: 2025-11-28 10:02:35.922537608 +0000 UTC m=+0.152482370 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 28 10:02:35 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:02:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:36.357 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:37 np0005538513.localdomain ceph-mon[292954]: pgmap v168: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 141 KiB/s rd, 36 KiB/s wr, 195 op/s
Nov 28 10:02:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:37.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:37.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:02:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:38.864 279685 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764324143.8623488, 7292509e-f294-4159-96e5-22d4712df2a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:02:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:38.864 279685 INFO nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Stopped (Lifecycle Event)
Nov 28 10:02:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:38.888 279685 DEBUG nova.compute.manager [None req-92031a13-6d60-4444-b1e9-b0c375630dd1 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:02:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 3084 writes, 26K keys, 3084 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.08 MB/s
                                                           Cumulative WAL: 3084 writes, 3084 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 3084 writes, 26K keys, 3084 commit groups, 1.0 writes per commit group, ingest: 46.97 MB, 0.08 MB/s
                                                           Interval WAL: 3084 writes, 3084 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    136.2      0.25              0.09        12    0.021       0      0       0.0       0.0
                                                             L6      1/0   15.37 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   5.3    168.3    152.5      1.19              0.47        11    0.108    128K   5632       0.0       0.0
                                                            Sum      1/0   15.37 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   6.3    139.0    149.7      1.44              0.56        23    0.062    128K   5632       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   6.3    139.4    150.1      1.43              0.56        22    0.065    128K   5632       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    168.3    152.5      1.19              0.47        11    0.108    128K   5632       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    138.6      0.25              0.09        11    0.022       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.033, interval 0.033
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.4 seconds
                                                           Interval compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.4 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55b5bb679350#2 capacity: 308.00 MB usage: 46.03 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000342 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(3114,45.19 MB,14.6732%) FilterBlock(23,374.42 KB,0.118716%) IndexBlock(23,485.39 KB,0.153901%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 10:02:39 np0005538513.localdomain ceph-mon[292954]: pgmap v169: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 138 KiB/s rd, 32 KiB/s wr, 191 op/s
Nov 28 10:02:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:39.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e122 do_prune osdmap full prune enabled
Nov 28 10:02:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e123 e123: 6 total, 6 up, 6 in
Nov 28 10:02:39 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e123: 6 total, 6 up, 6 in
Nov 28 10:02:39 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:39.942 2 INFO neutron.agent.securitygroups_rpc [None req-c410e527-579f-4d7d-bb14-04bb4c79dd9f b97430f38d544448bcb1f84d60affd50 f23b7feb8db740db9eea6302444ed3a8 - - default default] Security group member updated ['84bc6ad8-56a1-4678-950f-738b55ff6708']
Nov 28 10:02:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:02:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:02:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:40.089 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:02:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:02:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:02:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19749 "" "Go-http-client/1.1"
Nov 28 10:02:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:40.747 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:40.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:40 np0005538513.localdomain ceph-mon[292954]: pgmap v170: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 1023 B/s wr, 20 op/s
Nov 28 10:02:40 np0005538513.localdomain ceph-mon[292954]: osdmap e123: 6 total, 6 up, 6 in
Nov 28 10:02:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:41.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e123 do_prune osdmap full prune enabled
Nov 28 10:02:42 np0005538513.localdomain ceph-mon[292954]: pgmap v172: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 6.5 KiB/s wr, 20 op/s
Nov 28 10:02:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e124 e124: 6 total, 6 up, 6 in
Nov 28 10:02:42 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e124: 6 total, 6 up, 6 in
Nov 28 10:02:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:43.256 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:43.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:43.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:43.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:43.795 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:43.795 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:02:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:43.796 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:43 np0005538513.localdomain ceph-mon[292954]: osdmap e124: 6 total, 6 up, 6 in
Nov 28 10:02:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:02:44 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3290241926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.257 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.335 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.335 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.546 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.548 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11201MB free_disk=41.700096130371094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.549 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.549 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.598 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:44.599 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:44.600 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.701 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.702 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.703 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:02:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:44.759 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:44 np0005538513.localdomain ceph-mon[292954]: pgmap v174: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 6.5 KiB/s wr, 20 op/s
Nov 28 10:02:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3290241926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2317602557' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:45.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:02:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1391681796' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:45.180 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:45.186 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:02:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:45.212 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:02:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:45.247 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:02:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:45.248 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:02:45 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:45.846 2 INFO neutron.agent.securitygroups_rpc [None req-7370f7f5-c105-405f-816d-670eb41986b4 b97430f38d544448bcb1f84d60affd50 f23b7feb8db740db9eea6302444ed3a8 - - default default] Security group member updated ['84bc6ad8-56a1-4678-950f-738b55ff6708']
Nov 28 10:02:45 np0005538513.localdomain systemd[1]: tmp-crun.jbaE0o.mount: Deactivated successfully.
Nov 28 10:02:45 np0005538513.localdomain podman[311866]: 2025-11-28 10:02:45.874282158 +0000 UTC m=+0.101472490 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 10:02:45 np0005538513.localdomain podman[311866]: 2025-11-28 10:02:45.916644501 +0000 UTC m=+0.143834863 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 28 10:02:45 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:02:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1391681796' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3089492664' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3168695736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:46 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:46Z|00158|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:46.578 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:46.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:47 np0005538513.localdomain ceph-mon[292954]: pgmap v175: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 5.5 KiB/s wr, 0 op/s
Nov 28 10:02:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/438558575' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e124 do_prune osdmap full prune enabled
Nov 28 10:02:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 e125: 6 total, 6 up, 6 in
Nov 28 10:02:48 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e125: 6 total, 6 up, 6 in
Nov 28 10:02:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:02:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:02:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:02:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:02:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:02:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:02:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:48.249 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:48.250 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:02:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:48.302 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:02:49 np0005538513.localdomain ceph-mon[292954]: pgmap v176: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 9.0 KiB/s wr, 19 op/s
Nov 28 10:02:49 np0005538513.localdomain ceph-mon[292954]: osdmap e125: 6 total, 6 up, 6 in
Nov 28 10:02:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:49Z|00159|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:49.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:50.096 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:50Z|00160|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:02:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:50.391 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:02:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:50 np0005538513.localdomain systemd[1]: tmp-crun.7UaIji.mount: Deactivated successfully.
Nov 28 10:02:50 np0005538513.localdomain podman[311885]: 2025-11-28 10:02:50.849153167 +0000 UTC m=+0.085515032 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:02:50 np0005538513.localdomain podman[311885]: 2025-11-28 10:02:50.88453118 +0000 UTC m=+0.120893035 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:02:50 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:02:51 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:51.013 2 INFO neutron.agent.securitygroups_rpc [req-bb7f0ac8-504e-4783-80de-f00563b1098a req-aad0b688-0986-452a-b92d-7d53ff4d1361 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group member updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:51 np0005538513.localdomain ceph-mon[292954]: pgmap v178: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 3.5 KiB/s wr, 18 op/s
Nov 28 10:02:51 np0005538513.localdomain dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 1 addresses
Nov 28 10:02:51 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host
Nov 28 10:02:51 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 10:02:51 np0005538513.localdomain podman[311925]: 2025-11-28 10:02:51.29074383 +0000 UTC m=+0.048290515 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2372670749' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:52.574 261084 INFO neutron.agent.linux.ip_lib [None req-03bc2737-23b5-4d2f-80f4-f2b9089fd4e7 - - - - - -] Device tap8c96f24c-a8 cannot be used as it has no MAC address
Nov 28 10:02:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:52.632 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:52 np0005538513.localdomain kernel: device tap8c96f24c-a8 entered promiscuous mode
Nov 28 10:02:52 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324172.6392] manager: (tap8c96f24c-a8): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Nov 28 10:02:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:52Z|00161|binding|INFO|Claiming lport 8c96f24c-a809-498c-a368-b04d504c0694 for this chassis.
Nov 28 10:02:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:52Z|00162|binding|INFO|8c96f24c-a809-498c-a368-b04d504c0694: Claiming unknown
Nov 28 10:02:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:52.639 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:52 np0005538513.localdomain systemd-udevd[311956]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:52Z|00163|binding|INFO|Setting lport 8c96f24c-a809-498c-a368-b04d504c0694 ovn-installed in OVS
Nov 28 10:02:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:52.678 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:52Z|00164|binding|INFO|Setting lport 8c96f24c-a809-498c-a368-b04d504c0694 up in Southbound
Nov 28 10:02:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:52.701 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cfa67078b8440d0bf985b2a5e0e5558', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3a1ad44-5623-4936-b8c1-f0d1c2dea95d, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=8c96f24c-a809-498c-a368-b04d504c0694) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:52.704 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8c96f24c-a809-498c-a368-b04d504c0694 in datapath 2df2b9d7-92bb-4c3f-a2c7-b313541a7942 bound to our chassis
Nov 28 10:02:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:52.706 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:02:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:52.707 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[40788f2e-889a-407c-9606-c131620f6e42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device
Nov 28 10:02:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:52.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:52.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:53 np0005538513.localdomain ceph-mon[292954]: pgmap v179: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 6.4 KiB/s wr, 84 op/s
Nov 28 10:02:53 np0005538513.localdomain podman[312027]: 
Nov 28 10:02:53 np0005538513.localdomain podman[312027]: 2025-11-28 10:02:53.598393163 +0000 UTC m=+0.093867012 container create 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:02:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:02:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:02:53 np0005538513.localdomain systemd[1]: Started libpod-conmon-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31.scope.
Nov 28 10:02:53 np0005538513.localdomain systemd[1]: tmp-crun.inI0BC.mount: Deactivated successfully.
Nov 28 10:02:53 np0005538513.localdomain podman[312027]: 2025-11-28 10:02:53.551744106 +0000 UTC m=+0.047217995 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:02:53 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:02:53 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b372ebb8673178fa80e5f637587aa00b946a13d3ac306690ce5e3c373a5ccdef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:02:53 np0005538513.localdomain podman[312041]: 2025-11-28 10:02:53.725429732 +0000 UTC m=+0.090675649 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 10:02:53 np0005538513.localdomain podman[312027]: 2025-11-28 10:02:53.733066241 +0000 UTC m=+0.228540090 container init 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:53 np0005538513.localdomain podman[312027]: 2025-11-28 10:02:53.746884707 +0000 UTC m=+0.242358546 container start 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:02:53 np0005538513.localdomain dnsmasq[312077]: started, version 2.85 cachesize 150
Nov 28 10:02:53 np0005538513.localdomain dnsmasq[312077]: DNS service limited to local subnets
Nov 28 10:02:53 np0005538513.localdomain dnsmasq[312077]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:02:53 np0005538513.localdomain dnsmasq[312077]: warning: no upstream servers configured
Nov 28 10:02:53 np0005538513.localdomain dnsmasq-dhcp[312077]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:02:53 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 0 addresses
Nov 28 10:02:53 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:02:53 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:02:53 np0005538513.localdomain podman[312042]: 2025-11-28 10:02:53.806897577 +0000 UTC m=+0.171715371 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 10:02:53 np0005538513.localdomain podman[312041]: 2025-11-28 10:02:53.840418037 +0000 UTC m=+0.205663944 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 10:02:53 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:02:53 np0005538513.localdomain podman[312042]: 2025-11-28 10:02:53.891057718 +0000 UTC m=+0.255875462 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:02:53 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:53.890 2 INFO neutron.agent.securitygroups_rpc [None req-ca5b8c5c-4a7b-4773-b7e8-8e9eb8c79737 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:53 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:02:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:53.903 261084 INFO neutron.agent.dhcp.agent [None req-093128e4-5820-4fa4-94fb-6d9383a381b9 - - - - - -] DHCP configuration for ports {'a9422cdc-c436-4a76-bbaf-9159623fa972'} is completed
Nov 28 10:02:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:53.917 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:53Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ec4ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ec4550>], id=d6c2b851-701e-4d45-b5a9-391ca9c93d44, ip_allocation=immediate, mac_address=fa:16:3e:4b:a6:76, name=tempest-AllowedAddressPairTestJSON-485609447, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1154, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:53Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942
Nov 28 10:02:54 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses
Nov 28 10:02:54 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:02:54 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:02:54 np0005538513.localdomain podman[312106]: 2025-11-28 10:02:54.125476256 +0000 UTC m=+0.070756679 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:02:54 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:54.337 261084 INFO neutron.agent.dhcp.agent [None req-13c0a229-aff3-4670-960a-93731a131398 - - - - - -] DHCP configuration for ports {'d6c2b851-701e-4d45-b5a9-391ca9c93d44'} is completed
Nov 28 10:02:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:54.604 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:02:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 do_prune osdmap full prune enabled
Nov 28 10:02:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 e126: 6 total, 6 up, 6 in
Nov 28 10:02:54 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e126: 6 total, 6 up, 6 in
Nov 28 10:02:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:55.128 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:55 np0005538513.localdomain ceph-mon[292954]: pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 5.6 KiB/s wr, 73 op/s
Nov 28 10:02:55 np0005538513.localdomain ceph-mon[292954]: osdmap e126: 6 total, 6 up, 6 in
Nov 28 10:02:55 np0005538513.localdomain dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 0 addresses
Nov 28 10:02:55 np0005538513.localdomain podman[312143]: 2025-11-28 10:02:55.41617684 +0000 UTC m=+0.065287122 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:02:55 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host
Nov 28 10:02:55 np0005538513.localdomain dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 10:02:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:55.723 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:55 np0005538513.localdomain kernel: device tap516917c4-99 left promiscuous mode
Nov 28 10:02:55 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:55Z|00165|binding|INFO|Releasing lport 516917c4-995e-4297-af25-c4f8499fcc7d from this chassis (sb_readonly=0)
Nov 28 10:02:55 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:02:55Z|00166|binding|INFO|Setting lport 516917c4-995e-4297-af25-c4f8499fcc7d down in Southbound
Nov 28 10:02:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:55.738 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f9b84b894e641c4bee3ebcd1409ad9f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4106ac0-e782-4268-8bb4-37fc3096f0bc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=516917c4-995e-4297-af25-c4f8499fcc7d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:55.740 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 516917c4-995e-4297-af25-c4f8499fcc7d in datapath b1696f4c-80ce-491f-ad1c-cc7f5b6700ba unbound from our chassis
Nov 28 10:02:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:55.742 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:02:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:02:55.743 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:02:55.745 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[80c990ad-9bc1-4866-b35d-79c4592f8261]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:55 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:55.789 2 INFO neutron.agent.securitygroups_rpc [None req-0a1122e3-48a9-4fdd-9791-f33fb613b799 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:55 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:55.837 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66497c0>], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:55Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6649cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6649d60>], id=6391e909-e95a-4e76-b60a-9bc64a9f9f1b, ip_allocation=immediate, mac_address=fa:16:3e:b5:5b:ef, name=tempest-AllowedAddressPairTestJSON-1597087428, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1175, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:55Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942
Nov 28 10:02:56 np0005538513.localdomain podman[312184]: 2025-11-28 10:02:56.075528852 +0000 UTC m=+0.062667836 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:56 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses
Nov 28 10:02:56 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:02:56 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:02:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:02:56 np0005538513.localdomain podman[312197]: 2025-11-28 10:02:56.190355843 +0000 UTC m=+0.086587733 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:02:56 np0005538513.localdomain podman[312197]: 2025-11-28 10:02:56.206571358 +0000 UTC m=+0.102803248 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:02:56 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:02:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:56.625 261084 INFO neutron.agent.dhcp.agent [None req-77bcac77-fb24-4edb-b698-1e97c40ced18 - - - - - -] DHCP configuration for ports {'6391e909-e95a-4e76-b60a-9bc64a9f9f1b'} is completed
Nov 28 10:02:57 np0005538513.localdomain ceph-mon[292954]: pgmap v182: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 3.5 KiB/s wr, 73 op/s
Nov 28 10:02:57 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:57.780 2 INFO neutron.agent.securitygroups_rpc [None req-2d11ad2b-bc0b-4803-8bd7-bbf5b227318c 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:58 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses
Nov 28 10:02:58 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:02:58 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:02:58 np0005538513.localdomain podman[312239]: 2025-11-28 10:02:58.114718513 +0000 UTC m=+0.061068921 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:02:58 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:02:58.910 2 INFO neutron.agent.securitygroups_rpc [None req-15306174-a853-47d1-9333-4213f5fad357 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:59.035 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:58Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6649f70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6649850>], id=d566639b-ef8e-4d02-a570-43c5e19e05b4, ip_allocation=immediate, mac_address=fa:16:3e:84:93:df, name=tempest-AllowedAddressPairTestJSON-2119547774, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1179, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:58Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942
Nov 28 10:02:59 np0005538513.localdomain ceph-mon[292954]: pgmap v183: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 2.9 KiB/s wr, 60 op/s
Nov 28 10:02:59 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses
Nov 28 10:02:59 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:02:59 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:02:59 np0005538513.localdomain podman[312277]: 2025-11-28 10:02:59.246537094 +0000 UTC m=+0.059107115 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:02:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:02:59.499 261084 INFO neutron.agent.dhcp.agent [None req-6d8c023e-265a-4d6d-bb6b-9ffba3d27396 - - - - - -] DHCP configuration for ports {'d566639b-ef8e-4d02-a570-43c5e19e05b4'} is completed
Nov 28 10:02:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:00.161 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:00 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:00.206 2 INFO neutron.agent.securitygroups_rpc [None req-f1e38bd4-3201-4ca6-aca5-e6cf8d3e47ff 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:00Z|00167|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:00.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:00 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses
Nov 28 10:03:00 np0005538513.localdomain podman[312314]: 2025-11-28 10:03:00.550928359 +0000 UTC m=+0.068696470 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:03:00 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:00 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:00.975 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:01 np0005538513.localdomain ceph-mon[292954]: pgmap v184: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.8 KiB/s wr, 58 op/s
Nov 28 10:03:01 np0005538513.localdomain dnsmasq[310647]: exiting on receipt of SIGTERM
Nov 28 10:03:01 np0005538513.localdomain podman[312352]: 2025-11-28 10:03:01.443953958 +0000 UTC m=+0.062399390 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:03:01 np0005538513.localdomain systemd[1]: libpod-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd.scope: Deactivated successfully.
Nov 28 10:03:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:01.468 2 INFO neutron.agent.securitygroups_rpc [None req-06213f27-8bbf-4f60-8df9-0ce6274952ed 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:01 np0005538513.localdomain podman[312365]: 2025-11-28 10:03:01.507118967 +0000 UTC m=+0.047297716 container died 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:01 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:01.557 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:00Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ecc160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66fce20>], id=234a27f0-462c-462d-8b5e-a906ee88990b, ip_allocation=immediate, mac_address=fa:16:3e:4a:e2:fc, name=tempest-AllowedAddressPairTestJSON-632019140, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1184, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:03:01Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942
Nov 28 10:03:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7e03c2285c42fffa1cd27962b49feefea5575696a1d40702567a3737442c3ea1-merged.mount: Deactivated successfully.
Nov 28 10:03:01 np0005538513.localdomain podman[312365]: 2025-11-28 10:03:01.593479572 +0000 UTC m=+0.133658271 container cleanup 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:03:01 np0005538513.localdomain systemd[1]: libpod-conmon-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd.scope: Deactivated successfully.
Nov 28 10:03:01 np0005538513.localdomain podman[312366]: 2025-11-28 10:03:01.624145261 +0000 UTC m=+0.157534295 container remove 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:01 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:01.652 261084 INFO neutron.agent.dhcp.agent [None req-59d44550-eaf9-4c77-8e92-857e51298a03 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:01 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses
Nov 28 10:03:01 np0005538513.localdomain podman[312409]: 2025-11-28 10:03:01.779126961 +0000 UTC m=+0.059203617 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:03:01 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:01 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:02 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2db1696f4c\x2d80ce\x2d491f\x2dad1c\x2dcc7f5b6700ba.mount: Deactivated successfully.
Nov 28 10:03:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:02.754 261084 INFO neutron.agent.dhcp.agent [None req-4568dc87-1e38-4410-879d-5eb6d4a332bc - - - - - -] DHCP configuration for ports {'234a27f0-462c-462d-8b5e-a906ee88990b'} is completed
Nov 28 10:03:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:02.778 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:03 np0005538513.localdomain ceph-mon[292954]: pgmap v185: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:03.310 2 INFO neutron.agent.securitygroups_rpc [None req-58637b77-ae6c-405f-99c5-e20fa41f4923 f6a5516f43fb48ebaa16e4040dd82b84 cd7c5d213d924d3d9d4428db9d286082 - - default default] Security group member updated ['9f37640b-8d78-40f1-9b7c-3ec3fef04776']
Nov 28 10:03:04 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:04.504 2 INFO neutron.agent.securitygroups_rpc [None req-9701a6f5-02eb-46da-bd51-76f4153e4e2b db6b2950549b47a2a2693ecffb5083c4 cd9b97e6d04840f3a546b260a8ee9b24 - - default default] Security group member updated ['7c5e1d73-494f-47ff-9f16-a2cff6e79638']
Nov 28 10:03:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:05.197 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:05 np0005538513.localdomain ceph-mon[292954]: pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:06 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:06.147 2 INFO neutron.agent.securitygroups_rpc [None req-42b0499f-37f4-4061-a4df-d49e7a70a2c4 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:06.634 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:06 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses
Nov 28 10:03:06 np0005538513.localdomain podman[312444]: 2025-11-28 10:03:06.657109586 +0000 UTC m=+0.047601795 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:03:06 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:06 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:03:06 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:03:06 np0005538513.localdomain systemd[1]: tmp-crun.ex1TCr.mount: Deactivated successfully.
Nov 28 10:03:06 np0005538513.localdomain podman[312461]: 2025-11-28 10:03:06.782352345 +0000 UTC m=+0.095091436 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 28 10:03:06 np0005538513.localdomain podman[312461]: 2025-11-28 10:03:06.822197446 +0000 UTC m=+0.134936537 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 10:03:06 np0005538513.localdomain podman[312459]: 2025-11-28 10:03:06.832649186 +0000 UTC m=+0.146027825 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:03:06 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:03:06 np0005538513.localdomain podman[312459]: 2025-11-28 10:03:06.846535184 +0000 UTC m=+0.159913823 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:03:06 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:03:07 np0005538513.localdomain ceph-mon[292954]: pgmap v187: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:07 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:07.276 2 INFO neutron.agent.securitygroups_rpc [None req-374ec1da-a6ee-43ec-aeb4-2a3037224eb2 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:07 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:07.352 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:06Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66a2f70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66a2be0>], id=031c62e9-b76b-49bd-ad97-981393fcbd5a, ip_allocation=immediate, mac_address=fa:16:3e:91:ad:d2, name=tempest-AllowedAddressPairTestJSON-431195025, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1212, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:03:07Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942
Nov 28 10:03:07 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:07.567 2 INFO neutron.agent.securitygroups_rpc [None req-5f1d0dc7-c78c-4e13-8de3-56bbcc932539 db6b2950549b47a2a2693ecffb5083c4 cd9b97e6d04840f3a546b260a8ee9b24 - - default default] Security group member updated ['7c5e1d73-494f-47ff-9f16-a2cff6e79638']
Nov 28 10:03:07 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses
Nov 28 10:03:07 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:07 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:07 np0005538513.localdomain podman[312520]: 2025-11-28 10:03:07.581224846 +0000 UTC m=+0.063876521 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:03:07 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:07.840 261084 INFO neutron.agent.dhcp.agent [None req-af679853-d0d1-4a54-8ee3-c61a41514b87 - - - - - -] DHCP configuration for ports {'031c62e9-b76b-49bd-ad97-981393fcbd5a'} is completed
Nov 28 10:03:08 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:08.751 2 INFO neutron.agent.securitygroups_rpc [None req-be0492e2-ff74-4faa-8249-9d4640988efe f6a5516f43fb48ebaa16e4040dd82b84 cd7c5d213d924d3d9d4428db9d286082 - - default default] Security group member updated ['9f37640b-8d78-40f1-9b7c-3ec3fef04776']
Nov 28 10:03:08 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:08.963 2 INFO neutron.agent.securitygroups_rpc [None req-f639edd5-343d-4ae3-8fa2-2054bebb498d 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:09 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:09.020 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:08Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6653be0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66a2c40>], id=028280bb-2b70-45cf-b0a6-9521d60732bb, ip_allocation=immediate, mac_address=fa:16:3e:91:43:ca, name=tempest-AllowedAddressPairTestJSON-1463569347, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1229, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:03:08Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942
Nov 28 10:03:09 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 3 addresses
Nov 28 10:03:09 np0005538513.localdomain podman[312555]: 2025-11-28 10:03:09.250980971 +0000 UTC m=+0.061111331 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:09 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:09 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:09 np0005538513.localdomain ceph-mon[292954]: pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:09 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:09.487 261084 INFO neutron.agent.dhcp.agent [None req-26475625-a8b9-4c3f-9d34-d8584b8cb346 - - - - - -] DHCP configuration for ports {'028280bb-2b70-45cf-b0a6-9521d60732bb'} is completed
Nov 28 10:03:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:03:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:03:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:03:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:03:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:03:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19739 "" "Go-http-client/1.1"
Nov 28 10:03:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:10.229 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:11 np0005538513.localdomain ceph-mon[292954]: pgmap v189: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:11.431 2 INFO neutron.agent.securitygroups_rpc [None req-cc15aeb8-86ce-4ade-b16e-7c5f404511cd 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:11 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses
Nov 28 10:03:11 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:11 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:11 np0005538513.localdomain podman[312594]: 2025-11-28 10:03:11.684306366 +0000 UTC m=+0.059999630 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:03:12 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:12.069 2 INFO neutron.agent.securitygroups_rpc [None req-93eb68a4-7d7e-4f26-af38-fff447267025 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:12 np0005538513.localdomain podman[312631]: 2025-11-28 10:03:12.330941704 +0000 UTC m=+0.060001750 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:12 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses
Nov 28 10:03:12 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:12 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:12 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:12.503 2 INFO neutron.agent.securitygroups_rpc [None req-0547c360-35fd-496e-9dbb-6212e2de25bb 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:12 np0005538513.localdomain systemd[1]: tmp-crun.q2zIoU.mount: Deactivated successfully.
Nov 28 10:03:12 np0005538513.localdomain dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 0 addresses
Nov 28 10:03:12 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host
Nov 28 10:03:12 np0005538513.localdomain podman[312668]: 2025-11-28 10:03:12.776759269 +0000 UTC m=+0.072292743 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:03:12 np0005538513.localdomain dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts
Nov 28 10:03:13 np0005538513.localdomain ceph-mon[292954]: pgmap v190: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:13 np0005538513.localdomain dnsmasq[312077]: exiting on receipt of SIGTERM
Nov 28 10:03:13 np0005538513.localdomain podman[312705]: 2025-11-28 10:03:13.744198749 +0000 UTC m=+0.056517360 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:03:13 np0005538513.localdomain systemd[1]: libpod-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31.scope: Deactivated successfully.
Nov 28 10:03:13 np0005538513.localdomain podman[312718]: 2025-11-28 10:03:13.817925233 +0000 UTC m=+0.056576623 container died 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:03:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:13 np0005538513.localdomain podman[312718]: 2025-11-28 10:03:13.852814372 +0000 UTC m=+0.091465692 container cleanup 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 10:03:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b372ebb8673178fa80e5f637587aa00b946a13d3ac306690ce5e3c373a5ccdef-merged.mount: Deactivated successfully.
Nov 28 10:03:13 np0005538513.localdomain systemd[1]: libpod-conmon-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31.scope: Deactivated successfully.
Nov 28 10:03:13 np0005538513.localdomain podman[312719]: 2025-11-28 10:03:13.882312428 +0000 UTC m=+0.117352824 container remove 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:03:13 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:13Z|00168|binding|INFO|Releasing lport 8c96f24c-a809-498c-a368-b04d504c0694 from this chassis (sb_readonly=0)
Nov 28 10:03:13 np0005538513.localdomain kernel: device tap8c96f24c-a8 left promiscuous mode
Nov 28 10:03:13 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:13Z|00169|binding|INFO|Setting lport 8c96f24c-a809-498c-a368-b04d504c0694 down in Southbound
Nov 28 10:03:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:13.928 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:13.936 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cfa67078b8440d0bf985b2a5e0e5558', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3a1ad44-5623-4936-b8c1-f0d1c2dea95d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=8c96f24c-a809-498c-a368-b04d504c0694) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:13.938 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8c96f24c-a809-498c-a368-b04d504c0694 in datapath 2df2b9d7-92bb-4c3f-a2c7-b313541a7942 unbound from our chassis
Nov 28 10:03:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:13.940 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:03:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:13.941 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4b00684d-e2bc-45b8-bee6-a6fda47a3011]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:13 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:13.951 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:14.324 261084 INFO neutron.agent.dhcp.agent [None req-f79aedbe-2d65-46e2-a66f-a5f1d6e784b4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4000469581' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:03:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4000469581' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:03:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:14.430 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:14 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d2df2b9d7\x2d92bb\x2d4c3f\x2da2c7\x2db313541a7942.mount: Deactivated successfully.
Nov 28 10:03:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:15.260 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:15 np0005538513.localdomain ceph-mon[292954]: pgmap v191: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:15.541 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:16 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:16Z|00170|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:16.143 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:03:16 np0005538513.localdomain systemd[1]: tmp-crun.EOGvtA.mount: Deactivated successfully.
Nov 28 10:03:16 np0005538513.localdomain podman[312748]: 2025-11-28 10:03:16.854502374 +0000 UTC m=+0.091830513 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc.)
Nov 28 10:03:16 np0005538513.localdomain podman[312748]: 2025-11-28 10:03:16.866380474 +0000 UTC m=+0.103708583 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Nov 28 10:03:16 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:03:17 np0005538513.localdomain ceph-mon[292954]: pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:17.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:18 np0005538513.localdomain sudo[312770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:03:18 np0005538513.localdomain sudo[312770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:18 np0005538513.localdomain sudo[312770]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:03:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:03:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:03:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:03:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:03:18 np0005538513.localdomain sudo[312788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 10:03:18 np0005538513.localdomain sudo[312788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:03:18 np0005538513.localdomain sudo[312788]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:03:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:18 np0005538513.localdomain sudo[312826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:03:18 np0005538513.localdomain sudo[312826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:18 np0005538513.localdomain sudo[312826]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:18 np0005538513.localdomain sudo[312844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:03:18 np0005538513.localdomain sudo[312844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:19 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:19.005 261084 INFO neutron.agent.linux.ip_lib [None req-91921294-1053-4956-b053-0520fb8e8ee7 - - - - - -] Device tap341ca857-e3 cannot be used as it has no MAC address
Nov 28 10:03:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:19.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:19 np0005538513.localdomain kernel: device tap341ca857-e3 entered promiscuous mode
Nov 28 10:03:19 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324199.0412] manager: (tap341ca857-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Nov 28 10:03:19 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:19Z|00171|binding|INFO|Claiming lport 341ca857-e376-4066-bea3-5b6fff39b2b6 for this chassis.
Nov 28 10:03:19 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:19Z|00172|binding|INFO|341ca857-e376-4066-bea3-5b6fff39b2b6: Claiming unknown
Nov 28 10:03:19 np0005538513.localdomain systemd-udevd[312872]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:19.048 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:19 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:19Z|00173|binding|INFO|Setting lport 341ca857-e376-4066-bea3-5b6fff39b2b6 ovn-installed in OVS
Nov 28 10:03:19 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:19Z|00174|binding|INFO|Setting lport 341ca857-e376-4066-bea3-5b6fff39b2b6 up in Southbound
Nov 28 10:03:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:19.057 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=341ca857-e376-4066-bea3-5b6fff39b2b6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:19.059 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 341ca857-e376-4066-bea3-5b6fff39b2b6 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:19.061 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:19.060 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:19.062 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[94afafac-8fde-413b-9794-7db6872abb6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:19.093 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap341ca857-e3: No such device
Nov 28 10:03:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:19.136 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:19.167 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:19 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:19.300 2 INFO neutron.agent.securitygroups_rpc [None req-0bd438b8-b072-41d3-bddf-9588300a9670 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: pgmap v193: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538513.localdomain sudo[312844]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538513.localdomain sudo[312955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:03:19 np0005538513.localdomain sudo[312955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:19 np0005538513.localdomain sudo[312955]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:19 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:19Z|00175|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:19.989 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:20 np0005538513.localdomain podman[312993]: 
Nov 28 10:03:20 np0005538513.localdomain podman[312993]: 2025-11-28 10:03:20.099609479 +0000 UTC m=+0.092314637 container create dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:03:20 np0005538513.localdomain podman[312993]: 2025-11-28 10:03:20.056981397 +0000 UTC m=+0.049686585 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:20 np0005538513.localdomain systemd[1]: Started libpod-conmon-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3.scope.
Nov 28 10:03:20 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ba7633cf2477f6d6fa8cda99e361b32589ee760afefe1d9142670b2275abf0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:20 np0005538513.localdomain podman[312993]: 2025-11-28 10:03:20.190335948 +0000 UTC m=+0.183041106 container init dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:20 np0005538513.localdomain podman[312993]: 2025-11-28 10:03:20.202889178 +0000 UTC m=+0.195594336 container start dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:03:20 np0005538513.localdomain dnsmasq[313012]: started, version 2.85 cachesize 150
Nov 28 10:03:20 np0005538513.localdomain dnsmasq[313012]: DNS service limited to local subnets
Nov 28 10:03:20 np0005538513.localdomain dnsmasq[313012]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:20 np0005538513.localdomain dnsmasq[313012]: warning: no upstream servers configured
Nov 28 10:03:20 np0005538513.localdomain dnsmasq-dhcp[313012]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:03:20 np0005538513.localdomain dnsmasq[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:20 np0005538513.localdomain dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:20 np0005538513.localdomain dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:20.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:20.263 261084 INFO neutron.agent.dhcp.agent [None req-91921294-1053-4956-b053-0520fb8e8ee7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:18Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd669b5e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd669ba60>], id=335213db-753a-4eae-b67f-8acb9db5d4f0, ip_allocation=immediate, mac_address=fa:16:3e:2c:2b:04, name=tempest-NetworksTestDHCPv6-67632547, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['b9394be9-c4f7-4b2c-bf87-38897cbd99e1'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:16Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1289, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:18Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:20.445 261084 INFO neutron.agent.dhcp.agent [None req-2884d28d-33a4-43c7-8d2a-630c3fc49618 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:03:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:03:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:03:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:03:20 np0005538513.localdomain dnsmasq[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:20 np0005538513.localdomain dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:20 np0005538513.localdomain dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:20 np0005538513.localdomain podman[313031]: 2025-11-28 10:03:20.487989598 +0000 UTC m=+0.067005431 container kill dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:03:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:03:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:20.885 261084 INFO neutron.agent.dhcp.agent [None req-a452ed1e-c53a-443b-8aa1-76af381123c8 - - - - - -] DHCP configuration for ports {'335213db-753a-4eae-b67f-8acb9db5d4f0'} is completed
Nov 28 10:03:20 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:20.918 2 INFO neutron.agent.securitygroups_rpc [None req-cc447c81-1a1f-4f5d-aa14-abdbefdf4620 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:03:21 np0005538513.localdomain systemd[1]: tmp-crun.sIlLsp.mount: Deactivated successfully.
Nov 28 10:03:21 np0005538513.localdomain podman[313066]: 2025-11-28 10:03:21.108791615 +0000 UTC m=+0.095235830 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:03:21 np0005538513.localdomain dnsmasq[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:21 np0005538513.localdomain dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:21 np0005538513.localdomain dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:21 np0005538513.localdomain podman[313081]: 2025-11-28 10:03:21.1292177 +0000 UTC m=+0.072042344 container kill dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:21 np0005538513.localdomain podman[313066]: 2025-11-28 10:03:21.193554784 +0000 UTC m=+0.179999029 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:03:21 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:03:21 np0005538513.localdomain ceph-mon[292954]: pgmap v194: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:22 np0005538513.localdomain dnsmasq[313012]: exiting on receipt of SIGTERM
Nov 28 10:03:22 np0005538513.localdomain podman[313130]: 2025-11-28 10:03:22.096925199 +0000 UTC m=+0.063316185 container kill dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 10:03:22 np0005538513.localdomain systemd[1]: libpod-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3.scope: Deactivated successfully.
Nov 28 10:03:22 np0005538513.localdomain podman[313143]: 2025-11-28 10:03:22.173349539 +0000 UTC m=+0.061496293 container died dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:03:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-e0ba7633cf2477f6d6fa8cda99e361b32589ee760afefe1d9142670b2275abf0-merged.mount: Deactivated successfully.
Nov 28 10:03:22 np0005538513.localdomain podman[313143]: 2025-11-28 10:03:22.212558082 +0000 UTC m=+0.100704806 container cleanup dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:03:22 np0005538513.localdomain systemd[1]: libpod-conmon-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3.scope: Deactivated successfully.
Nov 28 10:03:22 np0005538513.localdomain podman[313145]: 2025-11-28 10:03:22.253042793 +0000 UTC m=+0.133088815 container remove dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:03:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:22Z|00176|binding|INFO|Releasing lport 341ca857-e376-4066-bea3-5b6fff39b2b6 from this chassis (sb_readonly=0)
Nov 28 10:03:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:22Z|00177|binding|INFO|Setting lport 341ca857-e376-4066-bea3-5b6fff39b2b6 down in Southbound
Nov 28 10:03:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:22.267 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:22 np0005538513.localdomain kernel: device tap341ca857-e3 left promiscuous mode
Nov 28 10:03:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:22.275 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=341ca857-e376-4066-bea3-5b6fff39b2b6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:22.276 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 341ca857-e376-4066-bea3-5b6fff39b2b6 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:22.278 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:22.279 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[cb36e133-3a03-4847-aa2b-87137c0cc891]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:22.285 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:22.564 261084 INFO neutron.agent.dhcp.agent [None req-aca12473-8b6c-4395-81c5-de30dda27e96 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:23 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:03:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:23.456 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538513.localdomain ceph-mon[292954]: pgmap v195: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:23 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:23.897 261084 INFO neutron.agent.linux.ip_lib [None req-c53502c5-ba38-467c-ac18-97b685a13bc8 - - - - - -] Device tapbfd23fad-b1 cannot be used as it has no MAC address
Nov 28 10:03:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:23.926 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538513.localdomain kernel: device tapbfd23fad-b1 entered promiscuous mode
Nov 28 10:03:23 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324203.9360] manager: (tapbfd23fad-b1): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Nov 28 10:03:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:23Z|00178|binding|INFO|Claiming lport bfd23fad-b10d-4e29-b498-7b05b354a75f for this chassis.
Nov 28 10:03:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:23Z|00179|binding|INFO|bfd23fad-b10d-4e29-b498-7b05b354a75f: Claiming unknown
Nov 28 10:03:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:23.936 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538513.localdomain systemd-udevd[313182]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:23 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:23.946 2 INFO neutron.agent.securitygroups_rpc [None req-8c468440-8245-4890-91bf-66327309dae3 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:23.947 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=bfd23fad-b10d-4e29-b498-7b05b354a75f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:23Z|00180|binding|INFO|Setting lport bfd23fad-b10d-4e29-b498-7b05b354a75f ovn-installed in OVS
Nov 28 10:03:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:23Z|00181|binding|INFO|Setting lport bfd23fad-b10d-4e29-b498-7b05b354a75f up in Southbound
Nov 28 10:03:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:23.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:23.950 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bfd23fad-b10d-4e29-b498-7b05b354a75f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:23.952 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:23.953 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[93ddd13a-2063-46ca-8119-fa9c6fa230ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:23.955 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:23.974 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:03:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:03:23 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:23 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:23 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device
Nov 28 10:03:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:24.030 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:24.068 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:24 np0005538513.localdomain systemd[1]: tmp-crun.Kjgbt3.mount: Deactivated successfully.
Nov 28 10:03:24 np0005538513.localdomain podman[313193]: 2025-11-28 10:03:24.158760069 +0000 UTC m=+0.157876854 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 28 10:03:24 np0005538513.localdomain podman[313190]: 2025-11-28 10:03:24.11377379 +0000 UTC m=+0.116274623 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 10:03:24 np0005538513.localdomain podman[313193]: 2025-11-28 10:03:24.192518417 +0000 UTC m=+0.191635162 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:24 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:03:24 np0005538513.localdomain podman[313190]: 2025-11-28 10:03:24.245596538 +0000 UTC m=+0.248097401 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:24 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:03:24 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:24.620 2 INFO neutron.agent.securitygroups_rpc [None req-6563d2b7-ae08-45e8-8b76-40044d8bfa2e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:24.920 261084 INFO neutron.agent.linux.ip_lib [None req-67323245-aba1-4149-8266-c1f3b685ea24 - - - - - -] Device tap79491b70-fe cannot be used as it has no MAC address
Nov 28 10:03:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:24.989 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain kernel: device tap79491b70-fe entered promiscuous mode
Nov 28 10:03:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:24.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:24 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324204.9994] manager: (tap79491b70-fe): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Nov 28 10:03:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:25Z|00182|binding|INFO|Claiming lport 79491b70-fe82-4673-a612-1252578cdd84 for this chassis.
Nov 28 10:03:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:25Z|00183|binding|INFO|79491b70-fe82-4673-a612-1252578cdd84: Claiming unknown
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.013 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db21ade0-fc80-4871-bcd6-f4301708978d, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=79491b70-fe82-4673-a612-1252578cdd84) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.016 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 79491b70-fe82-4673-a612-1252578cdd84 in datapath 54d19915-3dc0-4577-b573-72119a0c141d bound to our chassis
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.018 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 54d19915-3dc0-4577-b573-72119a0c141d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.021 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[abb44dec-798d-4342-8f08-fa81ea2e1949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:25Z|00184|binding|INFO|Setting lport 79491b70-fe82-4673-a612-1252578cdd84 ovn-installed in OVS
Nov 28 10:03:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:25Z|00185|binding|INFO|Setting lport 79491b70-fe82-4673-a612-1252578cdd84 up in Southbound
Nov 28 10:03:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:25.026 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:25.051 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain podman[313300]: 
Nov 28 10:03:25 np0005538513.localdomain podman[313300]: 2025-11-28 10:03:25.084117054 +0000 UTC m=+0.142873295 container create f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:03:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:25.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain systemd[1]: Started libpod-conmon-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507.scope.
Nov 28 10:03:25 np0005538513.localdomain podman[313300]: 2025-11-28 10:03:25.050800929 +0000 UTC m=+0.109557240 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:25 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:25 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be1b76aa3a048e62f5201b64518592855d4798bf23964acf53818c2f01e55cdb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:25.171 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain podman[313300]: 2025-11-28 10:03:25.176267655 +0000 UTC m=+0.235023926 container init f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:03:25 np0005538513.localdomain systemd[1]: tmp-crun.95Cw9m.mount: Deactivated successfully.
Nov 28 10:03:25 np0005538513.localdomain podman[313300]: 2025-11-28 10:03:25.188375001 +0000 UTC m=+0.247131272 container start f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:25 np0005538513.localdomain dnsmasq[313328]: started, version 2.85 cachesize 150
Nov 28 10:03:25 np0005538513.localdomain dnsmasq[313328]: DNS service limited to local subnets
Nov 28 10:03:25 np0005538513.localdomain dnsmasq[313328]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:25 np0005538513.localdomain dnsmasq[313328]: warning: no upstream servers configured
Nov 28 10:03:25 np0005538513.localdomain dnsmasq[313328]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:25.267 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:25.396 261084 INFO neutron.agent.dhcp.agent [None req-d85c7343-5333-4b96-8f67-8f95e25fdc86 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:03:25 np0005538513.localdomain ceph-mon[292954]: pgmap v196: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:25 np0005538513.localdomain dnsmasq[313328]: exiting on receipt of SIGTERM
Nov 28 10:03:25 np0005538513.localdomain podman[313359]: 2025-11-28 10:03:25.572969312 +0000 UTC m=+0.065373115 container kill f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:03:25 np0005538513.localdomain systemd[1]: libpod-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507.scope: Deactivated successfully.
Nov 28 10:03:25 np0005538513.localdomain podman[313376]: 2025-11-28 10:03:25.651640826 +0000 UTC m=+0.066200818 container died f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:03:25 np0005538513.localdomain podman[313376]: 2025-11-28 10:03:25.687001659 +0000 UTC m=+0.101561611 container cleanup f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:03:25 np0005538513.localdomain systemd[1]: libpod-conmon-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507.scope: Deactivated successfully.
Nov 28 10:03:25 np0005538513.localdomain podman[313381]: 2025-11-28 10:03:25.711859981 +0000 UTC m=+0.113268966 container remove f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:25Z|00186|binding|INFO|Releasing lport bfd23fad-b10d-4e29-b498-7b05b354a75f from this chassis (sb_readonly=0)
Nov 28 10:03:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:25Z|00187|binding|INFO|Setting lport bfd23fad-b10d-4e29-b498-7b05b354a75f down in Southbound
Nov 28 10:03:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:25.727 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain kernel: device tapbfd23fad-b1 left promiscuous mode
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.737 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=bfd23fad-b10d-4e29-b498-7b05b354a75f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.739 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bfd23fad-b10d-4e29-b498-7b05b354a75f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.741 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:25.747 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[173e3637-49b9-43f2-affe-e7412bd540cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:25.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 do_prune osdmap full prune enabled
Nov 28 10:03:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e127 e127: 6 total, 6 up, 6 in
Nov 28 10:03:26 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e127: 6 total, 6 up, 6 in
Nov 28 10:03:26 np0005538513.localdomain podman[313432]: 
Nov 28 10:03:26 np0005538513.localdomain podman[313432]: 2025-11-28 10:03:26.162960667 +0000 UTC m=+0.098394090 container create 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:03:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-be1b76aa3a048e62f5201b64518592855d4798bf23964acf53818c2f01e55cdb-merged.mount: Deactivated successfully.
Nov 28 10:03:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:26 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:03:26 np0005538513.localdomain systemd[1]: Started libpod-conmon-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5.scope.
Nov 28 10:03:26 np0005538513.localdomain podman[313432]: 2025-11-28 10:03:26.119634546 +0000 UTC m=+0.055067999 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:26 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:03:26 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c2133ceac09aa4671885348478e6b1807140f8a826fd1cc3877c75256248e37/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:26 np0005538513.localdomain podman[313432]: 2025-11-28 10:03:26.257761344 +0000 UTC m=+0.193194777 container init 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:03:26 np0005538513.localdomain podman[313432]: 2025-11-28 10:03:26.26772324 +0000 UTC m=+0.203156673 container start 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:03:26 np0005538513.localdomain dnsmasq[313457]: started, version 2.85 cachesize 150
Nov 28 10:03:26 np0005538513.localdomain dnsmasq[313457]: DNS service limited to local subnets
Nov 28 10:03:26 np0005538513.localdomain dnsmasq[313457]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:26 np0005538513.localdomain dnsmasq[313457]: warning: no upstream servers configured
Nov 28 10:03:26 np0005538513.localdomain dnsmasq-dhcp[313457]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:03:26 np0005538513.localdomain dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 0 addresses
Nov 28 10:03:26 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host
Nov 28 10:03:26 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts
Nov 28 10:03:26 np0005538513.localdomain podman[313450]: 2025-11-28 10:03:26.3497456 +0000 UTC m=+0.092380839 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:03:26 np0005538513.localdomain podman[313450]: 2025-11-28 10:03:26.368507687 +0000 UTC m=+0.111142896 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 10:03:26 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:03:26 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:26.450 261084 INFO neutron.agent.dhcp.agent [None req-9d58455f-b9e4-4772-8ea1-75e0fc7bbaf4 - - - - - -] DHCP configuration for ports {'4bca3778-f7fe-4f52-a319-4ddc2deb73f1'} is completed
Nov 28 10:03:26 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:26.719 261084 INFO neutron.agent.linux.ip_lib [None req-7a7440a7-77a7-4562-ac0b-ec783591dc65 - - - - - -] Device tap80c11714-73 cannot be used as it has no MAC address
Nov 28 10:03:26 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:26.725 2 INFO neutron.agent.securitygroups_rpc [None req-a6c40294-bcff-4fbb-89ad-bea0e8a1937c 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:26.751 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:26 np0005538513.localdomain kernel: device tap80c11714-73 entered promiscuous mode
Nov 28 10:03:26 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324206.7603] manager: (tap80c11714-73): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Nov 28 10:03:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:26.760 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:26Z|00188|binding|INFO|Claiming lport 80c11714-7320-4ace-8aff-6149fd7ecd71 for this chassis.
Nov 28 10:03:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:26Z|00189|binding|INFO|80c11714-7320-4ace-8aff-6149fd7ecd71: Claiming unknown
Nov 28 10:03:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:26Z|00190|binding|INFO|Setting lport 80c11714-7320-4ace-8aff-6149fd7ecd71 ovn-installed in OVS
Nov 28 10:03:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:26Z|00191|binding|INFO|Setting lport 80c11714-7320-4ace-8aff-6149fd7ecd71 up in Southbound
Nov 28 10:03:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:26.774 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:26.776 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=80c11714-7320-4ace-8aff-6149fd7ecd71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:26.778 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 80c11714-7320-4ace-8aff-6149fd7ecd71 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:26.780 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:26.782 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:26.781 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6e34120a-c3ce-4acb-9672-84a308b3f888]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:26.805 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:26.852 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:26.892 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538513.localdomain ceph-mon[292954]: pgmap v197: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:27 np0005538513.localdomain ceph-mon[292954]: osdmap e127: 6 total, 6 up, 6 in
Nov 28 10:03:27 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:27.492 2 INFO neutron.agent.securitygroups_rpc [None req-9bac993d-08a2-4a7a-9741-5f6e8a523396 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:27 np0005538513.localdomain podman[313537]: 
Nov 28 10:03:27 np0005538513.localdomain podman[313537]: 2025-11-28 10:03:27.758487536 +0000 UTC m=+0.086678735 container create c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:27.774 261084 INFO neutron.agent.linux.ip_lib [None req-d55d30d6-980a-4c20-8ecb-8a8df164c5ef - - - - - -] Device tap3e54b4c6-84 cannot be used as it has no MAC address
Nov 28 10:03:27 np0005538513.localdomain podman[313537]: 2025-11-28 10:03:27.720978381 +0000 UTC m=+0.049169580 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:27 np0005538513.localdomain systemd[1]: Started libpod-conmon-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069.scope.
Nov 28 10:03:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:27.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538513.localdomain kernel: device tap3e54b4c6-84 entered promiscuous mode
Nov 28 10:03:27 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324207.8577] manager: (tap3e54b4c6-84): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Nov 28 10:03:27 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:27Z|00192|binding|INFO|Claiming lport 3e54b4c6-8462-4da0-9951-b922d57575cf for this chassis.
Nov 28 10:03:27 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:27Z|00193|binding|INFO|3e54b4c6-8462-4da0-9951-b922d57575cf: Claiming unknown
Nov 28 10:03:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:27.858 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:27.862 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:27Z|00194|binding|INFO|Setting lport 3e54b4c6-8462-4da0-9951-b922d57575cf ovn-installed in OVS
Nov 28 10:03:27 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:27Z|00195|binding|INFO|Setting lport 3e54b4c6-8462-4da0-9951-b922d57575cf up in Southbound
Nov 28 10:03:27 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:27.880 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34d43fdc-65ac-42a6-8e26-177d541f3791, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=3e54b4c6-8462-4da0-9951-b922d57575cf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:27 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:27.882 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3e54b4c6-8462-4da0-9951-b922d57575cf in datapath 3f9a6f97-9109-45cc-b3d8-12edbd83a346 bound to our chassis
Nov 28 10:03:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:27.881 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:27.884 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f9a6f97-9109-45cc-b3d8-12edbd83a346 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:27 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:27.885 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[04909683-1f8b-4a44-b351-9b1b7bd3eb52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:27 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:27.892 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/615202a384b5901bf010b31341d8cceb3dc86a8f244a7a54a09f31e43589443f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:27 np0005538513.localdomain podman[313537]: 2025-11-28 10:03:27.905267471 +0000 UTC m=+0.233458670 container init c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:27 np0005538513.localdomain podman[313537]: 2025-11-28 10:03:27.919810499 +0000 UTC m=+0.248001698 container start c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:27 np0005538513.localdomain dnsmasq[313569]: started, version 2.85 cachesize 150
Nov 28 10:03:27 np0005538513.localdomain dnsmasq[313569]: DNS service limited to local subnets
Nov 28 10:03:27 np0005538513.localdomain dnsmasq[313569]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:27 np0005538513.localdomain dnsmasq[313569]: warning: no upstream servers configured
Nov 28 10:03:27 np0005538513.localdomain dnsmasq-dhcp[313569]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:03:27 np0005538513.localdomain dnsmasq[313569]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:27 np0005538513.localdomain dnsmasq-dhcp[313569]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:27 np0005538513.localdomain dnsmasq-dhcp[313569]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:27.933 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:27.964 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e127 do_prune osdmap full prune enabled
Nov 28 10:03:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e128 e128: 6 total, 6 up, 6 in
Nov 28 10:03:28 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e128: 6 total, 6 up, 6 in
Nov 28 10:03:28 np0005538513.localdomain systemd[1]: tmp-crun.6CQfIu.mount: Deactivated successfully.
Nov 28 10:03:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:28.381 261084 INFO neutron.agent.dhcp.agent [None req-dcc1db14-efa7-425a-8704-492431afa33a - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:03:28 np0005538513.localdomain dnsmasq[313569]: exiting on receipt of SIGTERM
Nov 28 10:03:28 np0005538513.localdomain podman[313609]: 2025-11-28 10:03:28.567466817 +0000 UTC m=+0.061244487 container kill c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 10:03:28 np0005538513.localdomain systemd[1]: libpod-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069.scope: Deactivated successfully.
Nov 28 10:03:28 np0005538513.localdomain podman[313626]: 2025-11-28 10:03:28.641997461 +0000 UTC m=+0.049257641 container died c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-615202a384b5901bf010b31341d8cceb3dc86a8f244a7a54a09f31e43589443f-merged.mount: Deactivated successfully.
Nov 28 10:03:28 np0005538513.localdomain podman[313626]: 2025-11-28 10:03:28.693027874 +0000 UTC m=+0.100287944 container remove c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:03:28 np0005538513.localdomain systemd[1]: libpod-conmon-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069.scope: Deactivated successfully.
Nov 28 10:03:28 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:28Z|00196|binding|INFO|Releasing lport 80c11714-7320-4ace-8aff-6149fd7ecd71 from this chassis (sb_readonly=0)
Nov 28 10:03:28 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:28Z|00197|binding|INFO|Setting lport 80c11714-7320-4ace-8aff-6149fd7ecd71 down in Southbound
Nov 28 10:03:28 np0005538513.localdomain kernel: device tap80c11714-73 left promiscuous mode
Nov 28 10:03:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:28.714 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:28 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:28.720 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=80c11714-7320-4ace-8aff-6149fd7ecd71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:28 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:28.722 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 80c11714-7320-4ace-8aff-6149fd7ecd71 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:28 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:28.723 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:28 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:28.724 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c8923e72-e116-493f-bede-b461113aa643]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:28.735 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:29 np0005538513.localdomain podman[313673]: 
Nov 28 10:03:29 np0005538513.localdomain podman[313673]: 2025-11-28 10:03:29.013728383 +0000 UTC m=+0.087311472 container create 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:03:29 np0005538513.localdomain ceph-mon[292954]: pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 1.6 KiB/s wr, 18 op/s
Nov 28 10:03:29 np0005538513.localdomain ceph-mon[292954]: osdmap e128: 6 total, 6 up, 6 in
Nov 28 10:03:29 np0005538513.localdomain systemd[1]: Started libpod-conmon-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a.scope.
Nov 28 10:03:29 np0005538513.localdomain podman[313673]: 2025-11-28 10:03:28.974401517 +0000 UTC m=+0.047984606 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:29 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:29 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6f48a9173b9ce2ea6c707fe91dbf14c53a35e5091245aaa244be183a6c7890a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:29 np0005538513.localdomain podman[313673]: 2025-11-28 10:03:29.098668257 +0000 UTC m=+0.172251376 container init 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:03:29 np0005538513.localdomain podman[313673]: 2025-11-28 10:03:29.108053896 +0000 UTC m=+0.181636995 container start 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.109 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.111 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.112 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.112 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:29 np0005538513.localdomain dnsmasq[313691]: started, version 2.85 cachesize 150
Nov 28 10:03:29 np0005538513.localdomain dnsmasq[313691]: DNS service limited to local subnets
Nov 28 10:03:29 np0005538513.localdomain dnsmasq[313691]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:29 np0005538513.localdomain dnsmasq[313691]: warning: no upstream servers configured
Nov 28 10:03:29 np0005538513.localdomain dnsmasq-dhcp[313691]: DHCP, static leases only on 10.101.0.0, lease time 1d
Nov 28 10:03:29 np0005538513.localdomain dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 0 addresses
Nov 28 10:03:29 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host
Nov 28 10:03:29 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts
Nov 28 10:03:29 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:03:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.257 261084 INFO neutron.agent.dhcp.agent [None req-d7a37b2c-9460-4ffb-85ac-c14e00cc5088 - - - - - -] DHCP configuration for ports {'5d63853b-9b6d-4df6-a058-ea8bdb95fd89'} is completed
Nov 28 10:03:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.861 261084 INFO neutron.agent.linux.ip_lib [None req-ed4c1b1d-7cca-4932-a6ac-61674616df16 - - - - - -] Device tapb9229365-e6 cannot be used as it has no MAC address
Nov 28 10:03:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:29.885 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:29 np0005538513.localdomain kernel: device tapb9229365-e6 entered promiscuous mode
Nov 28 10:03:29 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324209.8924] manager: (tapb9229365-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Nov 28 10:03:29 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:29Z|00198|binding|INFO|Claiming lport b9229365-e63f-47af-83a5-e34c4eab9b13 for this chassis.
Nov 28 10:03:29 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:29Z|00199|binding|INFO|b9229365-e63f-47af-83a5-e34c4eab9b13: Claiming unknown
Nov 28 10:03:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:29.894 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:29 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:29Z|00200|binding|INFO|Setting lport b9229365-e63f-47af-83a5-e34c4eab9b13 up in Southbound
Nov 28 10:03:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:29.905 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=b9229365-e63f-47af-83a5-e34c4eab9b13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:29.905 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:29.907 158130 INFO neutron.agent.ovn.metadata.agent [-] Port b9229365-e63f-47af-83a5-e34c4eab9b13 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:29.909 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:29 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:29Z|00201|binding|INFO|Setting lport b9229365-e63f-47af-83a5-e34c4eab9b13 ovn-installed in OVS
Nov 28 10:03:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:29.910 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9195d5-62a9-4e74-b1d9-07cf8f4b8fb6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:29.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:29.942 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:29 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:29.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:30.007 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e128 do_prune osdmap full prune enabled
Nov 28 10:03:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e129 e129: 6 total, 6 up, 6 in
Nov 28 10:03:30 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e129: 6 total, 6 up, 6 in
Nov 28 10:03:30 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:30.073 2 INFO neutron.agent.securitygroups_rpc [None req-a538bd0d-c0aa-4d14-8c4b-26de5d170843 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:30.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:30 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:30.468 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:29Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65c1610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65c1940>], id=6f30b16c-3d15-4d00-b3e9-56746d8a041a, ip_allocation=immediate, mac_address=fa:16:3e:7e:00:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:21Z, description=, dns_domain=, id=54d19915-3dc0-4577-b573-72119a0c141d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-437254213, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2845, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1311, status=ACTIVE, subnets=['7300da77-eea5-408a-91a4-c84afe3031ce'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:23Z, vlan_transparent=None, network_id=54d19915-3dc0-4577-b573-72119a0c141d, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1361, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:30Z on network 54d19915-3dc0-4577-b573-72119a0c141d
Nov 28 10:03:30 np0005538513.localdomain podman[313750]: 2025-11-28 10:03:30.695709668 +0000 UTC m=+0.065508738 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:30 np0005538513.localdomain dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 1 addresses
Nov 28 10:03:30 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host
Nov 28 10:03:30 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts
Nov 28 10:03:30 np0005538513.localdomain podman[313793]: 
Nov 28 10:03:31 np0005538513.localdomain podman[313793]: 2025-11-28 10:03:31.00396308 +0000 UTC m=+0.105735180 container create 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:03:31 np0005538513.localdomain podman[313793]: 2025-11-28 10:03:30.956161481 +0000 UTC m=+0.057933621 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:31 np0005538513.localdomain ceph-mon[292954]: pgmap v201: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.0 KiB/s wr, 23 op/s
Nov 28 10:03:31 np0005538513.localdomain ceph-mon[292954]: osdmap e129: 6 total, 6 up, 6 in
Nov 28 10:03:31 np0005538513.localdomain systemd[1]: Started libpod-conmon-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae.scope.
Nov 28 10:03:31 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1a668778bcfd1edf34acdc7e1173cee88f0564f8beedbbd0a0b8935a85fd2a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:31 np0005538513.localdomain podman[313793]: 2025-11-28 10:03:31.18757677 +0000 UTC m=+0.289348880 container init 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:03:31 np0005538513.localdomain podman[313793]: 2025-11-28 10:03:31.193796999 +0000 UTC m=+0.295569109 container start 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:31 np0005538513.localdomain dnsmasq[313811]: started, version 2.85 cachesize 150
Nov 28 10:03:31 np0005538513.localdomain dnsmasq[313811]: DNS service limited to local subnets
Nov 28 10:03:31 np0005538513.localdomain dnsmasq[313811]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:31 np0005538513.localdomain dnsmasq[313811]: warning: no upstream servers configured
Nov 28 10:03:31 np0005538513.localdomain dnsmasq-dhcp[313811]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:03:31 np0005538513.localdomain dnsmasq[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:31 np0005538513.localdomain dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:31 np0005538513.localdomain dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:31.234 261084 INFO neutron.agent.dhcp.agent [None req-ed4c1b1d-7cca-4932-a6ac-61674616df16 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65d4730>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65d4b50>], id=eca23945-2aa5-4e91-a9eb-84685313cbf1, ip_allocation=immediate, mac_address=fa:16:3e:8d:8a:22, name=tempest-NetworksTestDHCPv6-1625906628, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['c6e666cb-138e-4fa7-b852-373dd89d6438'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:28Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1360, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:29Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:31.371 261084 INFO neutron.agent.dhcp.agent [None req-4519bccb-0d8b-4f79-94fa-1768aeafc3f9 - - - - - -] DHCP configuration for ports {'6f30b16c-3d15-4d00-b3e9-56746d8a041a'} is completed
Nov 28 10:03:31 np0005538513.localdomain dnsmasq[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:31 np0005538513.localdomain dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:31 np0005538513.localdomain dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:31 np0005538513.localdomain podman[313828]: 2025-11-28 10:03:31.456440696 +0000 UTC m=+0.069018219 container kill 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:31.526 261084 INFO neutron.agent.dhcp.agent [None req-03f01c79-972b-4a85-9494-52a9f9b8edd6 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:03:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:31.762 261084 INFO neutron.agent.dhcp.agent [None req-b059a3c1-4b71-4dcc-8bdd-38a79446d36f - - - - - -] DHCP configuration for ports {'eca23945-2aa5-4e91-a9eb-84685313cbf1'} is completed
Nov 28 10:03:31 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:31Z|00202|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:31.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e129 do_prune osdmap full prune enabled
Nov 28 10:03:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e130 e130: 6 total, 6 up, 6 in
Nov 28 10:03:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e130: 6 total, 6 up, 6 in
Nov 28 10:03:32 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:32.349 2 INFO neutron.agent.securitygroups_rpc [None req-23b7a4db-87e9-4c7d-8b9d-380815f2adcd 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:32.364 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:29Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b8df0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b8be0>], id=6f30b16c-3d15-4d00-b3e9-56746d8a041a, ip_allocation=immediate, mac_address=fa:16:3e:7e:00:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:21Z, description=, dns_domain=, id=54d19915-3dc0-4577-b573-72119a0c141d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-437254213, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2845, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1311, status=ACTIVE, subnets=['7300da77-eea5-408a-91a4-c84afe3031ce'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:23Z, vlan_transparent=None, network_id=54d19915-3dc0-4577-b573-72119a0c141d, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1361, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:30Z on network 54d19915-3dc0-4577-b573-72119a0c141d
Nov 28 10:03:32 np0005538513.localdomain systemd[1]: tmp-crun.JhQVSQ.mount: Deactivated successfully.
Nov 28 10:03:32 np0005538513.localdomain dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 1 addresses
Nov 28 10:03:32 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host
Nov 28 10:03:32 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts
Nov 28 10:03:32 np0005538513.localdomain podman[313881]: 2025-11-28 10:03:32.61696357 +0000 UTC m=+0.078970494 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:32 np0005538513.localdomain dnsmasq[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:32 np0005538513.localdomain dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:32 np0005538513.localdomain dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:32 np0005538513.localdomain podman[313888]: 2025-11-28 10:03:32.677806913 +0000 UTC m=+0.122397568 container kill 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:32 np0005538513.localdomain systemd[1]: tmp-crun.OUcszA.mount: Deactivated successfully.
Nov 28 10:03:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:32.933 261084 INFO neutron.agent.dhcp.agent [None req-ff8a999b-bb21-441c-af5b-99c50650f3ee - - - - - -] DHCP configuration for ports {'6f30b16c-3d15-4d00-b3e9-56746d8a041a'} is completed
Nov 28 10:03:33 np0005538513.localdomain ceph-mon[292954]: pgmap v203: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 8.7 KiB/s wr, 132 op/s
Nov 28 10:03:33 np0005538513.localdomain ceph-mon[292954]: osdmap e130: 6 total, 6 up, 6 in
Nov 28 10:03:33 np0005538513.localdomain dnsmasq[313811]: exiting on receipt of SIGTERM
Nov 28 10:03:33 np0005538513.localdomain podman[313939]: 2025-11-28 10:03:33.336349133 +0000 UTC m=+0.064718566 container kill 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:33 np0005538513.localdomain systemd[1]: libpod-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae.scope: Deactivated successfully.
Nov 28 10:03:33 np0005538513.localdomain podman[313951]: 2025-11-28 10:03:33.406362969 +0000 UTC m=+0.057498769 container died 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:03:33 np0005538513.localdomain podman[313951]: 2025-11-28 10:03:33.449918927 +0000 UTC m=+0.101054717 container cleanup 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:33 np0005538513.localdomain systemd[1]: libpod-conmon-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae.scope: Deactivated successfully.
Nov 28 10:03:33 np0005538513.localdomain podman[313958]: 2025-11-28 10:03:33.500913168 +0000 UTC m=+0.137511071 container remove 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:33 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:33Z|00203|binding|INFO|Releasing lport b9229365-e63f-47af-83a5-e34c4eab9b13 from this chassis (sb_readonly=0)
Nov 28 10:03:33 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:33Z|00204|binding|INFO|Setting lport b9229365-e63f-47af-83a5-e34c4eab9b13 down in Southbound
Nov 28 10:03:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:33.547 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:33 np0005538513.localdomain kernel: device tapb9229365-e6 left promiscuous mode
Nov 28 10:03:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:33.555 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=b9229365-e63f-47af-83a5-e34c4eab9b13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:33.557 158130 INFO neutron.agent.ovn.metadata.agent [-] Port b9229365-e63f-47af-83a5-e34c4eab9b13 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:33.558 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:33.559 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc8c26c-16ab-41eb-91a3-44fc82d3752e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:33.575 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b1a668778bcfd1edf34acdc7e1173cee88f0564f8beedbbd0a0b8935a85fd2a6-merged.mount: Deactivated successfully.
Nov 28 10:03:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:33.769 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:33Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b0850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b0df0>], id=037df326-df58-4d28-8dbc-25f9c11ef071, ip_allocation=immediate, mac_address=fa:16:3e:44:9d:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:22Z, description=, dns_domain=, id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1083733682, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61995, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1315, status=ACTIVE, subnets=['fc4d69a4-0dba-4d3f-a03f-18d4101010b3'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:26Z, vlan_transparent=None, network_id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1375, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:33Z on network 3f9a6f97-9109-45cc-b3d8-12edbd83a346
Nov 28 10:03:33 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:03:33 np0005538513.localdomain podman[313999]: 2025-11-28 10:03:33.988767827 +0000 UTC m=+0.064397096 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:03:33 np0005538513.localdomain dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 1 addresses
Nov 28 10:03:33 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host
Nov 28 10:03:33 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts
Nov 28 10:03:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:34.504 261084 INFO neutron.agent.dhcp.agent [None req-243ada98-2bf5-439a-a2f3-e76a20daaaa2 - - - - - -] DHCP configuration for ports {'037df326-df58-4d28-8dbc-25f9c11ef071'} is completed
Nov 28 10:03:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:34.815 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:33Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd662f100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd662fee0>], id=037df326-df58-4d28-8dbc-25f9c11ef071, ip_allocation=immediate, mac_address=fa:16:3e:44:9d:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:22Z, description=, dns_domain=, id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1083733682, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61995, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1315, status=ACTIVE, subnets=['fc4d69a4-0dba-4d3f-a03f-18d4101010b3'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:26Z, vlan_transparent=None, network_id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1375, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:33Z on network 3f9a6f97-9109-45cc-b3d8-12edbd83a346
Nov 28 10:03:34 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:34.880 2 INFO neutron.agent.securitygroups_rpc [None req-a1e00e91-b063-4693-9d8b-b7a005d16694 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e130 do_prune osdmap full prune enabled
Nov 28 10:03:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e131 e131: 6 total, 6 up, 6 in
Nov 28 10:03:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e131: 6 total, 6 up, 6 in
Nov 28 10:03:35 np0005538513.localdomain dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 1 addresses
Nov 28 10:03:35 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host
Nov 28 10:03:35 np0005538513.localdomain podman[314040]: 2025-11-28 10:03:35.038522176 +0000 UTC m=+0.058236580 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:03:35 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts
Nov 28 10:03:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:35.062 261084 INFO neutron.agent.linux.ip_lib [None req-ebecf48a-9039-4715-a56f-c608e77dba56 - - - - - -] Device tapd925f0b8-f1 cannot be used as it has no MAC address
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.144 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain kernel: device tapd925f0b8-f1 entered promiscuous mode
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.149 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324215.1535] manager: (tapd925f0b8-f1): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Nov 28 10:03:35 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:35Z|00205|binding|INFO|Claiming lport d925f0b8-f14d-42ee-9e29-a163823c50b3 for this chassis.
Nov 28 10:03:35 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:35Z|00206|binding|INFO|d925f0b8-f14d-42ee-9e29-a163823c50b3: Claiming unknown
Nov 28 10:03:35 np0005538513.localdomain systemd-udevd[314063]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.158 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:35Z|00207|binding|INFO|Setting lport d925f0b8-f14d-42ee-9e29-a163823c50b3 ovn-installed in OVS
Nov 28 10:03:35 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:35Z|00208|binding|INFO|Setting lport d925f0b8-f14d-42ee-9e29-a163823c50b3 up in Southbound
Nov 28 10:03:35 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:35.168 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d925f0b8-f14d-42ee-9e29-a163823c50b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:35 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:35.172 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d925f0b8-f14d-42ee-9e29-a163823c50b3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:35 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:35.174 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:35 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:35.175 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0e22c034-3058-4fd9-a75c-9b7d4ccb602f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.188 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain ceph-mon[292954]: pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 6.0 KiB/s wr, 101 op/s
Nov 28 10:03:35 np0005538513.localdomain ceph-mon[292954]: osdmap e131: 6 total, 6 up, 6 in
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.247 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.283 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.298 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:35.356 261084 INFO neutron.agent.dhcp.agent [None req-768664a4-c7a9-45d3-a856-6fd2cac9670f - - - - - -] DHCP configuration for ports {'037df326-df58-4d28-8dbc-25f9c11ef071'} is completed
Nov 28 10:03:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:35.693 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:35.850 2 INFO neutron.agent.securitygroups_rpc [None req-57c1bb59-2e0b-4157-baad-e850337ecf12 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e131 do_prune osdmap full prune enabled
Nov 28 10:03:36 np0005538513.localdomain podman[314139]: 
Nov 28 10:03:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e132 e132: 6 total, 6 up, 6 in
Nov 28 10:03:36 np0005538513.localdomain podman[314139]: 2025-11-28 10:03:36.217656603 +0000 UTC m=+0.092817231 container create 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:03:36 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e132: 6 total, 6 up, 6 in
Nov 28 10:03:36 np0005538513.localdomain systemd[1]: Started libpod-conmon-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1.scope.
Nov 28 10:03:36 np0005538513.localdomain podman[314139]: 2025-11-28 10:03:36.174742493 +0000 UTC m=+0.049903191 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:36 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f97123adbd9c48a0f6f44f905a033a1731be1c1882de0ddc59c8b81ddaf0fc55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:36 np0005538513.localdomain podman[314139]: 2025-11-28 10:03:36.320213791 +0000 UTC m=+0.195374389 container init 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:03:36 np0005538513.localdomain podman[314139]: 2025-11-28 10:03:36.328248752 +0000 UTC m=+0.203409350 container start 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:03:36 np0005538513.localdomain dnsmasq[314158]: started, version 2.85 cachesize 150
Nov 28 10:03:36 np0005538513.localdomain dnsmasq[314158]: DNS service limited to local subnets
Nov 28 10:03:36 np0005538513.localdomain dnsmasq[314158]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:36 np0005538513.localdomain dnsmasq[314158]: warning: no upstream servers configured
Nov 28 10:03:36 np0005538513.localdomain dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:36 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:36.373 261084 INFO neutron.agent.dhcp.agent [None req-ebecf48a-9039-4715-a56f-c608e77dba56 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:33Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ee2880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ee2760>], id=4ca17edf-539e-4eb8-8272-2ef525cc9f90, ip_allocation=immediate, mac_address=fa:16:3e:21:93:1a, name=tempest-NetworksTestDHCPv6-642344273, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['04c13047-e932-4b16-a523-563db6998f3b'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:33Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1376, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:34Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:36 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:36.526 261084 INFO neutron.agent.dhcp.agent [None req-ff48d7c0-99db-42ce-81db-4ede61c064b7 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:03:36 np0005538513.localdomain dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:36 np0005538513.localdomain podman[314190]: 2025-11-28 10:03:36.557781989 +0000 UTC m=+0.058645971 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:03:36 np0005538513.localdomain dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 0 addresses
Nov 28 10:03:36 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host
Nov 28 10:03:36 np0005538513.localdomain podman[314204]: 2025-11-28 10:03:36.657207757 +0000 UTC m=+0.090300578 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:03:36 np0005538513.localdomain dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts
Nov 28 10:03:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:36Z|00209|binding|INFO|Releasing lport 3e54b4c6-8462-4da0-9951-b922d57575cf from this chassis (sb_readonly=0)
Nov 28 10:03:36 np0005538513.localdomain kernel: device tap3e54b4c6-84 left promiscuous mode
Nov 28 10:03:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:36.882 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:36Z|00210|binding|INFO|Setting lport 3e54b4c6-8462-4da0-9951-b922d57575cf down in Southbound
Nov 28 10:03:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:36.897 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34d43fdc-65ac-42a6-8e26-177d541f3791, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=3e54b4c6-8462-4da0-9951-b922d57575cf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:36.898 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3e54b4c6-8462-4da0-9951-b922d57575cf in datapath 3f9a6f97-9109-45cc-b3d8-12edbd83a346 unbound from our chassis
Nov 28 10:03:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:36.899 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f9a6f97-9109-45cc-b3d8-12edbd83a346, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:03:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:36.900 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7503d9-b7a5-417e-a811-868b4ff95391]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:36.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:36 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:36.928 261084 INFO neutron.agent.dhcp.agent [None req-f77bac8d-9df6-465e-ab8a-7da88b9a6892 - - - - - -] DHCP configuration for ports {'4ca17edf-539e-4eb8-8272-2ef525cc9f90'} is completed
Nov 28 10:03:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:03:37 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:03:37 np0005538513.localdomain podman[314235]: 2025-11-28 10:03:37.082804682 +0000 UTC m=+0.067419122 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:03:37 np0005538513.localdomain podman[314235]: 2025-11-28 10:03:37.163187236 +0000 UTC m=+0.147801706 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:03:37 np0005538513.localdomain podman[314236]: 2025-11-28 10:03:37.11444729 +0000 UTC m=+0.094121629 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:03:37 np0005538513.localdomain podman[314236]: 2025-11-28 10:03:37.199640521 +0000 UTC m=+0.179314830 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:37 np0005538513.localdomain podman[314290]: 2025-11-28 10:03:37.202891433 +0000 UTC m=+0.057105836 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:37 np0005538513.localdomain dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:37 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:03:37 np0005538513.localdomain ceph-mon[292954]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 6.0 KiB/s wr, 101 op/s
Nov 28 10:03:37 np0005538513.localdomain ceph-mon[292954]: osdmap e132: 6 total, 6 up, 6 in
Nov 28 10:03:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e132 do_prune osdmap full prune enabled
Nov 28 10:03:37 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:03:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e133 e133: 6 total, 6 up, 6 in
Nov 28 10:03:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e133: 6 total, 6 up, 6 in
Nov 28 10:03:37 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:37.424 2 INFO neutron.agent.securitygroups_rpc [None req-69a76c47-354e-40d8-9c7a-4acd924cbac4 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:37.753 261084 INFO neutron.agent.dhcp.agent [None req-66785779-8803-4e42-9b78-bde749875261 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'd925f0b8-f14d-42ee-9e29-a163823c50b3'} is completed
Nov 28 10:03:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:37 np0005538513.localdomain systemd[1]: tmp-crun.ckqDjF.mount: Deactivated successfully.
Nov 28 10:03:37 np0005538513.localdomain dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:37 np0005538513.localdomain podman[314333]: 2025-11-28 10:03:37.844065136 +0000 UTC m=+0.070976164 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:03:37 np0005538513.localdomain dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 0 addresses
Nov 28 10:03:37 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host
Nov 28 10:03:37 np0005538513.localdomain dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts
Nov 28 10:03:37 np0005538513.localdomain podman[314362]: 2025-11-28 10:03:37.93805734 +0000 UTC m=+0.067218857 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:03:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:38.064 261084 INFO neutron.agent.dhcp.agent [None req-b537f614-6618-40e2-b0aa-deb70b449541 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:37Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b0970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b0cd0>], id=f0e40f52-9209-4283-a23f-088678cd6c9e, ip_allocation=immediate, mac_address=fa:16:3e:3f:c6:b2, name=tempest-NetworksTestDHCPv6-1521665284, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['157c2b3e-45c5-4baa-a8ab-f40581745311'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:36Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1380, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:37Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:38.184 261084 INFO neutron.agent.dhcp.agent [None req-12020dff-2454-40f9-9ca8-d08b3e6c59cb - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'd925f0b8-f14d-42ee-9e29-a163823c50b3', 'f0e40f52-9209-4283-a23f-088678cd6c9e'} is completed
Nov 28 10:03:38 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:38.187 2 INFO neutron.agent.securitygroups_rpc [None req-eef6fd1f-c62c-4fce-a6ed-f73dc25767c9 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e133 do_prune osdmap full prune enabled
Nov 28 10:03:38 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:38Z|00211|binding|INFO|Releasing lport 79491b70-fe82-4673-a612-1252578cdd84 from this chassis (sb_readonly=0)
Nov 28 10:03:38 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:38Z|00212|binding|INFO|Setting lport 79491b70-fe82-4673-a612-1252578cdd84 down in Southbound
Nov 28 10:03:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:38.252 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:38 np0005538513.localdomain kernel: device tap79491b70-fe left promiscuous mode
Nov 28 10:03:38 np0005538513.localdomain ceph-mon[292954]: osdmap e133: 6 total, 6 up, 6 in
Nov 28 10:03:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:38.275 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e134 e134: 6 total, 6 up, 6 in
Nov 28 10:03:38 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e134: 6 total, 6 up, 6 in
Nov 28 10:03:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:38.286 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db21ade0-fc80-4871-bcd6-f4301708978d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=79491b70-fe82-4673-a612-1252578cdd84) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:38.288 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 79491b70-fe82-4673-a612-1252578cdd84 in datapath 54d19915-3dc0-4577-b573-72119a0c141d unbound from our chassis
Nov 28 10:03:38 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:38Z|00213|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:38.292 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54d19915-3dc0-4577-b573-72119a0c141d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:03:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:38.293 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[097599d1-84a4-4bb0-b670-0ab8deb0b42f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:38.334 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:38 np0005538513.localdomain podman[314412]: 2025-11-28 10:03:38.345798362 +0000 UTC m=+0.075534425 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:38 np0005538513.localdomain dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:38.611 261084 INFO neutron.agent.dhcp.agent [None req-20f3e84f-edd5-476d-87eb-dc91a7444dcc - - - - - -] DHCP configuration for ports {'f0e40f52-9209-4283-a23f-088678cd6c9e'} is completed
Nov 28 10:03:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:38.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:38.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:03:38 np0005538513.localdomain dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:38 np0005538513.localdomain podman[314450]: 2025-11-28 10:03:38.774693282 +0000 UTC m=+0.073169978 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:03:39 np0005538513.localdomain dnsmasq[313691]: exiting on receipt of SIGTERM
Nov 28 10:03:39 np0005538513.localdomain podman[314489]: 2025-11-28 10:03:39.21814729 +0000 UTC m=+0.069296947 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: libpod-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a.scope: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain ceph-mon[292954]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.6 KiB/s wr, 29 op/s
Nov 28 10:03:39 np0005538513.localdomain ceph-mon[292954]: osdmap e134: 6 total, 6 up, 6 in
Nov 28 10:03:39 np0005538513.localdomain podman[314518]: 2025-11-28 10:03:39.29495695 +0000 UTC m=+0.054581124 container died 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: tmp-crun.LQuSrS.mount: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-c6f48a9173b9ce2ea6c707fe91dbf14c53a35e5091245aaa244be183a6c7890a-merged.mount: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain podman[314518]: 2025-11-28 10:03:39.393990047 +0000 UTC m=+0.153614171 container remove 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: libpod-conmon-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a.scope: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain dnsmasq[314158]: exiting on receipt of SIGTERM
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: libpod-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1.scope: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain podman[314538]: 2025-11-28 10:03:39.440684716 +0000 UTC m=+0.160261023 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: tmp-crun.PvFtZt.mount: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain dnsmasq[313457]: exiting on receipt of SIGTERM
Nov 28 10:03:39 np0005538513.localdomain podman[314573]: 2025-11-28 10:03:39.515558831 +0000 UTC m=+0.112121723 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: libpod-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5.scope: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain podman[314587]: 2025-11-28 10:03:39.536390098 +0000 UTC m=+0.083232135 container died 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:39 np0005538513.localdomain podman[314587]: 2025-11-28 10:03:39.571606817 +0000 UTC m=+0.118448804 container cleanup 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: libpod-conmon-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1.scope: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain podman[314594]: 2025-11-28 10:03:39.614884437 +0000 UTC m=+0.144584873 container remove 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:03:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:39Z|00214|binding|INFO|Releasing lport d925f0b8-f14d-42ee-9e29-a163823c50b3 from this chassis (sb_readonly=0)
Nov 28 10:03:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:39Z|00215|binding|INFO|Setting lport d925f0b8-f14d-42ee-9e29-a163823c50b3 down in Southbound
Nov 28 10:03:39 np0005538513.localdomain kernel: device tapd925f0b8-f1 left promiscuous mode
Nov 28 10:03:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:39.666 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:39 np0005538513.localdomain podman[314612]: 2025-11-28 10:03:39.675569756 +0000 UTC m=+0.140021333 container died 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:39.677 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d925f0b8-f14d-42ee-9e29-a163823c50b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:39 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:39.681 261084 INFO neutron.agent.dhcp.agent [None req-90146684-8217-426a-92a0-b7458f59b967 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:39 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:39.683 261084 INFO neutron.agent.dhcp.agent [None req-90146684-8217-426a-92a0-b7458f59b967 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:39.686 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:39.685 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d925f0b8-f14d-42ee-9e29-a163823c50b3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:39.692 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:39.694 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2dea9726-06d2-45db-be79-507dd932009c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:39.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:39 np0005538513.localdomain podman[314612]: 2025-11-28 10:03:39.773830562 +0000 UTC m=+0.238282159 container remove 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:03:39 np0005538513.localdomain systemd[1]: libpod-conmon-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5.scope: Deactivated successfully.
Nov 28 10:03:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e134 do_prune osdmap full prune enabled
Nov 28 10:03:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e135 e135: 6 total, 6 up, 6 in
Nov 28 10:03:39 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e135: 6 total, 6 up, 6 in
Nov 28 10:03:40 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:40.037 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:03:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:03:40 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:40.089 261084 INFO neutron.agent.dhcp.agent [None req-17c930e4-ed87-497a-a8db-dbf6503df94e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:03:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:03:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:03:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19254 "" "Go-http-client/1.1"
Nov 28 10:03:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:40.302 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f97123adbd9c48a0f6f44f905a033a1731be1c1882de0ddc59c8b81ddaf0fc55-merged.mount: Deactivated successfully.
Nov 28 10:03:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:40 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:03:40 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d3f9a6f97\x2d9109\x2d45cc\x2db3d8\x2d12edbd83a346.mount: Deactivated successfully.
Nov 28 10:03:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7c2133ceac09aa4671885348478e6b1807140f8a826fd1cc3877c75256248e37-merged.mount: Deactivated successfully.
Nov 28 10:03:40 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:40 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d54d19915\x2d3dc0\x2d4577\x2db573\x2d72119a0c141d.mount: Deactivated successfully.
Nov 28 10:03:40 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:40.465 2 INFO neutron.agent.securitygroups_rpc [None req-3119b771-1e00-43fd-8d05-15e8a1d2219b 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:40.534 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:40.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:40 np0005538513.localdomain ceph-mon[292954]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 1.9 KiB/s wr, 34 op/s
Nov 28 10:03:40 np0005538513.localdomain ceph-mon[292954]: osdmap e135: 6 total, 6 up, 6 in
Nov 28 10:03:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:41.149 261084 INFO neutron.agent.linux.ip_lib [None req-2dcca3c0-71e9-44b5-9cb0-377311810ae2 - - - - - -] Device tapda5b2ad4-01 cannot be used as it has no MAC address
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.200 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain kernel: device tapda5b2ad4-01 entered promiscuous mode
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.208 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324221.2112] manager: (tapda5b2ad4-01): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00216|binding|INFO|Claiming lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa for this chassis.
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00217|binding|INFO|da5b2ad4-01e7-4e6c-9145-eaa3075e76fa: Claiming unknown
Nov 28 10:03:41 np0005538513.localdomain systemd-udevd[314649]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00218|binding|INFO|Setting lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa ovn-installed in OVS
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.218 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00219|binding|INFO|Setting lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa up in Southbound
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.226 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=da5b2ad4-01e7-4e6c-9145-eaa3075e76fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.227 158130 INFO neutron.agent.ovn.metadata.agent [-] Port da5b2ad4-01e7-4e6c-9145-eaa3075e76fa in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.228 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.229 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1e763f94-e2e1-4b8d-8b11-c7915ee51443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.239 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.247 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.286 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.327 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:41.555 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:41 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:41.646 2 INFO neutron.agent.securitygroups_rpc [None req-1b0c80b5-e3aa-421f-ac6e-3cbc3bd6a095 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:41.722 261084 INFO neutron.agent.linux.ip_lib [None req-fb7d8aeb-8c15-4a31-a128-a92fe0323a66 - - - - - -] Device tap0f64cccb-f5 cannot be used as it has no MAC address
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain kernel: device tap0f64cccb-f5 entered promiscuous mode
Nov 28 10:03:41 np0005538513.localdomain systemd-udevd[314651]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:41 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324221.7627] manager: (tap0f64cccb-f5): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00220|binding|INFO|Claiming lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 for this chassis.
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00221|binding|INFO|0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50: Claiming unknown
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.764 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.790 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76364ea1-fb09-4d6a-aaea-0dba21691fb4, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.792 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.794 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 in datapath eec9ef76-e9ff-47a5-b8c7-9b69e3732166 bound to our chassis
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00222|binding|INFO|Setting lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 ovn-installed in OVS
Nov 28 10:03:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:41Z|00223|binding|INFO|Setting lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 up in Southbound
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.796 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eec9ef76-e9ff-47a5-b8c7-9b69e3732166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.796 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:41.797 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4371a29c-0d29-40cb-be6f-f78b07a0fef3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.856 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:41.895 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:42 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:42.090 2 INFO neutron.agent.securitygroups_rpc [None req-e8cd93f0-eb23-4135-97c1-cd5750f74f24 b286c38dfd0e4889806c62c7b4b9ee98 50a1392ce96c4024bcd36a3df403ca29 - - default default] Security group member updated ['b372bb98-860c-4571-936b-bf08ecbd647d']
Nov 28 10:03:42 np0005538513.localdomain podman[314747]: 
Nov 28 10:03:42 np0005538513.localdomain podman[314747]: 2025-11-28 10:03:42.326386632 +0000 UTC m=+0.114090840 container create a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:03:42 np0005538513.localdomain podman[314747]: 2025-11-28 10:03:42.281011252 +0000 UTC m=+0.068715480 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:42 np0005538513.localdomain systemd[1]: Started libpod-conmon-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6.scope.
Nov 28 10:03:42 np0005538513.localdomain systemd[1]: tmp-crun.II8WIY.mount: Deactivated successfully.
Nov 28 10:03:42 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:42 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd4128ec09b58571ce595b339c06d55fb46f8875cc1866ebdca77487d42fde8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:42 np0005538513.localdomain podman[314747]: 2025-11-28 10:03:42.435365245 +0000 UTC m=+0.223069433 container init a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:03:42 np0005538513.localdomain podman[314747]: 2025-11-28 10:03:42.445687431 +0000 UTC m=+0.233391619 container start a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:42 np0005538513.localdomain dnsmasq[314776]: started, version 2.85 cachesize 150
Nov 28 10:03:42 np0005538513.localdomain dnsmasq[314776]: DNS service limited to local subnets
Nov 28 10:03:42 np0005538513.localdomain dnsmasq[314776]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:42 np0005538513.localdomain dnsmasq[314776]: warning: no upstream servers configured
Nov 28 10:03:42 np0005538513.localdomain dnsmasq-dhcp[314776]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:03:42 np0005538513.localdomain dnsmasq[314776]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:42 np0005538513.localdomain dnsmasq-dhcp[314776]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:42 np0005538513.localdomain dnsmasq-dhcp[314776]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:42 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:42.608 261084 INFO neutron.agent.dhcp.agent [None req-ccbb5d18-c057-4a29-bd70-cab972c8980e - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:03:42 np0005538513.localdomain systemd[1]: tmp-crun.T2eI5r.mount: Deactivated successfully.
Nov 28 10:03:42 np0005538513.localdomain dnsmasq[314776]: exiting on receipt of SIGTERM
Nov 28 10:03:42 np0005538513.localdomain podman[314803]: 2025-11-28 10:03:42.876869256 +0000 UTC m=+0.130778829 container kill a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:42 np0005538513.localdomain systemd[1]: libpod-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6.scope: Deactivated successfully.
Nov 28 10:03:42 np0005538513.localdomain podman[314826]: 
Nov 28 10:03:42 np0005538513.localdomain podman[314845]: 2025-11-28 10:03:42.948508078 +0000 UTC m=+0.048250893 container died a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:03:42 np0005538513.localdomain podman[314826]: 2025-11-28 10:03:42.890181277 +0000 UTC m=+0.053768742 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:42 np0005538513.localdomain podman[314845]: 2025-11-28 10:03:42.989972997 +0000 UTC m=+0.089715802 container remove a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:03:43 np0005538513.localdomain systemd[1]: libpod-conmon-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6.scope: Deactivated successfully.
Nov 28 10:03:43 np0005538513.localdomain podman[314826]: 2025-11-28 10:03:43.003236717 +0000 UTC m=+0.166824142 container create 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:43 np0005538513.localdomain ceph-mon[292954]: pgmap v214: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 2.9 MiB/s wr, 146 op/s
Nov 28 10:03:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2803641897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:03:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2803641897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:03:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:43Z|00224|binding|INFO|Releasing lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa from this chassis (sb_readonly=0)
Nov 28 10:03:43 np0005538513.localdomain kernel: device tapda5b2ad4-01 left promiscuous mode
Nov 28 10:03:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:43Z|00225|binding|INFO|Setting lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa down in Southbound
Nov 28 10:03:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:43.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:43.029 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:43.038 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=da5b2ad4-01e7-4e6c-9145-eaa3075e76fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:43.040 158130 INFO neutron.agent.ovn.metadata.agent [-] Port da5b2ad4-01e7-4e6c-9145-eaa3075e76fa in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:43.042 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:43.043 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[52f61220-15b7-455d-a8c1-32c9382ab11e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:43 np0005538513.localdomain systemd[1]: Started libpod-conmon-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c.scope.
Nov 28 10:03:43 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:43 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721a99abb32a0f90fdd1a1d8fc012d3e9fa0457f65acb87fb03055f5486bc057/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:43 np0005538513.localdomain podman[314826]: 2025-11-28 10:03:43.097211549 +0000 UTC m=+0.260798944 container init 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:03:43 np0005538513.localdomain podman[314826]: 2025-11-28 10:03:43.104803577 +0000 UTC m=+0.268390982 container start 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:03:43 np0005538513.localdomain dnsmasq[314875]: started, version 2.85 cachesize 150
Nov 28 10:03:43 np0005538513.localdomain dnsmasq[314875]: DNS service limited to local subnets
Nov 28 10:03:43 np0005538513.localdomain dnsmasq[314875]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:43 np0005538513.localdomain dnsmasq[314875]: warning: no upstream servers configured
Nov 28 10:03:43 np0005538513.localdomain dnsmasq-dhcp[314875]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:03:43 np0005538513.localdomain dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 0 addresses
Nov 28 10:03:43 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host
Nov 28 10:03:43 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts
Nov 28 10:03:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:43.175 261084 INFO neutron.agent.dhcp.agent [None req-fb7d8aeb-8c15-4a31-a128-a92fe0323a66 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6629340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6629310>], id=12365673-4af9-4ad5-b481-8ba59b4a12f4, ip_allocation=immediate, mac_address=fa:16:3e:59:37:09, name=tempest-RoutersIpV6Test-279049389, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:38Z, description=, dns_domain=, id=eec9ef76-e9ff-47a5-b8c7-9b69e3732166, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1460277724, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54776, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1381, status=ACTIVE, subnets=['3795ae5d-6bce-4edc-bccb-dff09b0cd314'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:40Z, vlan_transparent=None, network_id=eec9ef76-e9ff-47a5-b8c7-9b69e3732166, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['b372bb98-860c-4571-936b-bf08ecbd647d'], standard_attr_id=1406, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:41Z on network eec9ef76-e9ff-47a5-b8c7-9b69e3732166
Nov 28 10:03:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:43.267 261084 INFO neutron.agent.dhcp.agent [None req-f580b92e-ead0-4dcb-b599-a9d76b513677 - - - - - -] DHCP configuration for ports {'6881ddac-156c-4793-be9e-6e7674c0d668'} is completed
Nov 28 10:03:43 np0005538513.localdomain dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 1 addresses
Nov 28 10:03:43 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host
Nov 28 10:03:43 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts
Nov 28 10:03:43 np0005538513.localdomain podman[314894]: 2025-11-28 10:03:43.380751064 +0000 UTC m=+0.062984486 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-6fd4128ec09b58571ce595b339c06d55fb46f8875cc1866ebdca77487d42fde8-merged.mount: Deactivated successfully.
Nov 28 10:03:43 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:43 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:03:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:43.682 261084 INFO neutron.agent.dhcp.agent [None req-e4affca1-79b6-48a4-91a1-2507aefbbc6e - - - - - -] DHCP configuration for ports {'12365673-4af9-4ad5-b481-8ba59b4a12f4'} is completed
Nov 28 10:03:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:43.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:43.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:44 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:44.193 2 INFO neutron.agent.securitygroups_rpc [None req-363c9598-0bac-406f-990f-c24334dc748e 595b5cbed3764c7a95b0ab3634e5becb 8c66e098e4fb4a349dc2bb4293454135 - - default default] Security group member updated ['f75eb612-183d-4664-84c2-1d15534e163f']
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.452 261084 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.671 261084 INFO neutron.agent.dhcp.agent [None req-a7e52cd8-48cc-42a2-89aa-67ae7618ec92 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.672 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.672 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.673 261084 INFO neutron.agent.dhcp.agent [None req-a7e52cd8-48cc-42a2-89aa-67ae7618ec92 - - - - - -] Synchronizing state complete
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.674 261084 INFO neutron.agent.dhcp.agent [None req-7f1d8073-db00-44f4-924a-155fc8d62a3e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.674 261084 INFO neutron.agent.dhcp.agent [None req-7f1d8073-db00-44f4-924a-155fc8d62a3e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.675 261084 INFO neutron.agent.dhcp.agent [None req-7f1d8073-db00-44f4-924a-155fc8d62a3e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:44.791 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:03:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:44.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:03:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:44.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:03:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:44.793 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:03:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:44.794 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:03:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:44.952 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:44.956 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:44 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:44.958 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:03:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e135 do_prune osdmap full prune enabled
Nov 28 10:03:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e136 e136: 6 total, 6 up, 6 in
Nov 28 10:03:44 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e136: 6 total, 6 up, 6 in
Nov 28 10:03:45 np0005538513.localdomain ceph-mon[292954]: pgmap v215: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 2.5 MiB/s wr, 99 op/s
Nov 28 10:03:45 np0005538513.localdomain ceph-mon[292954]: osdmap e136: 6 total, 6 up, 6 in
Nov 28 10:03:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:03:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3685964475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.267 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.326 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.343 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.344 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:03:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:45.487 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:41Z, description=, device_id=cd81d6b9-90e3-4f09-bc0f-0098790e2353, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6585e80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6585460>], id=12365673-4af9-4ad5-b481-8ba59b4a12f4, ip_allocation=immediate, mac_address=fa:16:3e:59:37:09, name=tempest-RoutersIpV6Test-279049389, network_id=eec9ef76-e9ff-47a5-b8c7-9b69e3732166, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['b372bb98-860c-4571-936b-bf08ecbd647d'], standard_attr_id=1406, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:42Z on network eec9ef76-e9ff-47a5-b8c7-9b69e3732166
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.596 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.599 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11198MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.600 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.600 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.674 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.675 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.675 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:03:45 np0005538513.localdomain systemd[1]: tmp-crun.4K5DLx.mount: Deactivated successfully.
Nov 28 10:03:45 np0005538513.localdomain podman[314955]: 2025-11-28 10:03:45.721190367 +0000 UTC m=+0.083923956 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:45 np0005538513.localdomain dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 1 addresses
Nov 28 10:03:45 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host
Nov 28 10:03:45 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.724 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:03:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:45.834 261084 INFO neutron.agent.linux.ip_lib [None req-e89f33e6-aa1d-4760-b4e1-22b9f72304ed - - - - - -] Device tap46d5cf00-99 cannot be used as it has no MAC address
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.866 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:45 np0005538513.localdomain kernel: device tap46d5cf00-99 entered promiscuous mode
Nov 28 10:03:45 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324225.8739] manager: (tap46d5cf00-99): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Nov 28 10:03:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:45Z|00226|binding|INFO|Claiming lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 for this chassis.
Nov 28 10:03:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:45Z|00227|binding|INFO|46d5cf00-9955-46d8-9cab-1f8f84e925f3: Claiming unknown
Nov 28 10:03:45 np0005538513.localdomain systemd-udevd[314995]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:45Z|00228|binding|INFO|Setting lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 ovn-installed in OVS
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.889 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:45Z|00229|binding|INFO|Setting lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 up in Southbound
Nov 28 10:03:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:45.894 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=46d5cf00-9955-46d8-9cab-1f8f84e925f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:45.896 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46d5cf00-9955-46d8-9cab-1f8f84e925f3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:45.897 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:45.898 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7333c351-4d81-4435-a6e1-c9b5a4a0bec3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.909 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.916 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 10:03:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:45.973 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:46.017 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2196630115' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3685964475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3641885735' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1044460926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:46.245 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:03:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:46.254 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:03:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:46.334 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:03:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:46.338 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:03:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:46.338 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:03:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:46.434 261084 INFO neutron.agent.dhcp.agent [None req-27c1ffa9-10aa-4d14-9216-c2e9382ec5f8 - - - - - -] DHCP configuration for ports {'12365673-4af9-4ad5-b481-8ba59b4a12f4'} is completed
Nov 28 10:03:46 np0005538513.localdomain podman[315080]: 
Nov 28 10:03:46 np0005538513.localdomain podman[315080]: 2025-11-28 10:03:46.990210089 +0000 UTC m=+0.103462676 container create 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:03:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:03:47 np0005538513.localdomain systemd[1]: Started libpod-conmon-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4.scope.
Nov 28 10:03:47 np0005538513.localdomain podman[315080]: 2025-11-28 10:03:46.944654584 +0000 UTC m=+0.057907231 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:47 np0005538513.localdomain systemd[1]: tmp-crun.McNrgT.mount: Deactivated successfully.
Nov 28 10:03:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:47Z|00230|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e136 do_prune osdmap full prune enabled
Nov 28 10:03:47 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: pgmap v217: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.2 MiB/s wr, 86 op/s
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1881536497' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1619156998' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3438130315' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3438130315' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:03:47 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/578f4be3cc19b21fef1ae33791a1b3511e0f8f189090f691ad48efc34f6226aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e137 e137: 6 total, 6 up, 6 in
Nov 28 10:03:47 np0005538513.localdomain podman[315080]: 2025-11-28 10:03:47.089878485 +0000 UTC m=+0.203131072 container init 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:03:47 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e137: 6 total, 6 up, 6 in
Nov 28 10:03:47 np0005538513.localdomain podman[315080]: 2025-11-28 10:03:47.100824328 +0000 UTC m=+0.214076905 container start 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:47.101 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:47 np0005538513.localdomain dnsmasq[315108]: started, version 2.85 cachesize 150
Nov 28 10:03:47 np0005538513.localdomain dnsmasq[315108]: DNS service limited to local subnets
Nov 28 10:03:47 np0005538513.localdomain dnsmasq[315108]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:47 np0005538513.localdomain dnsmasq[315108]: warning: no upstream servers configured
Nov 28 10:03:47 np0005538513.localdomain dnsmasq-dhcp[315108]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:03:47 np0005538513.localdomain dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:47 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:47 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:47 np0005538513.localdomain podman[315095]: 2025-11-28 10:03:47.185855035 +0000 UTC m=+0.151199223 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm)
Nov 28 10:03:47 np0005538513.localdomain podman[315095]: 2025-11-28 10:03:47.198830226 +0000 UTC m=+0.164174414 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:03:47 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:03:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:47.361 261084 INFO neutron.agent.dhcp.agent [None req-35bfdd1a-a27b-4b8e-9b52-2da75bab8f1c - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '46d5cf00-9955-46d8-9cab-1f8f84e925f3'} is completed
Nov 28 10:03:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:47.546 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:46Z, description=, device_id=fd23721d-5971-4a55-93b6-1f314d7bfba5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65e2400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65e2c40>], id=9814bbf1-0df0-4d12-a042-026ac18c60f8, ip_allocation=immediate, mac_address=fa:16:3e:2c:af:f0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['78639234-2b5b-4fdc-97e8-c3d3fd0c61b3'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:44Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1420, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:46Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:47 np0005538513.localdomain dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:47 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:47 np0005538513.localdomain podman[315138]: 2025-11-28 10:03:47.762010403 +0000 UTC m=+0.063358766 container kill 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:03:47 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:48 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:48.038 261084 INFO neutron.agent.dhcp.agent [None req-6c58773d-3169-48ed-9bea-3b7ee28846fe - - - - - -] DHCP configuration for ports {'9814bbf1-0df0-4d12-a042-026ac18c60f8'} is completed
Nov 28 10:03:48 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:48.074 2 INFO neutron.agent.securitygroups_rpc [None req-d103e3e5-6a2c-4d52-97a2-9ed0e9f72fa6 595b5cbed3764c7a95b0ab3634e5becb 8c66e098e4fb4a349dc2bb4293454135 - - default default] Security group member updated ['f75eb612-183d-4664-84c2-1d15534e163f']
Nov 28 10:03:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e137 do_prune osdmap full prune enabled
Nov 28 10:03:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:03:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:03:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:03:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:03:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:03:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:03:48 np0005538513.localdomain ceph-mon[292954]: osdmap e137: 6 total, 6 up, 6 in
Nov 28 10:03:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e138 e138: 6 total, 6 up, 6 in
Nov 28 10:03:48 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e138: 6 total, 6 up, 6 in
Nov 28 10:03:48 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:03:48.287 2 INFO neutron.agent.securitygroups_rpc [None req-8df05b65-915d-4be4-a7dc-9f54beb052e9 b286c38dfd0e4889806c62c7b4b9ee98 50a1392ce96c4024bcd36a3df403ca29 - - default default] Security group member updated ['b372bb98-860c-4571-936b-bf08ecbd647d']
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.341 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.342 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.343 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.478 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.479 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.480 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.480 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:03:48 np0005538513.localdomain podman[315177]: 2025-11-28 10:03:48.525780329 +0000 UTC m=+0.065453246 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:48 np0005538513.localdomain dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 0 addresses
Nov 28 10:03:48 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host
Nov 28 10:03:48 np0005538513.localdomain dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts
Nov 28 10:03:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:48Z|00231|binding|INFO|Releasing lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 from this chassis (sb_readonly=0)
Nov 28 10:03:48 np0005538513.localdomain kernel: device tap0f64cccb-f5 left promiscuous mode
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.723 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:48Z|00232|binding|INFO|Setting lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 down in Southbound
Nov 28 10:03:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:48.736 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76364ea1-fb09-4d6a-aaea-0dba21691fb4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:48.738 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 in datapath eec9ef76-e9ff-47a5-b8c7-9b69e3732166 unbound from our chassis
Nov 28 10:03:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:48.739 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eec9ef76-e9ff-47a5-b8c7-9b69e3732166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:48.740 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7db80c86-35dc-4700-bbf4-8e5142bc1ccb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:48.752 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:49 np0005538513.localdomain ceph-mon[292954]: pgmap v219: 177 pgs: 177 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 15 MiB/s wr, 161 op/s
Nov 28 10:03:49 np0005538513.localdomain ceph-mon[292954]: osdmap e138: 6 total, 6 up, 6 in
Nov 28 10:03:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:49.713 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:03:49 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:49.726 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:46Z, description=, device_id=fd23721d-5971-4a55-93b6-1f314d7bfba5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65841c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6584b20>], id=9814bbf1-0df0-4d12-a042-026ac18c60f8, ip_allocation=immediate, mac_address=fa:16:3e:2c:af:f0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['78639234-2b5b-4fdc-97e8-c3d3fd0c61b3'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:44Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1420, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:46Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:49.737 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:03:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:49.738 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:03:49 np0005538513.localdomain dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:49 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:49 np0005538513.localdomain podman[315218]: 2025-11-28 10:03:49.936257255 +0000 UTC m=+0.071335865 container kill 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:49 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e138 do_prune osdmap full prune enabled
Nov 28 10:03:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e139 e139: 6 total, 6 up, 6 in
Nov 28 10:03:49 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e139: 6 total, 6 up, 6 in
Nov 28 10:03:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:50.012 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:50.365 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:50.394 261084 INFO neutron.agent.dhcp.agent [None req-a6054db3-96e0-4b25-b3de-2530fc3c0543 - - - - - -] DHCP configuration for ports {'9814bbf1-0df0-4d12-a042-026ac18c60f8'} is completed
Nov 28 10:03:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:03:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:03:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:03:50 np0005538513.localdomain ceph-mon[292954]: pgmap v221: 177 pgs: 177 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 16 MiB/s wr, 101 op/s
Nov 28 10:03:50 np0005538513.localdomain ceph-mon[292954]: osdmap e139: 6 total, 6 up, 6 in
Nov 28 10:03:51 np0005538513.localdomain systemd[1]: tmp-crun.TZhsTq.mount: Deactivated successfully.
Nov 28 10:03:51 np0005538513.localdomain dnsmasq[314875]: exiting on receipt of SIGTERM
Nov 28 10:03:51 np0005538513.localdomain podman[315255]: 2025-11-28 10:03:51.01865139 +0000 UTC m=+0.055998606 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:03:51 np0005538513.localdomain systemd[1]: libpod-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c.scope: Deactivated successfully.
Nov 28 10:03:51 np0005538513.localdomain podman[315269]: 2025-11-28 10:03:51.104113279 +0000 UTC m=+0.067982300 container died 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:03:51 np0005538513.localdomain podman[315269]: 2025-11-28 10:03:51.138331199 +0000 UTC m=+0.102200170 container cleanup 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:03:51 np0005538513.localdomain systemd[1]: libpod-conmon-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c.scope: Deactivated successfully.
Nov 28 10:03:51 np0005538513.localdomain podman[315271]: 2025-11-28 10:03:51.183431152 +0000 UTC m=+0.138113199 container remove 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:03:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:51.513 261084 INFO neutron.agent.dhcp.agent [None req-7ca3987f-3444-4869-9417-f626955b6198 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:51.514 261084 INFO neutron.agent.dhcp.agent [None req-7ca3987f-3444-4869-9417-f626955b6198 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:51 np0005538513.localdomain dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:51 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:03:51 np0005538513.localdomain dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:03:51 np0005538513.localdomain podman[315316]: 2025-11-28 10:03:51.560104344 +0000 UTC m=+0.074232867 container kill 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:03:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:51.599 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:03:51 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:51Z|00233|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:51 np0005538513.localdomain podman[315338]: 2025-11-28 10:03:51.869549412 +0000 UTC m=+0.091994268 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:03:51 np0005538513.localdomain podman[315338]: 2025-11-28 10:03:51.883229134 +0000 UTC m=+0.105674040 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:03:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:51.902 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:51 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:03:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-721a99abb32a0f90fdd1a1d8fc012d3e9fa0457f65acb87fb03055f5486bc057-merged.mount: Deactivated successfully.
Nov 28 10:03:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:52 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2deec9ef76\x2de9ff\x2d47a5\x2db8c7\x2d9b69e3732166.mount: Deactivated successfully.
Nov 28 10:03:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:52.164 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:52 np0005538513.localdomain dnsmasq[315108]: exiting on receipt of SIGTERM
Nov 28 10:03:52 np0005538513.localdomain podman[315378]: 2025-11-28 10:03:52.639288188 +0000 UTC m=+0.062230795 container kill 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:03:52 np0005538513.localdomain systemd[1]: libpod-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4.scope: Deactivated successfully.
Nov 28 10:03:52 np0005538513.localdomain podman[315390]: 2025-11-28 10:03:52.712566268 +0000 UTC m=+0.057405537 container died 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:52 np0005538513.localdomain systemd[1]: tmp-crun.m1U55p.mount: Deactivated successfully.
Nov 28 10:03:52 np0005538513.localdomain podman[315390]: 2025-11-28 10:03:52.753365276 +0000 UTC m=+0.098204485 container cleanup 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:52 np0005538513.localdomain systemd[1]: libpod-conmon-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4.scope: Deactivated successfully.
Nov 28 10:03:52 np0005538513.localdomain podman[315392]: 2025-11-28 10:03:52.800842517 +0000 UTC m=+0.134623139 container remove 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:03:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:52Z|00234|binding|INFO|Releasing lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 from this chassis (sb_readonly=0)
Nov 28 10:03:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:52.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:52Z|00235|binding|INFO|Setting lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 down in Southbound
Nov 28 10:03:52 np0005538513.localdomain kernel: device tap46d5cf00-99 left promiscuous mode
Nov 28 10:03:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:52.861 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=46d5cf00-9955-46d8-9cab-1f8f84e925f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:52.863 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46d5cf00-9955-46d8-9cab-1f8f84e925f3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:52.865 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:52.866 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f5e74a-121c-45a8-abee-e3e6a4648b61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:52.874 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:52.875 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-578f4be3cc19b21fef1ae33791a1b3511e0f8f189090f691ad48efc34f6226aa-merged.mount: Deactivated successfully.
Nov 28 10:03:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:53 np0005538513.localdomain ceph-mon[292954]: pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 131 KiB/s rd, 16 MiB/s wr, 185 op/s
Nov 28 10:03:53 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:03:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:03:54 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:54.349 261084 INFO neutron.agent.linux.ip_lib [None req-96df33e2-8cea-42a0-ae91-fe4e1878310c - - - - - -] Device tapd896fe01-47 cannot be used as it has no MAC address
Nov 28 10:03:54 np0005538513.localdomain podman[315422]: 2025-11-28 10:03:54.37422586 +0000 UTC m=+0.130654705 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 10:03:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:03:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:54.384 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:54 np0005538513.localdomain podman[315422]: 2025-11-28 10:03:54.390507157 +0000 UTC m=+0.146936052 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:54 np0005538513.localdomain kernel: device tapd896fe01-47 entered promiscuous mode
Nov 28 10:03:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:54Z|00236|binding|INFO|Claiming lport d896fe01-471a-407a-a0b3-6d40883262a8 for this chassis.
Nov 28 10:03:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:54Z|00237|binding|INFO|d896fe01-471a-407a-a0b3-6d40883262a8: Claiming unknown
Nov 28 10:03:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:54.395 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:54 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324234.3971] manager: (tapd896fe01-47): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Nov 28 10:03:54 np0005538513.localdomain systemd-udevd[315454]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:54Z|00238|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 ovn-installed in OVS
Nov 28 10:03:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:54.407 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:54Z|00239|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 up in Southbound
Nov 28 10:03:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:54.409 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:54.411 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:54.413 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:54.414 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[334140aa-b9ba-46e6-986d-861e1ad2cd13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:54 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:03:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:54.426 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:54.474 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:54.505 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:54 np0005538513.localdomain podman[315447]: 2025-11-28 10:03:54.507628132 +0000 UTC m=+0.110366453 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:03:54 np0005538513.localdomain podman[315447]: 2025-11-28 10:03:54.552387865 +0000 UTC m=+0.155126226 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:03:54 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:03:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:54.960 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:03:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e139 do_prune osdmap full prune enabled
Nov 28 10:03:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e140 e140: 6 total, 6 up, 6 in
Nov 28 10:03:54 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e140: 6 total, 6 up, 6 in
Nov 28 10:03:55 np0005538513.localdomain ceph-mon[292954]: pgmap v224: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 5.0 KiB/s wr, 76 op/s
Nov 28 10:03:55 np0005538513.localdomain ceph-mon[292954]: osdmap e140: 6 total, 6 up, 6 in
Nov 28 10:03:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:55.391 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:55 np0005538513.localdomain podman[315526]: 
Nov 28 10:03:55 np0005538513.localdomain podman[315526]: 2025-11-28 10:03:55.476860105 +0000 UTC m=+0.116885270 container create d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:03:55 np0005538513.localdomain systemd[1]: Started libpod-conmon-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423.scope.
Nov 28 10:03:55 np0005538513.localdomain podman[315526]: 2025-11-28 10:03:55.429115267 +0000 UTC m=+0.069140482 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:55 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:55 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c9bf28eca558c2a4e7a8c7f2646cc9ee0c26530bb499ee82f943e5f60e7841a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:55 np0005538513.localdomain podman[315526]: 2025-11-28 10:03:55.571056604 +0000 UTC m=+0.211081769 container init d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:03:55 np0005538513.localdomain podman[315526]: 2025-11-28 10:03:55.584921961 +0000 UTC m=+0.224947126 container start d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:03:55 np0005538513.localdomain dnsmasq[315544]: started, version 2.85 cachesize 150
Nov 28 10:03:55 np0005538513.localdomain dnsmasq[315544]: DNS service limited to local subnets
Nov 28 10:03:55 np0005538513.localdomain dnsmasq[315544]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:55 np0005538513.localdomain dnsmasq[315544]: warning: no upstream servers configured
Nov 28 10:03:55 np0005538513.localdomain dnsmasq[315544]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:03:55 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:55.646 261084 INFO neutron.agent.dhcp.agent [None req-96df33e2-8cea-42a0-ae91-fe4e1878310c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:53Z, description=, device_id=83d505e9-2952-4cc1-b961-ffbd511bb38a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65e2b80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65e2fd0>], id=40f1021c-6ae3-41b5-b7d7-1bd4943993da, ip_allocation=immediate, mac_address=fa:16:3e:0a:a8:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['7a3f4fc8-2554-42b9-b3cf-c49e67a0bb09'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:52Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1461, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:54Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:55 np0005538513.localdomain dnsmasq[315544]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:55 np0005538513.localdomain podman[315562]: 2025-11-28 10:03:55.842233174 +0000 UTC m=+0.055630305 container kill d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:03:55 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:55.932 261084 INFO neutron.agent.dhcp.agent [None req-8f336738-3886-4916-8fc2-4fc9156113b8 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:03:56 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:56Z|00240|binding|INFO|Releasing lport d896fe01-471a-407a-a0b3-6d40883262a8 from this chassis (sb_readonly=0)
Nov 28 10:03:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:56.038 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:56 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:56Z|00241|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 down in Southbound
Nov 28 10:03:56 np0005538513.localdomain kernel: device tapd896fe01-47 left promiscuous mode
Nov 28 10:03:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:56.052 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:56.055 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:03:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:56.058 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:56.059 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[adba56c0-8a10-470d-b7c2-8ce4f68718a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:56.061 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.119 261084 INFO neutron.agent.dhcp.agent [None req-10ec3fab-cc80-4fc1-85ca-fff69e63bb87 - - - - - -] DHCP configuration for ports {'40f1021c-6ae3-41b5-b7d7-1bd4943993da'} is completed
Nov 28 10:03:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.554 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:53Z, description=, device_id=83d505e9-2952-4cc1-b961-ffbd511bb38a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66790a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65fdac0>], id=40f1021c-6ae3-41b5-b7d7-1bd4943993da, ip_allocation=immediate, mac_address=fa:16:3e:0a:a8:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['7a3f4fc8-2554-42b9-b3cf-c49e67a0bb09'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:52Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1461, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:54Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:03:56 np0005538513.localdomain systemd[1]: tmp-crun.j64yLf.mount: Deactivated successfully.
Nov 28 10:03:56 np0005538513.localdomain podman[315584]: 2025-11-28 10:03:56.612908468 +0000 UTC m=+0.095721404 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:03:56 np0005538513.localdomain podman[315584]: 2025-11-28 10:03:56.625677014 +0000 UTC m=+0.108489950 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 10:03:56 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:03:56 np0005538513.localdomain dnsmasq[315544]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:03:56 np0005538513.localdomain podman[315619]: 2025-11-28 10:03:56.776489905 +0000 UTC m=+0.073407534 container kill d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 8642adde-54ae-4fc2-b997-bf1962c6c7f1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd896fe01-47 not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1.
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd896fe01-47 not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1.
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:03:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.867 261084 INFO neutron.agent.dhcp.agent [None req-6a56c746-7130-4efb-8c36-7af29a638d45 - - - - - -] DHCP configuration for ports {'40f1021c-6ae3-41b5-b7d7-1bd4943993da'} is completed
Nov 28 10:03:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e140 do_prune osdmap full prune enabled
Nov 28 10:03:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e141 e141: 6 total, 6 up, 6 in
Nov 28 10:03:57 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e141: 6 total, 6 up, 6 in
Nov 28 10:03:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:57.081 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:57 np0005538513.localdomain ceph-mon[292954]: pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 4.4 KiB/s wr, 67 op/s
Nov 28 10:03:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:57.204 261084 INFO neutron.agent.linux.ip_lib [None req-3f3aeafc-19b3-4e93-b7b2-cb95dd628857 - - - - - -] Device tapbe14f038-71 cannot be used as it has no MAC address
Nov 28 10:03:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:57.234 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:57 np0005538513.localdomain kernel: device tapbe14f038-71 entered promiscuous mode
Nov 28 10:03:57 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324237.2440] manager: (tapbe14f038-71): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Nov 28 10:03:57 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:57Z|00242|binding|INFO|Claiming lport be14f038-71a5-4e0d-89f8-2103c3afd8ad for this chassis.
Nov 28 10:03:57 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:57Z|00243|binding|INFO|be14f038-71a5-4e0d-89f8-2103c3afd8ad: Claiming unknown
Nov 28 10:03:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:57.245 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:57.256 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b11f04c88dd4db3aa7f405d125f76a4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=785a40f8-f8b6-4ed9-ab6b-593396055237, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=be14f038-71a5-4e0d-89f8-2103c3afd8ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:57.258 158130 INFO neutron.agent.ovn.metadata.agent [-] Port be14f038-71a5-4e0d-89f8-2103c3afd8ad in datapath a8ba88fa-1415-4e6a-82ae-4f41cdd912f1 bound to our chassis
Nov 28 10:03:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:57.260 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:57.261 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0f69eb90-0c21-4644-9686-d9cff1cafb8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:57 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:57Z|00244|binding|INFO|Setting lport be14f038-71a5-4e0d-89f8-2103c3afd8ad ovn-installed in OVS
Nov 28 10:03:57 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:57Z|00245|binding|INFO|Setting lport be14f038-71a5-4e0d-89f8-2103c3afd8ad up in Southbound
Nov 28 10:03:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:57.282 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:57.336 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:57.376 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:57 np0005538513.localdomain systemd[1]: tmp-crun.6snNtl.mount: Deactivated successfully.
Nov 28 10:03:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e141 do_prune osdmap full prune enabled
Nov 28 10:03:58 np0005538513.localdomain ceph-mon[292954]: osdmap e141: 6 total, 6 up, 6 in
Nov 28 10:03:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e142 e142: 6 total, 6 up, 6 in
Nov 28 10:03:58 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e142: 6 total, 6 up, 6 in
Nov 28 10:03:58 np0005538513.localdomain podman[315697]: 
Nov 28 10:03:58 np0005538513.localdomain podman[315697]: 2025-11-28 10:03:58.359861545 +0000 UTC m=+0.101003165 container create 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:03:58 np0005538513.localdomain systemd[1]: Started libpod-conmon-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b.scope.
Nov 28 10:03:58 np0005538513.localdomain podman[315697]: 2025-11-28 10:03:58.314052022 +0000 UTC m=+0.055193672 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:58 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:58 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b7a3bb605cc17c39a0a6c043199db98db1eab94e3efb6892ac928f5feb4c4b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:58 np0005538513.localdomain podman[315697]: 2025-11-28 10:03:58.431316412 +0000 UTC m=+0.172458032 container init 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:58 np0005538513.localdomain podman[315697]: 2025-11-28 10:03:58.440474344 +0000 UTC m=+0.181615964 container start 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:03:58 np0005538513.localdomain dnsmasq[315715]: started, version 2.85 cachesize 150
Nov 28 10:03:58 np0005538513.localdomain dnsmasq[315715]: DNS service limited to local subnets
Nov 28 10:03:58 np0005538513.localdomain dnsmasq[315715]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:58 np0005538513.localdomain dnsmasq[315715]: warning: no upstream servers configured
Nov 28 10:03:58 np0005538513.localdomain dnsmasq-dhcp[315715]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:03:58 np0005538513.localdomain dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 0 addresses
Nov 28 10:03:58 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host
Nov 28 10:03:58 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts
Nov 28 10:03:58 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.518 261084 INFO neutron.agent.dhcp.agent [None req-a7e52cd8-48cc-42a2-89aa-67ae7618ec92 - - - - - -] Synchronizing state
Nov 28 10:03:58 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.629 261084 INFO neutron.agent.dhcp.agent [None req-958a9933-f9d9-4c72-9e8d-08b93ca9792c - - - - - -] DHCP configuration for ports {'912a3c72-960a-49a4-a4ad-da9ef63c5cb1'} is completed
Nov 28 10:03:58 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.697 261084 INFO neutron.agent.dhcp.agent [None req-3a4c8826-6eeb-4084-b6e5-81cd4ed1d4d8 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:03:58 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.698 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:03:58 np0005538513.localdomain dnsmasq[315544]: exiting on receipt of SIGTERM
Nov 28 10:03:58 np0005538513.localdomain podman[315733]: 2025-11-28 10:03:58.894749081 +0000 UTC m=+0.064619193 container kill d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:03:58 np0005538513.localdomain systemd[1]: libpod-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423.scope: Deactivated successfully.
Nov 28 10:03:58 np0005538513.localdomain podman[315753]: 2025-11-28 10:03:58.980172069 +0000 UTC m=+0.055444620 container died d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 10:03:59 np0005538513.localdomain podman[315753]: 2025-11-28 10:03:59.03500142 +0000 UTC m=+0.110273931 container remove d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:03:59 np0005538513.localdomain systemd[1]: libpod-conmon-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423.scope: Deactivated successfully.
Nov 28 10:03:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:03:59.084 261084 INFO neutron.agent.linux.ip_lib [-] Device tapd896fe01-47 cannot be used as it has no MAC address
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.113 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain kernel: device tapd896fe01-47 entered promiscuous mode
Nov 28 10:03:59 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324239.1223] manager: (tapd896fe01-47): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Nov 28 10:03:59 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:59Z|00246|binding|INFO|Claiming lport d896fe01-471a-407a-a0b3-6d40883262a8 for this chassis.
Nov 28 10:03:59 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:59Z|00247|binding|INFO|d896fe01-471a-407a-a0b3-6d40883262a8: Claiming unknown
Nov 28 10:03:59 np0005538513.localdomain ceph-mon[292954]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 4.3 KiB/s wr, 70 op/s
Nov 28 10:03:59 np0005538513.localdomain ceph-mon[292954]: osdmap e142: 6 total, 6 up, 6 in
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.124 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:59Z|00248|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 ovn-installed in OVS
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.135 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.137 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:59Z|00249|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 up in Southbound
Nov 28 10:03:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:59.143 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:59.145 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:03:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:59.146 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:03:59.148 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bce8c2-bb80-434b-bf2d-6d67eac7f330]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.158 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapd896fe01-47: No such device
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.211 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.245 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1c9bf28eca558c2a4e7a8c7f2646cc9ee0c26530bb499ee82f943e5f60e7841a-merged.mount: Deactivated successfully.
Nov 28 10:03:59 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:59 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:03:59Z|00250|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:03:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:03:59.607 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:00 np0005538513.localdomain podman[315848]: 
Nov 28 10:04:00 np0005538513.localdomain podman[315848]: 2025-11-28 10:04:00.057347735 +0000 UTC m=+0.095760266 container create d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:04:00 np0005538513.localdomain systemd[1]: Started libpod-conmon-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e.scope.
Nov 28 10:04:00 np0005538513.localdomain podman[315848]: 2025-11-28 10:04:00.012581682 +0000 UTC m=+0.050994303 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:00 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:00 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55257e7db66b5e9d4238db462fc123dc4198a1b88b4403abed55e4b023628632/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e142 do_prune osdmap full prune enabled
Nov 28 10:04:00 np0005538513.localdomain podman[315848]: 2025-11-28 10:04:00.134219357 +0000 UTC m=+0.172631878 container init d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:04:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e143 e143: 6 total, 6 up, 6 in
Nov 28 10:04:00 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e143: 6 total, 6 up, 6 in
Nov 28 10:04:00 np0005538513.localdomain podman[315848]: 2025-11-28 10:04:00.148063464 +0000 UTC m=+0.186475995 container start d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:00 np0005538513.localdomain dnsmasq[315866]: started, version 2.85 cachesize 150
Nov 28 10:04:00 np0005538513.localdomain dnsmasq[315866]: DNS service limited to local subnets
Nov 28 10:04:00 np0005538513.localdomain dnsmasq[315866]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:00 np0005538513.localdomain dnsmasq[315866]: warning: no upstream servers configured
Nov 28 10:04:00 np0005538513.localdomain dnsmasq[315866]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.220 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:04:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.220 261084 INFO neutron.agent.dhcp.agent [None req-3a4c8826-6eeb-4084-b6e5-81cd4ed1d4d8 - - - - - -] Synchronizing state complete
Nov 28 10:04:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.337 261084 INFO neutron.agent.dhcp.agent [None req-d5e0c524-6fbe-4131-9826-21f5df2a33b2 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'd896fe01-471a-407a-a0b3-6d40883262a8'} is completed
Nov 28 10:04:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:00.431 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:00 np0005538513.localdomain podman[315884]: 2025-11-28 10:04:00.539595053 +0000 UTC m=+0.065324264 container kill d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:00 np0005538513.localdomain dnsmasq[315866]: exiting on receipt of SIGTERM
Nov 28 10:04:00 np0005538513.localdomain systemd[1]: libpod-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e.scope: Deactivated successfully.
Nov 28 10:04:00 np0005538513.localdomain podman[315899]: 2025-11-28 10:04:00.602868655 +0000 UTC m=+0.047495732 container died d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:04:00 np0005538513.localdomain systemd[1]: tmp-crun.6yvhyr.mount: Deactivated successfully.
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.680 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9a7fb02-edf9-4d21-8122-a7ee5a7b6d25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.677004', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fd9661c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '1877d22c9c5110be366247cc412b88083af02b2e85742d24b0296ca4a4c59229'}]}, 'timestamp': '2025-11-28 10:04:00.681826', '_unique_id': '3097198d14c84247b55de72e90e5d796'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain podman[315899]: 2025-11-28 10:04:00.695768097 +0000 UTC m=+0.140395114 container cleanup d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:04:00 np0005538513.localdomain systemd[1]: libpod-conmon-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e.scope: Deactivated successfully.
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.717 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.718 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '363a2971-e35a-4845-9836-48f56dd4fd03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.684962', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fdf02d4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '6b0f7f773c3376caa1ff08a85f907c5f79c4f259a7c3e43ca125b8bb32c1c3c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.684962', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fdf19ea-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'dd4b1ed66d82d7140c85fdd6b5a996a0f66ffb93e8ca7880509fb944817e7940'}]}, 'timestamp': '2025-11-28 10:04:00.719223', '_unique_id': '6298c8f2775a4e75938624a7eae42d40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain systemd-journald[47227]: Data hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Nov 28 10:04:00 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 10:04:00 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.722 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70676303-5427-4d85-b912-b58f8a549b98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.722465', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fdfb288-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '5231a6837f804074d4cd46742a8d82b4e20d6f1ccdaa339be50745cf9e996b9d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.722465', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fdfcea8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'a43604e8d0c15024d3ffd891e86adf51e2729b4e7f6210327d09a6d4705f8adb'}]}, 'timestamp': '2025-11-28 10:04:00.723821', '_unique_id': '902025e17c9a484cb88f2da95bcf8ad5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain podman[315900]: 2025-11-28 10:04:00.729119213 +0000 UTC m=+0.163902417 container remove d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5357b61f-2aba-4c9a-a601-c78fea703e83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.726997', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe06926-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '80b91536985e85aa7ba4f0c0f0e4ec951e64142dadd95aeea0af86bf5e5cc122'}]}, 'timestamp': '2025-11-28 10:04:00.727817', '_unique_id': '37278e6f4e7947b3888d89ebac401af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84f436f4-4b70-4535-b0a9-f469c7858ed6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.731317', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe10b56-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '1cdb739a9b55d10c9d33194449093bff1d8554c1f2aa2f38fdddfe6609402eb5'}]}, 'timestamp': '2025-11-28 10:04:00.731976', '_unique_id': '22882a474a334cb3b0cd404b65df046e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.734 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.735 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4124fd88-1bdf-4891-a4b9-7937249997cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.735104', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe1a138-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '80dfba672449ed38730979380c745807ba237b3d8abe7ba912e73a539936527e'}]}, 'timestamp': '2025-11-28 10:04:00.735825', '_unique_id': '6bb436499f86421e91e68c80057f6ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.738 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain kernel: device tapd896fe01-47 left promiscuous mode
Nov 28 10:04:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:00Z|00251|binding|INFO|Releasing lport d896fe01-471a-407a-a0b3-6d40883262a8 from this chassis (sb_readonly=0)
Nov 28 10:04:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:00Z|00252|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 down in Southbound
Nov 28 10:04:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:00.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 16370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79354169-df3a-4cb4-948a-a01999f270d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16370000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:04:00.739071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8fe68afe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.939135017, 'message_signature': 'c897e1db75163101dca09f0131d8f45e9af7bd2efbda9bf3ee30c88828bfac30'}]}, 'timestamp': '2025-11-28 10:04:00.767918', '_unique_id': '62feecdfb042478292f2ea6e457da144'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.770 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8e46437-9cb6-46dc-9737-01ac83740a3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:04:00.770637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8fe708c6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.939135017, 'message_signature': '6e8577687f2d022931bfbde3fe5ea703e1b1182027f7b554bb8829d688b97a60'}]}, 'timestamp': '2025-11-28 10:04:00.771163', '_unique_id': '83a07f879d0a4038b81825f51c553fe9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5465045-f2fd-425a-835b-0120814fe254', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.774359', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe79af2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '54e4d09a84a1721f21976f40a4b05f4fe00e192367c13c552b5bfacb5d68d4d8'}]}, 'timestamp': '2025-11-28 10:04:00.774868', '_unique_id': '8aa8fe84b23c4108a3045ad3cf14d694'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:00.778 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:00.776 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:00.778 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:04:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:00.780 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:00.781 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[dfae5901-c08f-4a35-a21f-b7d8315dff63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10e4046a-ccc1-456b-8981-05f848eeef3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.777969', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe82ec2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '4fa842ae6fc2818dfa5d3600652b74f88d39d4c3cfadfe4525a882b02391e6f7'}]}, 'timestamp': '2025-11-28 10:04:00.779410', '_unique_id': '3387accc91594ad7bf1a7bf9fa3b716a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aee721c0-f0fb-4b6f-9582-4242c6bbb5bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.785397', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe948e8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': 'dc887fe0818e1d411378a3f36dde74ac33e6828f4d15abe1fd385ebc1c3ef728'}]}, 'timestamp': '2025-11-28 10:04:00.785912', '_unique_id': '2118d6b9c994488ab1fcdc14041abc40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19e7d889-92ed-402f-bf12-4973b1731198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.788598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fe9c5ac-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'e37d6d61a8e96f7528284c7126c17c07656f82e3371e9802777e18a00db2d56a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.788598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fe9d8c6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '1dec9adff7bfb3927cc4c68ada0f2cf7d36574414001a01df5ba90db559e9653'}]}, 'timestamp': '2025-11-28 10:04:00.789532', '_unique_id': 'de37c780bee94a73a02bf0ab621de784'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5522ee05-e427-4d83-8050-65041aa55054', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.791826', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fec0f88-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': '062acaadb13cdceea3a1d3a62eb11375d3ff198ac403b80dc942ccdeef84004a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.791826', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fec23a6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': 'f5b786fecdc738ee62e2a7c3f07f544db7ab9438c4d4434d000ed023b8e75496'}]}, 'timestamp': '2025-11-28 10:04:00.804555', '_unique_id': '47979b40213b41679ea1dc43571ca7bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.806 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3b57731-e209-467f-863d-7d22cfc2deb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.806959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fec946c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': 'aabcbcc3434eb3affc00fd02d11d72bf27f7e95c3267e85f193d22accc0bf80a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.806959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8feca4f2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': '9f6ffd287a43201de9822ea467198cdec02387c083a34017e7308812391198b6'}]}, 'timestamp': '2025-11-28 10:04:00.807864', '_unique_id': '15b609f400b549c2ae0c0019905b8d43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.810 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.810 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86f86b47-eac6-474f-a60c-926a4ac9dedf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.810206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fed11a8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': 'b0693e9a70bdcdd4d2332a735f7c9ff62c7a1ca8e4a8cb3ec749aced433df470'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.810206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fed2224-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': '5946666589e7e23baeee209357572477219416bac03665e08fe74bed8c7e8ba9'}]}, 'timestamp': '2025-11-28 10:04:00.811092', '_unique_id': '775e15185f824008bf014eabc5d1d629'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0760e97e-f176-4572-af3c-4e23df721124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.813291', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fed8a34-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '3f712eb541707a648ce63a9b3b9bc2ff679f975bf62061ed35dc3601ff30f9d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.813291', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fed9b82-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '419c2fce2df44c9e01f295afe778f863994ed6cec222ea10fd5bff9d82a5fe9b'}]}, 'timestamp': '2025-11-28 10:04:00.814207', '_unique_id': 'ddf308707e9e407c8f1da8e4af5d6d00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.816 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5497ddf7-892a-475b-acad-655bf92e3193', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.816390', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fee0360-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '539917ea0916d2bfdc7c1cd38c329e73024f228fc11cfa1adf90bca0e4f6aa41'}]}, 'timestamp': '2025-11-28 10:04:00.816863', '_unique_id': '3fd8a4f64a8d476cb8b9bb59aed5119f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.819 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.819 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebdca55d-1c50-4ec7-8381-96b0694729ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.819069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fee6c74-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '6ea4e1ce75cf15e6f315923863817910dc781c35e7ea6b3e473900f66a6fd9f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.819069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fee7c8c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '41da1ac54a8a4886a33fbd7953f8819efa9d55a746425210ca87f311601d10ac'}]}, 'timestamp': '2025-11-28 10:04:00.819931', '_unique_id': 'd4c4a15aaa2c41cc9cd11da130e1d8ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.822 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.822 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fea83b9-992d-4b52-98b3-5f82462e73c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.822477', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8feef0f4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'e8dd903a934b37e879d08440d25527993886e90735523d53844c504eb82456d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.822477', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fef02ce-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'a69b54a75edfeb5e6491620d4811ed047a695ae57f0242965e8e371abfedd5c2'}]}, 'timestamp': '2025-11-28 10:04:00.823371', '_unique_id': 'bac0904ac8ee4a599af8ca2faa22feaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fab16d33-0576-4330-891a-b680d60664a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.825884', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fef779a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '1b5661850848126765fd207a37bb1a2b9a2b30c306d90cda8aa3040531a08ae7'}]}, 'timestamp': '2025-11-28 10:04:00.826390', '_unique_id': '37e3d50f88974acf986a113d2ae56734'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a51bf0bd-83ae-4609-979d-4aea874d7513', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.827818', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fefbe3a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '0bfcbeacadc86f14b1e0f3cde4764fc0c063589c31f488272861c565fde55d28'}]}, 'timestamp': '2025-11-28 10:04:00.828151', '_unique_id': '901b04dc751f49f3a43be5583c55a264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:04:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:04:00 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:04:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.953 261084 INFO neutron.agent.dhcp.agent [None req-1bf1e86e-c455-4eb0-8661-53ccfd3c0aa4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.954 261084 INFO neutron.agent.dhcp.agent [None req-1bf1e86e-c455-4eb0-8661-53ccfd3c0aa4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:00 np0005538513.localdomain sshd[315928]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:04:01 np0005538513.localdomain ceph-mon[292954]: pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s rd, 0 B/s wr, 5 op/s
Nov 28 10:04:01 np0005538513.localdomain ceph-mon[292954]: osdmap e143: 6 total, 6 up, 6 in
Nov 28 10:04:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e143 do_prune osdmap full prune enabled
Nov 28 10:04:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e144 e144: 6 total, 6 up, 6 in
Nov 28 10:04:01 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e144: 6 total, 6 up, 6 in
Nov 28 10:04:01 np0005538513.localdomain sshd[315928]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 10:04:01 np0005538513.localdomain sshd[315928]: Connection closed by 218.8.225.25 port 50990
Nov 28 10:04:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-55257e7db66b5e9d4238db462fc123dc4198a1b88b4403abed55e4b023628632-merged.mount: Deactivated successfully.
Nov 28 10:04:01 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:01 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:04:02 np0005538513.localdomain ceph-mon[292954]: osdmap e144: 6 total, 6 up, 6 in
Nov 28 10:04:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:03.179 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:03 np0005538513.localdomain ceph-mon[292954]: pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 4.3 KiB/s wr, 74 op/s
Nov 28 10:04:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:03.970 261084 INFO neutron.agent.linux.ip_lib [None req-4fae51f1-3f1d-4d2d-a9ed-5509cf71f4c9 - - - - - -] Device tap5fa5dbd0-88 cannot be used as it has no MAC address
Nov 28 10:04:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:03.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:04 np0005538513.localdomain kernel: device tap5fa5dbd0-88 entered promiscuous mode
Nov 28 10:04:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:04.002 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:04Z|00253|binding|INFO|Claiming lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 for this chassis.
Nov 28 10:04:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:04Z|00254|binding|INFO|5fa5dbd0-889e-4133-9ba4-6f3810999535: Claiming unknown
Nov 28 10:04:04 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324244.0038] manager: (tap5fa5dbd0-88): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Nov 28 10:04:04 np0005538513.localdomain systemd-udevd[315939]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:04Z|00255|binding|INFO|Setting lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 ovn-installed in OVS
Nov 28 10:04:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:04.015 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:04.035 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:04.041 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device
Nov 28 10:04:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:04.079 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:04.106 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5a:389/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=5fa5dbd0-889e-4133-9ba4-6f3810999535) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:04.108 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 5fa5dbd0-889e-4133-9ba4-6f3810999535 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:04:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:04.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:04.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:04.113 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:04.112 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d682f522-f29b-4622-8e70-70d3419d2988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:04Z|00256|binding|INFO|Setting lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 up in Southbound
Nov 28 10:04:04 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2743022339' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:04 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2743022339' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:04.965 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:04Z, description=, device_id=d3e54e1b-bd42-4183-a535-29739c2a7728, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65fda00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd663c340>], id=7a77d608-8675-459d-a174-8748cdb9bb12, ip_allocation=immediate, mac_address=fa:16:3e:39:10:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:54Z, description=, dns_domain=, id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-775491249-network, port_security_enabled=True, project_id=6b11f04c88dd4db3aa7f405d125f76a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10427, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1467, status=ACTIVE, subnets=['6b4c8b66-460b-444a-9b80-e4ad139c626d'], tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, updated_at=2025-11-28T10:03:56Z, vlan_transparent=None, network_id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, port_security_enabled=False, project_id=6b11f04c88dd4db3aa7f405d125f76a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1509, status=DOWN, tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, updated_at=2025-11-28T10:04:04Z on network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1
Nov 28 10:04:04 np0005538513.localdomain podman[316010]: 
Nov 28 10:04:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e144 do_prune osdmap full prune enabled
Nov 28 10:04:04 np0005538513.localdomain podman[316010]: 2025-11-28 10:04:04.995224774 +0000 UTC m=+0.102979142 container create 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:04:05 np0005538513.localdomain podman[316010]: 2025-11-28 10:04:04.941635189 +0000 UTC m=+0.049389637 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:05 np0005538513.localdomain systemd[1]: Started libpod-conmon-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170.scope.
Nov 28 10:04:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:05.060 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:05.062 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:05 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:05.065 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:05.066 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:05.067 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea454cf-d354-42d8-9977-580570a5d945]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:05 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aac4355a69b01ddb27ef64e3ac236aecb4f69adf3422aae8cccf0b7119eb8998/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:05 np0005538513.localdomain podman[316010]: 2025-11-28 10:04:05.080725514 +0000 UTC m=+0.188479882 container init 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:05 np0005538513.localdomain podman[316010]: 2025-11-28 10:04:05.090291398 +0000 UTC m=+0.198045756 container start 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:04:05 np0005538513.localdomain dnsmasq[316038]: started, version 2.85 cachesize 150
Nov 28 10:04:05 np0005538513.localdomain dnsmasq[316038]: DNS service limited to local subnets
Nov 28 10:04:05 np0005538513.localdomain dnsmasq[316038]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:05 np0005538513.localdomain dnsmasq[316038]: warning: no upstream servers configured
Nov 28 10:04:05 np0005538513.localdomain dnsmasq[316038]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:05 np0005538513.localdomain podman[316045]: 2025-11-28 10:04:05.226129811 +0000 UTC m=+0.047014378 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:04:05 np0005538513.localdomain dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 1 addresses
Nov 28 10:04:05 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host
Nov 28 10:04:05 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts
Nov 28 10:04:05 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:05.253 261084 INFO neutron.agent.dhcp.agent [None req-6e76fc09-aae3-4c05-a37b-9cb66dcf8ec4 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:04:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e145 e145: 6 total, 6 up, 6 in
Nov 28 10:04:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e145: 6 total, 6 up, 6 in
Nov 28 10:04:05 np0005538513.localdomain ceph-mon[292954]: pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Nov 28 10:04:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:05.472 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:05 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:05.504 261084 INFO neutron.agent.dhcp.agent [None req-f08d02d1-3b77-4ea9-9ab4-c899e7c4845b - - - - - -] DHCP configuration for ports {'7a77d608-8675-459d-a174-8748cdb9bb12'} is completed
Nov 28 10:04:05 np0005538513.localdomain dnsmasq[316038]: exiting on receipt of SIGTERM
Nov 28 10:04:05 np0005538513.localdomain podman[316083]: 2025-11-28 10:04:05.517903721 +0000 UTC m=+0.097760512 container kill 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:05 np0005538513.localdomain systemd[1]: libpod-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170.scope: Deactivated successfully.
Nov 28 10:04:05 np0005538513.localdomain podman[316098]: 2025-11-28 10:04:05.579775434 +0000 UTC m=+0.041053388 container died 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:05 np0005538513.localdomain podman[316098]: 2025-11-28 10:04:05.633763331 +0000 UTC m=+0.095041265 container remove 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:05 np0005538513.localdomain systemd[1]: libpod-conmon-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170.scope: Deactivated successfully.
Nov 28 10:04:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-aac4355a69b01ddb27ef64e3ac236aecb4f69adf3422aae8cccf0b7119eb8998-merged.mount: Deactivated successfully.
Nov 28 10:04:06 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:06 np0005538513.localdomain ceph-mon[292954]: osdmap e145: 6 total, 6 up, 6 in
Nov 28 10:04:07 np0005538513.localdomain podman[316174]: 
Nov 28 10:04:07 np0005538513.localdomain podman[316174]: 2025-11-28 10:04:07.088254087 +0000 UTC m=+0.092321697 container create 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:04:07 np0005538513.localdomain systemd[1]: Started libpod-conmon-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44.scope.
Nov 28 10:04:07 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:07 np0005538513.localdomain podman[316174]: 2025-11-28 10:04:07.042399193 +0000 UTC m=+0.046466823 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:07 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e9ba0aa3684c459cfc81f6cc61b5ee805300d4fb3d8ef243c9434b8e324e32f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:07 np0005538513.localdomain podman[316174]: 2025-11-28 10:04:07.153873438 +0000 UTC m=+0.157941038 container init 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 10:04:07 np0005538513.localdomain podman[316174]: 2025-11-28 10:04:07.164705768 +0000 UTC m=+0.168773388 container start 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:07 np0005538513.localdomain dnsmasq[316192]: started, version 2.85 cachesize 150
Nov 28 10:04:07 np0005538513.localdomain dnsmasq[316192]: DNS service limited to local subnets
Nov 28 10:04:07 np0005538513.localdomain dnsmasq[316192]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:07 np0005538513.localdomain dnsmasq[316192]: warning: no upstream servers configured
Nov 28 10:04:07 np0005538513.localdomain dnsmasq-dhcp[316192]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:07 np0005538513.localdomain dnsmasq[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:07 np0005538513.localdomain dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:07 np0005538513.localdomain dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:07 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.228 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:04Z, description=, device_id=d3e54e1b-bd42-4183-a535-29739c2a7728, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64c7ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64c7070>], id=7a77d608-8675-459d-a174-8748cdb9bb12, ip_allocation=immediate, mac_address=fa:16:3e:39:10:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:54Z, description=, dns_domain=, id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-775491249-network, port_security_enabled=True, project_id=6b11f04c88dd4db3aa7f405d125f76a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10427, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1467, status=ACTIVE, subnets=['6b4c8b66-460b-444a-9b80-e4ad139c626d'], tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, updated_at=2025-11-28T10:03:56Z, vlan_transparent=None, network_id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, port_security_enabled=False, project_id=6b11f04c88dd4db3aa7f405d125f76a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1509, status=DOWN, tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, updated_at=2025-11-28T10:04:04Z on network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1
Nov 28 10:04:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e145 do_prune osdmap full prune enabled
Nov 28 10:04:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e146 e146: 6 total, 6 up, 6 in
Nov 28 10:04:07 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e146: 6 total, 6 up, 6 in
Nov 28 10:04:07 np0005538513.localdomain ceph-mon[292954]: pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Nov 28 10:04:07 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.374 261084 INFO neutron.agent.dhcp.agent [None req-85d6ca77-7fb6-4a0a-93e2-f999f3b259aa - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:07 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:07.425 2 INFO neutron.agent.securitygroups_rpc [None req-f8d4b801-af07-4edf-8fd0-12384366c126 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:07 np0005538513.localdomain podman[316209]: 2025-11-28 10:04:07.458379702 +0000 UTC m=+0.063713676 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:04:07 np0005538513.localdomain dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 1 addresses
Nov 28 10:04:07 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host
Nov 28 10:04:07 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts
Nov 28 10:04:07 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.538 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:06Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd665ca30>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6659610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd665c610>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd665c9d0>], id=8ea63c52-cb39-4f5e-9330-64cd47bf74a1, ip_allocation=immediate, mac_address=fa:16:3e:06:3f:d1, name=tempest-NetworksTestDHCPv6-151675552, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['24e3dfe7-e849-45dc-a899-cccb948411be', '82448f9b-7246-4f52-8ff3-d912fa5f48b9'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:03Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1511, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:07Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:07 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.722 261084 INFO neutron.agent.dhcp.agent [None req-e69ac829-3f70-44eb-8227-a20f5e2fce20 - - - - - -] DHCP configuration for ports {'7a77d608-8675-459d-a174-8748cdb9bb12'} is completed
Nov 28 10:04:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:04:07 np0005538513.localdomain podman[316248]: 2025-11-28 10:04:07.764793233 +0000 UTC m=+0.058324033 container kill 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:04:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:04:07 np0005538513.localdomain dnsmasq[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:04:07 np0005538513.localdomain dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:07 np0005538513.localdomain dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:07 np0005538513.localdomain podman[316263]: 2025-11-28 10:04:07.870649256 +0000 UTC m=+0.093551162 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:04:07 np0005538513.localdomain podman[316263]: 2025-11-28 10:04:07.883442112 +0000 UTC m=+0.106343988 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:07 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:04:07 np0005538513.localdomain podman[316262]: 2025-11-28 10:04:07.969146418 +0000 UTC m=+0.195083861 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:04:07 np0005538513.localdomain podman[316262]: 2025-11-28 10:04:07.976819708 +0000 UTC m=+0.202757221 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:04:07 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:04:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:08.076 261084 INFO neutron.agent.dhcp.agent [None req-4bf49508-5b67-4b74-9817-81b475360309 - - - - - -] DHCP configuration for ports {'8ea63c52-cb39-4f5e-9330-64cd47bf74a1'} is completed
Nov 28 10:04:08 np0005538513.localdomain ceph-mon[292954]: osdmap e146: 6 total, 6 up, 6 in
Nov 28 10:04:09 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:09.207 2 INFO neutron.agent.securitygroups_rpc [None req-f248108c-11ce-43fd-804f-455d486d1048 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e146 do_prune osdmap full prune enabled
Nov 28 10:04:09 np0005538513.localdomain ceph-mon[292954]: pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 4.6 KiB/s wr, 103 op/s
Nov 28 10:04:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 e147: 6 total, 6 up, 6 in
Nov 28 10:04:09 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e147: 6 total, 6 up, 6 in
Nov 28 10:04:09 np0005538513.localdomain dnsmasq[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:09 np0005538513.localdomain podman[316330]: 2025-11-28 10:04:09.526929105 +0000 UTC m=+0.061403091 container kill 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:04:09 np0005538513.localdomain dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:09 np0005538513.localdomain dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:04:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:04:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:04:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159335 "" "Go-http-client/1.1"
Nov 28 10:04:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:04:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20201 "" "Go-http-client/1.1"
Nov 28 10:04:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:04:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:04:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:10 np0005538513.localdomain ceph-mon[292954]: osdmap e147: 6 total, 6 up, 6 in
Nov 28 10:04:10 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:10 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:10.474 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:04:10 np0005538513.localdomain dnsmasq[316192]: exiting on receipt of SIGTERM
Nov 28 10:04:10 np0005538513.localdomain podman[316368]: 2025-11-28 10:04:10.940944532 +0000 UTC m=+0.068155835 container kill 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:10 np0005538513.localdomain systemd[1]: libpod-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44.scope: Deactivated successfully.
Nov 28 10:04:11 np0005538513.localdomain podman[316382]: 2025-11-28 10:04:11.025880035 +0000 UTC m=+0.065576810 container died 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:04:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:11 np0005538513.localdomain podman[316382]: 2025-11-28 10:04:11.063277406 +0000 UTC m=+0.102974091 container cleanup 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:11 np0005538513.localdomain systemd[1]: libpod-conmon-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44.scope: Deactivated successfully.
Nov 28 10:04:11 np0005538513.localdomain podman[316384]: 2025-11-28 10:04:11.102948153 +0000 UTC m=+0.135971136 container remove 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:11 np0005538513.localdomain ceph-mon[292954]: pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 1.7 KiB/s wr, 55 op/s
Nov 28 10:04:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-2e9ba0aa3684c459cfc81f6cc61b5ee805300d4fb3d8ef243c9434b8e324e32f-merged.mount: Deactivated successfully.
Nov 28 10:04:12 np0005538513.localdomain podman[316457]: 
Nov 28 10:04:12 np0005538513.localdomain podman[316457]: 2025-11-28 10:04:12.026226551 +0000 UTC m=+0.106100661 container create df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:12 np0005538513.localdomain systemd[1]: Started libpod-conmon-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a.scope.
Nov 28 10:04:12 np0005538513.localdomain podman[316457]: 2025-11-28 10:04:11.980512232 +0000 UTC m=+0.060386382 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:12 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:12 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a2ec9c6c2154e417c331c77cab1b1300ce0ef76468d7987ca196d76462339c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:12 np0005538513.localdomain podman[316457]: 2025-11-28 10:04:12.109461657 +0000 UTC m=+0.189335767 container init df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:04:12 np0005538513.localdomain podman[316457]: 2025-11-28 10:04:12.118638689 +0000 UTC m=+0.198512799 container start df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:12 np0005538513.localdomain dnsmasq[316476]: started, version 2.85 cachesize 150
Nov 28 10:04:12 np0005538513.localdomain dnsmasq[316476]: DNS service limited to local subnets
Nov 28 10:04:12 np0005538513.localdomain dnsmasq[316476]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:12 np0005538513.localdomain dnsmasq[316476]: warning: no upstream servers configured
Nov 28 10:04:12 np0005538513.localdomain dnsmasq-dhcp[316476]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:12 np0005538513.localdomain dnsmasq[316476]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:12 np0005538513.localdomain dnsmasq-dhcp[316476]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:12 np0005538513.localdomain dnsmasq-dhcp[316476]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:12 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:12.439 261084 INFO neutron.agent.dhcp.agent [None req-fb529359-f6e2-46f3-abe3-e10e5e012b18 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:12 np0005538513.localdomain dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 0 addresses
Nov 28 10:04:12 np0005538513.localdomain podman[316503]: 2025-11-28 10:04:12.533207218 +0000 UTC m=+0.064933341 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:04:12 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host
Nov 28 10:04:12 np0005538513.localdomain dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts
Nov 28 10:04:12 np0005538513.localdomain dnsmasq[316476]: exiting on receipt of SIGTERM
Nov 28 10:04:12 np0005538513.localdomain podman[316522]: 2025-11-28 10:04:12.591946861 +0000 UTC m=+0.055751698 container kill df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:04:12 np0005538513.localdomain systemd[1]: libpod-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a.scope: Deactivated successfully.
Nov 28 10:04:12 np0005538513.localdomain podman[316541]: 2025-11-28 10:04:12.6672927 +0000 UTC m=+0.060708470 container died df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:04:12 np0005538513.localdomain podman[316541]: 2025-11-28 10:04:12.703972341 +0000 UTC m=+0.097388061 container cleanup df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:04:12 np0005538513.localdomain systemd[1]: libpod-conmon-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a.scope: Deactivated successfully.
Nov 28 10:04:12 np0005538513.localdomain podman[316543]: 2025-11-28 10:04:12.757160565 +0000 UTC m=+0.143098871 container remove df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:12Z|00257|binding|INFO|Releasing lport be14f038-71a5-4e0d-89f8-2103c3afd8ad from this chassis (sb_readonly=0)
Nov 28 10:04:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:12.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:12 np0005538513.localdomain kernel: device tapbe14f038-71 left promiscuous mode
Nov 28 10:04:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:12Z|00258|binding|INFO|Setting lport be14f038-71a5-4e0d-89f8-2103c3afd8ad down in Southbound
Nov 28 10:04:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:12.789 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b11f04c88dd4db3aa7f405d125f76a4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=785a40f8-f8b6-4ed9-ab6b-593396055237, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=be14f038-71a5-4e0d-89f8-2103c3afd8ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:12.790 158130 INFO neutron.agent.ovn.metadata.agent [-] Port be14f038-71a5-4e0d-89f8-2103c3afd8ad in datapath a8ba88fa-1415-4e6a-82ae-4f41cdd912f1 unbound from our chassis
Nov 28 10:04:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:12.793 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:12.794 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[deb68c85-54f9-4365-a241-48960eb76783]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:12.794 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1a2ec9c6c2154e417c331c77cab1b1300ce0ef76468d7987ca196d76462339c1-merged.mount: Deactivated successfully.
Nov 28 10:04:12 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:13 np0005538513.localdomain ceph-mon[292954]: pgmap v241: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 3.6 KiB/s wr, 106 op/s
Nov 28 10:04:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2186460897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2186460897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:13.941 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:13.942 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:13.945 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:13.945 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:13 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:13.946 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f393b2-272e-4e85-adbe-32d593e0a9b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:14 np0005538513.localdomain podman[316629]: 
Nov 28 10:04:14 np0005538513.localdomain podman[316629]: 2025-11-28 10:04:14.400972747 +0000 UTC m=+0.069766360 container create a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:14 np0005538513.localdomain systemd[1]: Started libpod-conmon-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa.scope.
Nov 28 10:04:14 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:14 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61899381419cb9e9ab4e8e7d65a63a600b6a344482cd29fb8ea61037f050100c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:14 np0005538513.localdomain podman[316629]: 2025-11-28 10:04:14.374762085 +0000 UTC m=+0.043555748 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:14 np0005538513.localdomain podman[316629]: 2025-11-28 10:04:14.48100599 +0000 UTC m=+0.149799593 container init a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:04:14 np0005538513.localdomain podman[316629]: 2025-11-28 10:04:14.490522773 +0000 UTC m=+0.159316386 container start a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:04:14 np0005538513.localdomain dnsmasq[316647]: started, version 2.85 cachesize 150
Nov 28 10:04:14 np0005538513.localdomain dnsmasq[316647]: DNS service limited to local subnets
Nov 28 10:04:14 np0005538513.localdomain dnsmasq[316647]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:14 np0005538513.localdomain dnsmasq[316647]: warning: no upstream servers configured
Nov 28 10:04:14 np0005538513.localdomain dnsmasq-dhcp[316647]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:04:14 np0005538513.localdomain dnsmasq[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:14 np0005538513.localdomain dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:14 np0005538513.localdomain dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:14.746 261084 INFO neutron.agent.dhcp.agent [None req-88f5cbd4-a475-472e-829e-0920b4855d51 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:14Z|00259|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:04:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:14.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:14 np0005538513.localdomain dnsmasq[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:14 np0005538513.localdomain dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:14 np0005538513.localdomain dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:14 np0005538513.localdomain podman[316663]: 2025-11-28 10:04:14.87396668 +0000 UTC m=+0.067892457 container kill a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:04:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 do_prune osdmap full prune enabled
Nov 28 10:04:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 e148: 6 total, 6 up, 6 in
Nov 28 10:04:15 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e148: 6 total, 6 up, 6 in
Nov 28 10:04:15 np0005538513.localdomain dnsmasq[315715]: exiting on receipt of SIGTERM
Nov 28 10:04:15 np0005538513.localdomain podman[316701]: 2025-11-28 10:04:15.242066617 +0000 UTC m=+0.060097843 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:15 np0005538513.localdomain systemd[1]: libpod-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b.scope: Deactivated successfully.
Nov 28 10:04:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:15.312 261084 INFO neutron.agent.dhcp.agent [None req-dc5e7cc5-9842-4dcf-9c74-6d14e6032755 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:15 np0005538513.localdomain podman[316715]: 2025-11-28 10:04:15.315780709 +0000 UTC m=+0.056524340 container died 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:04:15 np0005538513.localdomain podman[316715]: 2025-11-28 10:04:15.349856645 +0000 UTC m=+0.090600206 container cleanup 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:04:15 np0005538513.localdomain systemd[1]: libpod-conmon-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b.scope: Deactivated successfully.
Nov 28 10:04:15 np0005538513.localdomain podman[316716]: 2025-11-28 10:04:15.397435619 +0000 UTC m=+0.133016152 container remove 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3b7a3bb605cc17c39a0a6c043199db98db1eab94e3efb6892ac928f5feb4c4b2-merged.mount: Deactivated successfully.
Nov 28 10:04:15 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:15 np0005538513.localdomain ceph-mon[292954]: pgmap v242: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 2.9 KiB/s wr, 85 op/s
Nov 28 10:04:15 np0005538513.localdomain ceph-mon[292954]: osdmap e148: 6 total, 6 up, 6 in
Nov 28 10:04:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:15.504 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:15 np0005538513.localdomain systemd[1]: tmp-crun.OZ1zaR.mount: Deactivated successfully.
Nov 28 10:04:15 np0005538513.localdomain dnsmasq[316647]: exiting on receipt of SIGTERM
Nov 28 10:04:15 np0005538513.localdomain podman[316761]: 2025-11-28 10:04:15.656961696 +0000 UTC m=+0.079747916 container kill a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:04:15 np0005538513.localdomain systemd[1]: libpod-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa.scope: Deactivated successfully.
Nov 28 10:04:15 np0005538513.localdomain podman[316775]: 2025-11-28 10:04:15.733391605 +0000 UTC m=+0.060825923 container died a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:15 np0005538513.localdomain podman[316775]: 2025-11-28 10:04:15.764995082 +0000 UTC m=+0.092429360 container cleanup a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:15.766 261084 INFO neutron.agent.dhcp.agent [None req-0c0eeedf-cde3-419e-bb30-183a07fa525f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:15 np0005538513.localdomain systemd[1]: libpod-conmon-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa.scope: Deactivated successfully.
Nov 28 10:04:15 np0005538513.localdomain podman[316777]: 2025-11-28 10:04:15.823161348 +0000 UTC m=+0.142807453 container remove a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:04:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:15.894 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:16.316 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-61899381419cb9e9ab4e8e7d65a63a600b6a344482cd29fb8ea61037f050100c-merged.mount: Deactivated successfully.
Nov 28 10:04:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:16 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2da8ba88fa\x2d1415\x2d4e6a\x2d82ae\x2d4f41cdd912f1.mount: Deactivated successfully.
Nov 28 10:04:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:17.263 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:17.266 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:17.268 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:17.268 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:17.269 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a32cc698-200b-40d9-9a66-875cd064f557]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:17 np0005538513.localdomain ceph-mon[292954]: pgmap v244: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.6 KiB/s wr, 43 op/s
Nov 28 10:04:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:04:17 np0005538513.localdomain systemd[1]: tmp-crun.1XsHMz.mount: Deactivated successfully.
Nov 28 10:04:17 np0005538513.localdomain podman[316829]: 2025-11-28 10:04:17.857750527 +0000 UTC m=+0.093087799 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Nov 28 10:04:17 np0005538513.localdomain podman[316829]: 2025-11-28 10:04:17.900620515 +0000 UTC m=+0.135957787 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:04:17 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:04:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:04:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:04:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:04:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:04:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:04:18 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:18.312 2 INFO neutron.agent.securitygroups_rpc [None req-5939fc78-1573-4689-b1d7-9426dbeeb10b 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:18 np0005538513.localdomain podman[316876]: 
Nov 28 10:04:18 np0005538513.localdomain podman[316876]: 2025-11-28 10:04:18.374201775 +0000 UTC m=+0.097734902 container create ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:18 np0005538513.localdomain systemd[1]: Started libpod-conmon-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2.scope.
Nov 28 10:04:18 np0005538513.localdomain podman[316876]: 2025-11-28 10:04:18.327140156 +0000 UTC m=+0.050673323 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:18 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:18 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b82bff0f99f41361d4047b7d9eb46cf6ab0cee59ade8c6d5a2773ccc718f27d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:18 np0005538513.localdomain podman[316876]: 2025-11-28 10:04:18.454153606 +0000 UTC m=+0.177686733 container init ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:04:18 np0005538513.localdomain podman[316876]: 2025-11-28 10:04:18.464141342 +0000 UTC m=+0.187674469 container start ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:18 np0005538513.localdomain dnsmasq[316894]: started, version 2.85 cachesize 150
Nov 28 10:04:18 np0005538513.localdomain dnsmasq[316894]: DNS service limited to local subnets
Nov 28 10:04:18 np0005538513.localdomain dnsmasq[316894]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:18 np0005538513.localdomain dnsmasq[316894]: warning: no upstream servers configured
Nov 28 10:04:18 np0005538513.localdomain dnsmasq-dhcp[316894]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:18 np0005538513.localdomain dnsmasq-dhcp[316894]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:04:18 np0005538513.localdomain dnsmasq[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:18 np0005538513.localdomain dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:18 np0005538513.localdomain dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:18 np0005538513.localdomain ceph-mon[292954]: pgmap v245: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 1.6 KiB/s wr, 42 op/s
Nov 28 10:04:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:18.553 261084 INFO neutron.agent.dhcp.agent [None req-a87e3802-90fd-4535-aae0-7e569047fd7f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:17Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd669b3d0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd669ba60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd67d3880>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6fa25e0>], id=b40a20d3-aa39-40cf-a7d0-da0fcde9a098, ip_allocation=immediate, mac_address=fa:16:3e:7e:0c:8c, name=tempest-NetworksTestDHCPv6-869146364, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['00f69316-162e-439e-a6c1-9cd70f0d853b', 'e92409df-693c-4ecf-a085-6f53220a2361'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:14Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1580, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:18Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:18.735 261084 INFO neutron.agent.dhcp.agent [None req-1e745301-6ff5-4cd7-942b-a11ad0bca8bd - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:18 np0005538513.localdomain dnsmasq[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:04:18 np0005538513.localdomain podman[316913]: 2025-11-28 10:04:18.824261381 +0000 UTC m=+0.066119816 container kill ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:18 np0005538513.localdomain dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:18 np0005538513.localdomain dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:19 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:19.182 261084 INFO neutron.agent.dhcp.agent [None req-0f8ba0e6-6c52-45c0-8e2d-67bbdce3e3dd - - - - - -] DHCP configuration for ports {'b40a20d3-aa39-40cf-a7d0-da0fcde9a098'} is completed
Nov 28 10:04:19 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:19.189 2 INFO neutron.agent.securitygroups_rpc [None req-582a65ec-d5d3-451f-981d-4b6bb2c1b94e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:19 np0005538513.localdomain dnsmasq[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:19 np0005538513.localdomain dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:19 np0005538513.localdomain podman[316952]: 2025-11-28 10:04:19.429237057 +0000 UTC m=+0.065815258 container kill ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:04:19 np0005538513.localdomain dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:19 np0005538513.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 10:04:19 np0005538513.localdomain dnsmasq[316894]: exiting on receipt of SIGTERM
Nov 28 10:04:19 np0005538513.localdomain podman[316993]: 2025-11-28 10:04:19.964262497 +0000 UTC m=+0.072444337 container kill ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:04:19 np0005538513.localdomain systemd[1]: libpod-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2.scope: Deactivated successfully.
Nov 28 10:04:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:20 np0005538513.localdomain sudo[317005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:04:20 np0005538513.localdomain sudo[317005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:04:20 np0005538513.localdomain sudo[317005]: pam_unix(sudo:session): session closed for user root
Nov 28 10:04:20 np0005538513.localdomain podman[317019]: 2025-11-28 10:04:20.051335622 +0000 UTC m=+0.064865710 container died ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:04:20 np0005538513.localdomain systemd[1]: tmp-crun.H1o6si.mount: Deactivated successfully.
Nov 28 10:04:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1b82bff0f99f41361d4047b7d9eb46cf6ab0cee59ade8c6d5a2773ccc718f27d-merged.mount: Deactivated successfully.
Nov 28 10:04:20 np0005538513.localdomain podman[317019]: 2025-11-28 10:04:20.110398024 +0000 UTC m=+0.123928072 container remove ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:20 np0005538513.localdomain systemd[1]: libpod-conmon-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2.scope: Deactivated successfully.
Nov 28 10:04:20 np0005538513.localdomain sudo[317048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:04:20 np0005538513.localdomain sudo[317048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:04:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:20.508 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:20 np0005538513.localdomain sudo[317048]: pam_unix(sudo:session): session closed for user root
Nov 28 10:04:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:20.886 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:20.888 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:20.891 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:20.891 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:20.892 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[86fc169a-647c-46e9-a60b-735d11464c26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:04:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:04:21 np0005538513.localdomain podman[317148]: 
Nov 28 10:04:21 np0005538513.localdomain ceph-mon[292954]: pgmap v246: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.3 KiB/s wr, 35 op/s
Nov 28 10:04:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:04:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:04:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:04:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:04:21 np0005538513.localdomain podman[317148]: 2025-11-28 10:04:21.025712551 +0000 UTC m=+0.095889648 container create 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:04:21 np0005538513.localdomain systemd[1]: Started libpod-conmon-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85.scope.
Nov 28 10:04:21 np0005538513.localdomain podman[317148]: 2025-11-28 10:04:20.979536058 +0000 UTC m=+0.049713195 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:21 np0005538513.localdomain sudo[317161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:04:21 np0005538513.localdomain sudo[317161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:04:21 np0005538513.localdomain sudo[317161]: pam_unix(sudo:session): session closed for user root
Nov 28 10:04:21 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:21 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ee3114bc9472005d8cdf6f03db49c7ae6c8046cdba7b397d3300bce3c362549/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:21 np0005538513.localdomain podman[317148]: 2025-11-28 10:04:21.131350418 +0000 UTC m=+0.201527525 container init 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:04:21 np0005538513.localdomain podman[317148]: 2025-11-28 10:04:21.140794749 +0000 UTC m=+0.210971856 container start 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:04:21 np0005538513.localdomain dnsmasq[317185]: started, version 2.85 cachesize 150
Nov 28 10:04:21 np0005538513.localdomain dnsmasq[317185]: DNS service limited to local subnets
Nov 28 10:04:21 np0005538513.localdomain dnsmasq[317185]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:21 np0005538513.localdomain dnsmasq[317185]: warning: no upstream servers configured
Nov 28 10:04:21 np0005538513.localdomain dnsmasq-dhcp[317185]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:04:21 np0005538513.localdomain dnsmasq[317185]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:21 np0005538513.localdomain dnsmasq-dhcp[317185]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:21 np0005538513.localdomain dnsmasq-dhcp[317185]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:21 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:21.269 2 INFO neutron.agent.securitygroups_rpc [None req-2bb5c88d-1a1c-4245-b22f-59b37a9a0aaf 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:21.333 261084 INFO neutron.agent.dhcp.agent [None req-5a2acf10-c36b-4973-ac0a-755c4942cdd0 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:21 np0005538513.localdomain podman[317204]: 2025-11-28 10:04:21.500562948 +0000 UTC m=+0.069462042 container kill 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:04:21 np0005538513.localdomain dnsmasq[317185]: exiting on receipt of SIGTERM
Nov 28 10:04:21 np0005538513.localdomain systemd[1]: libpod-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85.scope: Deactivated successfully.
Nov 28 10:04:21 np0005538513.localdomain podman[317222]: 2025-11-28 10:04:21.575400072 +0000 UTC m=+0.049946762 container died 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:04:21 np0005538513.localdomain podman[317222]: 2025-11-28 10:04:21.621620366 +0000 UTC m=+0.096167046 container remove 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:21 np0005538513.localdomain systemd[1]: libpod-conmon-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85.scope: Deactivated successfully.
Nov 28 10:04:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:04:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7ee3114bc9472005d8cdf6f03db49c7ae6c8046cdba7b397d3300bce3c362549-merged.mount: Deactivated successfully.
Nov 28 10:04:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:22 np0005538513.localdomain podman[317245]: 2025-11-28 10:04:22.099918751 +0000 UTC m=+0.084721488 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:04:22 np0005538513.localdomain podman[317245]: 2025-11-28 10:04:22.114549471 +0000 UTC m=+0.099352208 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:04:22 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:04:22 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:22.613 2 INFO neutron.agent.securitygroups_rpc [None req-038c15bb-00b9-42a6-bcc5-a72acb379335 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:23 np0005538513.localdomain ceph-mon[292954]: pgmap v247: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:23.081 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:23.085 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:23.088 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:23.088 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:23 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:23.089 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[71bd525c-a759-4974-83f1-eccbcfd83c66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:23 np0005538513.localdomain podman[317319]: 
Nov 28 10:04:23 np0005538513.localdomain podman[317319]: 2025-11-28 10:04:23.116814399 +0000 UTC m=+0.111829185 container create 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:23 np0005538513.localdomain systemd[1]: Started libpod-conmon-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62.scope.
Nov 28 10:04:23 np0005538513.localdomain podman[317319]: 2025-11-28 10:04:23.070319017 +0000 UTC m=+0.065333823 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:23 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:23 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8494eb7c4c1eacaab847e2fba5c6ba8e96cf0679e520f61c5dec5de3bf11885/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:23 np0005538513.localdomain podman[317319]: 2025-11-28 10:04:23.197112821 +0000 UTC m=+0.192127637 container init 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:23 np0005538513.localdomain podman[317319]: 2025-11-28 10:04:23.206080347 +0000 UTC m=+0.201095123 container start 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:23 np0005538513.localdomain dnsmasq[317337]: started, version 2.85 cachesize 150
Nov 28 10:04:23 np0005538513.localdomain dnsmasq[317337]: DNS service limited to local subnets
Nov 28 10:04:23 np0005538513.localdomain dnsmasq[317337]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:23 np0005538513.localdomain dnsmasq[317337]: warning: no upstream servers configured
Nov 28 10:04:23 np0005538513.localdomain dnsmasq-dhcp[317337]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:23 np0005538513.localdomain dnsmasq[317337]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:23 np0005538513.localdomain dnsmasq-dhcp[317337]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:23 np0005538513.localdomain dnsmasq-dhcp[317337]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:23 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:23.577 261084 INFO neutron.agent.dhcp.agent [None req-b9b7d2dd-3c6e-4de4-a296-d0f17396a2d3 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:23 np0005538513.localdomain dnsmasq[317337]: exiting on receipt of SIGTERM
Nov 28 10:04:23 np0005538513.localdomain podman[317355]: 2025-11-28 10:04:23.813207544 +0000 UTC m=+0.065080206 container kill 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:23 np0005538513.localdomain systemd[1]: libpod-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62.scope: Deactivated successfully.
Nov 28 10:04:23 np0005538513.localdomain podman[317367]: 2025-11-28 10:04:23.893804443 +0000 UTC m=+0.064967212 container died 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:23 np0005538513.localdomain podman[317367]: 2025-11-28 10:04:23.92823913 +0000 UTC m=+0.099401869 container cleanup 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:23 np0005538513.localdomain systemd[1]: libpod-conmon-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62.scope: Deactivated successfully.
Nov 28 10:04:23 np0005538513.localdomain podman[317369]: 2025-11-28 10:04:23.982739552 +0000 UTC m=+0.142408702 container remove 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 10:04:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f8494eb7c4c1eacaab847e2fba5c6ba8e96cf0679e520f61c5dec5de3bf11885-merged.mount: Deactivated successfully.
Nov 28 10:04:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:04:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:04:24 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:24.856 2 INFO neutron.agent.securitygroups_rpc [None req-5be6ae81-d286-424b-afd6-2b6865c77664 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:24 np0005538513.localdomain podman[317427]: 2025-11-28 10:04:24.883229584 +0000 UTC m=+0.113622227 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:24 np0005538513.localdomain podman[317427]: 2025-11-28 10:04:24.96965918 +0000 UTC m=+0.200051813 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 10:04:24 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:04:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:25 np0005538513.localdomain podman[317476]: 2025-11-28 10:04:25.029100254 +0000 UTC m=+0.110864858 container create 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:25 np0005538513.localdomain ceph-mon[292954]: pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:25 np0005538513.localdomain podman[317476]: 2025-11-28 10:04:24.972238124 +0000 UTC m=+0.054002748 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:25 np0005538513.localdomain podman[317426]: 2025-11-28 10:04:25.028505636 +0000 UTC m=+0.259787614 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:25 np0005538513.localdomain systemd[1]: Started libpod-conmon-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1.scope.
Nov 28 10:04:25 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:25 np0005538513.localdomain podman[317426]: 2025-11-28 10:04:25.131522969 +0000 UTC m=+0.362804987 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 28 10:04:25 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87952802ebce232016498b37ed1bc68b69594690c353cd872596f34af65028ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:25 np0005538513.localdomain podman[317476]: 2025-11-28 10:04:25.145354125 +0000 UTC m=+0.227118719 container init 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:25 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:04:25 np0005538513.localdomain systemd[1]: tmp-crun.9X1tl8.mount: Deactivated successfully.
Nov 28 10:04:25 np0005538513.localdomain podman[317476]: 2025-11-28 10:04:25.167132289 +0000 UTC m=+0.248896883 container start 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:04:25 np0005538513.localdomain dnsmasq[317512]: started, version 2.85 cachesize 150
Nov 28 10:04:25 np0005538513.localdomain dnsmasq[317512]: DNS service limited to local subnets
Nov 28 10:04:25 np0005538513.localdomain dnsmasq[317512]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:25 np0005538513.localdomain dnsmasq[317512]: warning: no upstream servers configured
Nov 28 10:04:25 np0005538513.localdomain dnsmasq-dhcp[317512]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:25 np0005538513.localdomain dnsmasq[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:25 np0005538513.localdomain dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:25 np0005538513.localdomain dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:25.230 261084 INFO neutron.agent.dhcp.agent [None req-4a879e38-f8b7-488f-a04a-96cd6030db32 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:24Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd665c970>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd665c100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66ecb50>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66ec430>], id=4dd844d7-c4a5-45a5-8561-d62489aaa9e8, ip_allocation=immediate, mac_address=fa:16:3e:71:8f:32, name=tempest-NetworksTestDHCPv6-1013711471, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['1e4a40e1-a5dc-42b4-816e-f8554b6a9964', '6c8448d9-9093-4c59-8a0f-3d17b74604a1'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:21Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1614, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:24Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:25.479 261084 INFO neutron.agent.dhcp.agent [None req-07dea47e-4fdd-4ae0-a660-9825ca60fb26 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:25 np0005538513.localdomain dnsmasq[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:04:25 np0005538513.localdomain dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:25 np0005538513.localdomain dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:25 np0005538513.localdomain podman[317531]: 2025-11-28 10:04:25.495138618 +0000 UTC m=+0.066754224 container kill 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:25.511 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:25.516 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:25.812 261084 INFO neutron.agent.dhcp.agent [None req-ee8500d6-7313-44a5-9e23-08e406f60d88 - - - - - -] DHCP configuration for ports {'4dd844d7-c4a5-45a5-8561-d62489aaa9e8'} is completed
Nov 28 10:04:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:04:25 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:04:26 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:26.165 2 INFO neutron.agent.securitygroups_rpc [None req-f516662f-fbee-4914-9fae-83fcd2f7d639 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:26 np0005538513.localdomain dnsmasq[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:26 np0005538513.localdomain dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:26 np0005538513.localdomain dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:26 np0005538513.localdomain podman[317573]: 2025-11-28 10:04:26.43051254 +0000 UTC m=+0.064455537 container kill 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:26 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:26.738 261084 INFO neutron.agent.linux.ip_lib [None req-843ba996-86a5-49b0-9513-4e771998a713 - - - - - -] Device tap404598dd-47 cannot be used as it has no MAC address
Nov 28 10:04:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:04:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:26.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:26 np0005538513.localdomain kernel: device tap404598dd-47 entered promiscuous mode
Nov 28 10:04:26 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324266.8169] manager: (tap404598dd-47): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Nov 28 10:04:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:26.814 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:26Z|00260|binding|INFO|Claiming lport 404598dd-4706-4aa3-a857-56207d0fd483 for this chassis.
Nov 28 10:04:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:26Z|00261|binding|INFO|404598dd-4706-4aa3-a857-56207d0fd483: Claiming unknown
Nov 28 10:04:26 np0005538513.localdomain systemd-udevd[317614]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:26.827 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d9cc17b-2c39-4130-a2f2-9d12894eaf52, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=404598dd-4706-4aa3-a857-56207d0fd483) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:26.829 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 404598dd-4706-4aa3-a857-56207d0fd483 in datapath 1a246530-be70-4846-9202-8f9cd6d862ae bound to our chassis
Nov 28 10:04:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:26.831 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1a246530-be70-4846-9202-8f9cd6d862ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:26 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:26.832 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae799ae-1384-4f78-9cb3-cf456e6640d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:26 np0005538513.localdomain ceph-mon[292954]: pgmap v249: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:26Z|00262|binding|INFO|Setting lport 404598dd-4706-4aa3-a857-56207d0fd483 ovn-installed in OVS
Nov 28 10:04:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:26Z|00263|binding|INFO|Setting lport 404598dd-4706-4aa3-a857-56207d0fd483 up in Southbound
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:26.862 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap404598dd-47: No such device
Nov 28 10:04:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:26.901 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:26 np0005538513.localdomain podman[317604]: 2025-11-28 10:04:26.922546538 +0000 UTC m=+0.150287946 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 28 10:04:26 np0005538513.localdomain podman[317604]: 2025-11-28 10:04:26.934868412 +0000 UTC m=+0.162609790 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 28 10:04:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:26.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:26Z|00264|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:04:26 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:04:27 np0005538513.localdomain dnsmasq[317512]: exiting on receipt of SIGTERM
Nov 28 10:04:27 np0005538513.localdomain podman[317681]: 2025-11-28 10:04:27.389133618 +0000 UTC m=+0.072341674 container kill 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:04:27 np0005538513.localdomain systemd[1]: libpod-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1.scope: Deactivated successfully.
Nov 28 10:04:27 np0005538513.localdomain podman[317697]: 2025-11-28 10:04:27.47887005 +0000 UTC m=+0.071468579 container died 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:27 np0005538513.localdomain systemd[1]: tmp-crun.0eJiGp.mount: Deactivated successfully.
Nov 28 10:04:27 np0005538513.localdomain podman[317697]: 2025-11-28 10:04:27.526180305 +0000 UTC m=+0.118778784 container cleanup 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:27 np0005538513.localdomain systemd[1]: libpod-conmon-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1.scope: Deactivated successfully.
Nov 28 10:04:27 np0005538513.localdomain podman[317699]: 2025-11-28 10:04:27.614112025 +0000 UTC m=+0.199829817 container remove 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:04:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:27.757 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:27 np0005538513.localdomain podman[317764]: 
Nov 28 10:04:27 np0005538513.localdomain podman[317764]: 2025-11-28 10:04:27.91512023 +0000 UTC m=+0.100517091 container create 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 10:04:27 np0005538513.localdomain podman[317764]: 2025-11-28 10:04:27.867111555 +0000 UTC m=+0.052508476 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:27 np0005538513.localdomain systemd[1]: Started libpod-conmon-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5.scope.
Nov 28 10:04:27 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:27 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f57866dc5c6162f1b2474906fd9f4e8b712b44716f63d14a9891a2098561e5e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:28 np0005538513.localdomain podman[317764]: 2025-11-28 10:04:28.003271375 +0000 UTC m=+0.188668226 container init 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:28 np0005538513.localdomain podman[317764]: 2025-11-28 10:04:28.013960372 +0000 UTC m=+0.199357223 container start 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317792]: started, version 2.85 cachesize 150
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317792]: DNS service limited to local subnets
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317792]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317792]: warning: no upstream servers configured
Nov 28 10:04:28 np0005538513.localdomain dnsmasq-dhcp[317792]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 0 addresses
Nov 28 10:04:28 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host
Nov 28 10:04:28 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts
Nov 28 10:04:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.111 261084 INFO neutron.agent.dhcp.agent [None req-843ba996-86a5-49b0-9513-4e771998a713 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:26Z, description=, device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6585880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65856a0>], id=7d15d485-b9d0-4a4c-92f9-e215283394d6, ip_allocation=immediate, mac_address=fa:16:3e:29:8f:09, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:23Z, description=, dns_domain=, id=1a246530-be70-4846-9202-8f9cd6d862ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-2019588082, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37301, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1609, status=ACTIVE, subnets=['64b56fea-cfed-4ba1-b5bf-14193e8cd8a5'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:25Z, vlan_transparent=None, network_id=1a246530-be70-4846-9202-8f9cd6d862ae, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1633, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:26Z on network 1a246530-be70-4846-9202-8f9cd6d862ae
Nov 28 10:04:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.298 261084 INFO neutron.agent.dhcp.agent [None req-323e6704-3489-427e-b1a6-bdf78087d826 - - - - - -] DHCP configuration for ports {'d1d8da55-f963-488f-b18a-dae5fc16078a'} is completed
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 1 addresses
Nov 28 10:04:28 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host
Nov 28 10:04:28 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts
Nov 28 10:04:28 np0005538513.localdomain podman[317816]: 2025-11-28 10:04:28.332372716 +0000 UTC m=+0.064789827 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:28 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:28.461 2 INFO neutron.agent.securitygroups_rpc [None req-35c0af25-6cf6-4373-be02-f6ff138ff337 e7c9d49fbf5f41059b8d32426e1740a6 517e4cc7e34e4fe7b49313300e5db635 - - default default] Security group member updated ['d7bc25b0-2d77-4fba-a003-707609b573d6']
Nov 28 10:04:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-87952802ebce232016498b37ed1bc68b69594690c353cd872596f34af65028ef-merged.mount: Deactivated successfully.
Nov 28 10:04:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.499 261084 INFO neutron.agent.dhcp.agent [None req-843ba996-86a5-49b0-9513-4e771998a713 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:26Z, description=, device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6fa2c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65c5610>], id=7d15d485-b9d0-4a4c-92f9-e215283394d6, ip_allocation=immediate, mac_address=fa:16:3e:29:8f:09, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:23Z, description=, dns_domain=, id=1a246530-be70-4846-9202-8f9cd6d862ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-2019588082, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37301, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1609, status=ACTIVE, subnets=['64b56fea-cfed-4ba1-b5bf-14193e8cd8a5'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:25Z, vlan_transparent=None, network_id=1a246530-be70-4846-9202-8f9cd6d862ae, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1633, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:26Z on network 1a246530-be70-4846-9202-8f9cd6d862ae
Nov 28 10:04:28 np0005538513.localdomain podman[317856]: 
Nov 28 10:04:28 np0005538513.localdomain podman[317856]: 2025-11-28 10:04:28.552606456 +0000 UTC m=+0.098634407 container create dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.578 261084 INFO neutron.agent.dhcp.agent [None req-36b16de0-af32-493f-bf2a-d6ed00e46133 - - - - - -] DHCP configuration for ports {'7d15d485-b9d0-4a4c-92f9-e215283394d6'} is completed
Nov 28 10:04:28 np0005538513.localdomain systemd[1]: Started libpod-conmon-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d.scope.
Nov 28 10:04:28 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:28 np0005538513.localdomain podman[317856]: 2025-11-28 10:04:28.510628093 +0000 UTC m=+0.056656124 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:28 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e082c3b6a4213499386110c64e819aa13cccd28eb6ce3df98bc97cfa2e4eb5d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:28 np0005538513.localdomain podman[317856]: 2025-11-28 10:04:28.624341712 +0000 UTC m=+0.170369693 container init dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:04:28 np0005538513.localdomain podman[317856]: 2025-11-28 10:04:28.634903315 +0000 UTC m=+0.180931286 container start dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317892]: started, version 2.85 cachesize 150
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317892]: DNS service limited to local subnets
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317892]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317892]: warning: no upstream servers configured
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317892]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:28 np0005538513.localdomain dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 1 addresses
Nov 28 10:04:28 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host
Nov 28 10:04:28 np0005538513.localdomain podman[317895]: 2025-11-28 10:04:28.789935457 +0000 UTC m=+0.068628157 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:04:28 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts
Nov 28 10:04:28 np0005538513.localdomain ceph-mon[292954]: pgmap v250: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.885 261084 INFO neutron.agent.dhcp.agent [None req-c5abd135-c599-4696-8090-d8bdebb79c7f - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:29 np0005538513.localdomain dnsmasq[317892]: exiting on receipt of SIGTERM
Nov 28 10:04:29 np0005538513.localdomain podman[317932]: 2025-11-28 10:04:29.023169109 +0000 UTC m=+0.069668987 container kill dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:29 np0005538513.localdomain systemd[1]: libpod-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d.scope: Deactivated successfully.
Nov 28 10:04:29 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:29.074 261084 INFO neutron.agent.dhcp.agent [None req-d9a1fe49-64c0-483d-a122-e87042550d1f - - - - - -] DHCP configuration for ports {'7d15d485-b9d0-4a4c-92f9-e215283394d6'} is completed
Nov 28 10:04:29 np0005538513.localdomain podman[317946]: 2025-11-28 10:04:29.09994762 +0000 UTC m=+0.061084401 container died dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:04:29 np0005538513.localdomain podman[317946]: 2025-11-28 10:04:29.13452451 +0000 UTC m=+0.095661241 container cleanup dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:29 np0005538513.localdomain systemd[1]: libpod-conmon-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d.scope: Deactivated successfully.
Nov 28 10:04:29 np0005538513.localdomain podman[317952]: 2025-11-28 10:04:29.188668862 +0000 UTC m=+0.136162412 container remove dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:29 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:29.311 2 INFO neutron.agent.securitygroups_rpc [None req-535fb80e-1678-409f-9e3d-b2eaa82a20b5 e7c9d49fbf5f41059b8d32426e1740a6 517e4cc7e34e4fe7b49313300e5db635 - - default default] Security group member updated ['d7bc25b0-2d77-4fba-a003-707609b573d6']
Nov 28 10:04:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4e082c3b6a4213499386110c64e819aa13cccd28eb6ce3df98bc97cfa2e4eb5d-merged.mount: Deactivated successfully.
Nov 28 10:04:29 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:29.605 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:29.608 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:29.612 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:29.613 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:29 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:29.614 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2256f7f6-f854-4736-8d4a-e789867a1e01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4123920278' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4123920278' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:30 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:30.499 261084 INFO neutron.agent.linux.ip_lib [None req-553b78a4-862c-404f-8023-7f862ca89787 - - - - - -] Device tapaad9e073-ac cannot be used as it has no MAC address
Nov 28 10:04:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:30.535 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:30.553 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538513.localdomain kernel: device tapaad9e073-ac entered promiscuous mode
Nov 28 10:04:30 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324270.5599] manager: (tapaad9e073-ac): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Nov 28 10:04:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:30Z|00265|binding|INFO|Claiming lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 for this chassis.
Nov 28 10:04:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:30Z|00266|binding|INFO|aad9e073-acbe-49ad-8b8e-e03f91cd53c9: Claiming unknown
Nov 28 10:04:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:30.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538513.localdomain systemd-udevd[318028]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:30.581 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7becd79-4f22-46be-87af-f81de3e971b9, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=aad9e073-acbe-49ad-8b8e-e03f91cd53c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:30.584 158130 INFO neutron.agent.ovn.metadata.agent [-] Port aad9e073-acbe-49ad-8b8e-e03f91cd53c9 in datapath f6bc7039-ebcb-4d5c-bff1-81be4c2607bb bound to our chassis
Nov 28 10:04:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:30.586 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:30.587 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b33c8077-075a-40bb-a53b-3e91e35be371]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:30Z|00267|binding|INFO|Setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 ovn-installed in OVS
Nov 28 10:04:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:30Z|00268|binding|INFO|Setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 up in Southbound
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:30.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapaad9e073-ac: No such device
Nov 28 10:04:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:30.654 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:30.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538513.localdomain podman[318064]: 2025-11-28 10:04:30.778523258 +0000 UTC m=+0.104002501 container create cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:04:30 np0005538513.localdomain systemd[1]: Started libpod-conmon-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124.scope.
Nov 28 10:04:30 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:30 np0005538513.localdomain podman[318064]: 2025-11-28 10:04:30.733180268 +0000 UTC m=+0.058659541 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:30 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05622c3ff6db343faa6e0358a4ee5a25704b2d77fda63eb773bb03dec5bdf99e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:30 np0005538513.localdomain podman[318064]: 2025-11-28 10:04:30.844463006 +0000 UTC m=+0.169942249 container init cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:30 np0005538513.localdomain podman[318064]: 2025-11-28 10:04:30.856600965 +0000 UTC m=+0.182080228 container start cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:04:30 np0005538513.localdomain dnsmasq[318092]: started, version 2.85 cachesize 150
Nov 28 10:04:30 np0005538513.localdomain dnsmasq[318092]: DNS service limited to local subnets
Nov 28 10:04:30 np0005538513.localdomain dnsmasq[318092]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:30 np0005538513.localdomain dnsmasq[318092]: warning: no upstream servers configured
Nov 28 10:04:30 np0005538513.localdomain dnsmasq-dhcp[318092]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:30 np0005538513.localdomain dnsmasq[318092]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:30 np0005538513.localdomain dnsmasq-dhcp[318092]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:30 np0005538513.localdomain dnsmasq-dhcp[318092]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:31 np0005538513.localdomain ceph-mon[292954]: pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:31.146 261084 INFO neutron.agent.dhcp.agent [None req-56b7b415-30d4-4ab6-9aa5-fde3ab2f2ff0 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:31 np0005538513.localdomain podman[318130]: 2025-11-28 10:04:31.47660832 +0000 UTC m=+0.046224615 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:31 np0005538513.localdomain podman[318130]: 2025-11-28 10:04:31.585659585 +0000 UTC m=+0.155275850 container create 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:04:31 np0005538513.localdomain systemd[1]: Started libpod-conmon-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730.scope.
Nov 28 10:04:31 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/578830e16dc254a672039252f1b4aca3106a0a33f8e3c5f9da3689144398b5d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:31 np0005538513.localdomain podman[318130]: 2025-11-28 10:04:31.651655827 +0000 UTC m=+0.221272082 container init 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:31 np0005538513.localdomain podman[318130]: 2025-11-28 10:04:31.66119732 +0000 UTC m=+0.230813585 container start 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:04:31 np0005538513.localdomain dnsmasq[318149]: started, version 2.85 cachesize 150
Nov 28 10:04:31 np0005538513.localdomain dnsmasq[318149]: DNS service limited to local subnets
Nov 28 10:04:31 np0005538513.localdomain dnsmasq[318149]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:31 np0005538513.localdomain dnsmasq[318149]: warning: no upstream servers configured
Nov 28 10:04:31 np0005538513.localdomain dnsmasq-dhcp[318149]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d
Nov 28 10:04:31 np0005538513.localdomain dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 0 addresses
Nov 28 10:04:31 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host
Nov 28 10:04:31 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts
Nov 28 10:04:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:31.717 261084 INFO neutron.agent.dhcp.agent [None req-553b78a4-862c-404f-8023-7f862ca89787 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:30Z, description=, device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65fffa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd67bc670>], id=d578c17e-efea-42d9-9b0e-1fc0d3472d19, ip_allocation=immediate, mac_address=fa:16:3e:d4:f8:84, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:27Z, description=, dns_domain=, id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1221052021, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9596, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1646, status=ACTIVE, subnets=['d7f1568e-313e-48e3-9b4d-32767b1bddcf'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:29Z, vlan_transparent=None, network_id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1660, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:30Z on network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb
Nov 28 10:04:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:31.803 261084 INFO neutron.agent.dhcp.agent [None req-ae08e445-9120-431b-88c1-1eef8075a573 - - - - - -] DHCP configuration for ports {'f17c271d-1e05-4708-b116-d1572df0ff8c'} is completed
Nov 28 10:04:31 np0005538513.localdomain dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 1 addresses
Nov 28 10:04:31 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host
Nov 28 10:04:31 np0005538513.localdomain podman[318168]: 2025-11-28 10:04:31.91911743 +0000 UTC m=+0.061640427 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:04:31 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts
Nov 28 10:04:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:32.151 261084 INFO neutron.agent.dhcp.agent [None req-7716dda8-e09e-4c11-845f-b15d2294fa79 - - - - - -] DHCP configuration for ports {'d578c17e-efea-42d9-9b0e-1fc0d3472d19'} is completed
Nov 28 10:04:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:33.049 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:33.052 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:33 np0005538513.localdomain ceph-mon[292954]: pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:33.056 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:33.056 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:33.058 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5256cdbb-a39e-4137-8899-cf76d95fa840]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:33 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:33.170 2 INFO neutron.agent.securitygroups_rpc [None req-e7bb8635-ea1b-4f3c-951e-d95d847ad39e 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:33.295 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:30Z, description=, device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64c94f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64c9160>], id=d578c17e-efea-42d9-9b0e-1fc0d3472d19, ip_allocation=immediate, mac_address=fa:16:3e:d4:f8:84, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:27Z, description=, dns_domain=, id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1221052021, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9596, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1646, status=ACTIVE, subnets=['d7f1568e-313e-48e3-9b4d-32767b1bddcf'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:29Z, vlan_transparent=None, network_id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1660, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:30Z on network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb
Nov 28 10:04:33 np0005538513.localdomain dnsmasq[318092]: exiting on receipt of SIGTERM
Nov 28 10:04:33 np0005538513.localdomain podman[318222]: 2025-11-28 10:04:33.526310403 +0000 UTC m=+0.085126871 container kill cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:33 np0005538513.localdomain systemd[1]: tmp-crun.kwqwY2.mount: Deactivated successfully.
Nov 28 10:04:33 np0005538513.localdomain systemd[1]: libpod-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124.scope: Deactivated successfully.
Nov 28 10:04:33 np0005538513.localdomain dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 1 addresses
Nov 28 10:04:33 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host
Nov 28 10:04:33 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts
Nov 28 10:04:33 np0005538513.localdomain podman[318234]: 2025-11-28 10:04:33.587664151 +0000 UTC m=+0.077095910 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:33 np0005538513.localdomain podman[318241]: 2025-11-28 10:04:33.599954853 +0000 UTC m=+0.057617351 container died cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:04:33 np0005538513.localdomain podman[318241]: 2025-11-28 10:04:33.775050421 +0000 UTC m=+0.232712919 container cleanup cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:04:33 np0005538513.localdomain systemd[1]: libpod-conmon-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124.scope: Deactivated successfully.
Nov 28 10:04:33 np0005538513.localdomain podman[318252]: 2025-11-28 10:04:33.803150436 +0000 UTC m=+0.239432202 container remove cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:33.807 261084 INFO neutron.agent.dhcp.agent [None req-9c86f485-3701-4964-8ca6-129bac63be7e - - - - - -] DHCP configuration for ports {'d578c17e-efea-42d9-9b0e-1fc0d3472d19'} is completed
Nov 28 10:04:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-05622c3ff6db343faa6e0358a4ee5a25704b2d77fda63eb773bb03dec5bdf99e-merged.mount: Deactivated successfully.
Nov 28 10:04:34 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:34 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:34.522 2 INFO neutron.agent.securitygroups_rpc [None req-406cfd0d-88dd-4d36-9649-40665b36b8d2 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:34 np0005538513.localdomain podman[318335]: 2025-11-28 10:04:34.810975763 +0000 UTC m=+0.099278955 container create a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:34 np0005538513.localdomain systemd[1]: Started libpod-conmon-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b.scope.
Nov 28 10:04:34 np0005538513.localdomain podman[318335]: 2025-11-28 10:04:34.76341806 +0000 UTC m=+0.051721302 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:34 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:34 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c3d356aba2ea0d98747d934ff4c0b639ec6e4c8c998430d9479d7fef6a3cf4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:34 np0005538513.localdomain podman[318335]: 2025-11-28 10:04:34.888400792 +0000 UTC m=+0.176703984 container init a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:34 np0005538513.localdomain podman[318335]: 2025-11-28 10:04:34.899108339 +0000 UTC m=+0.187411531 container start a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:34 np0005538513.localdomain dnsmasq[318354]: started, version 2.85 cachesize 150
Nov 28 10:04:34 np0005538513.localdomain dnsmasq[318354]: DNS service limited to local subnets
Nov 28 10:04:34 np0005538513.localdomain dnsmasq[318354]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:34 np0005538513.localdomain dnsmasq[318354]: warning: no upstream servers configured
Nov 28 10:04:34 np0005538513.localdomain dnsmasq-dhcp[318354]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:34 np0005538513.localdomain dnsmasq-dhcp[318354]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:04:34 np0005538513.localdomain dnsmasq[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:34 np0005538513.localdomain dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:34 np0005538513.localdomain dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:34.956 261084 INFO neutron.agent.dhcp.agent [None req-616e4e52-5787-4518-8ed1-3f9d7ee5c0ab - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:33Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64b4fa0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64b4280>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64b4d00>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64b45b0>], id=cddc71e8-dfe8-4bc4-ac32-c8e62bc2fd52, ip_allocation=immediate, mac_address=fa:16:3e:ab:b5:4e, name=tempest-NetworksTestDHCPv6-1132459699, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=35, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['56931ec4-adb1-4ad8-98f9-465d3608c4a7', '7498cd52-8707-4758-a964-4aef6a4317ac'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:30Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1683, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:34Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:35 np0005538513.localdomain ceph-mon[292954]: pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:35 np0005538513.localdomain dnsmasq[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:04:35 np0005538513.localdomain dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:35 np0005538513.localdomain dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:35 np0005538513.localdomain podman[318373]: 2025-11-28 10:04:35.17309548 +0000 UTC m=+0.046154634 container kill a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:04:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:35.236 261084 INFO neutron.agent.dhcp.agent [None req-fca8271d-824a-4f4a-9fc6-3b70da62dc94 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:35.472 261084 INFO neutron.agent.dhcp.agent [None req-ec1a19c5-4a32-42d6-a3a5-f85a251b619d - - - - - -] DHCP configuration for ports {'cddc71e8-dfe8-4bc4-ac32-c8e62bc2fd52'} is completed
Nov 28 10:04:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:35.539 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:35 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:35.893 2 INFO neutron.agent.securitygroups_rpc [None req-e5e44e33-1445-4a9a-aa0f-e3e5f136c603 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:36 np0005538513.localdomain dnsmasq[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:36 np0005538513.localdomain dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:36 np0005538513.localdomain dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:36 np0005538513.localdomain podman[318413]: 2025-11-28 10:04:36.148158529 +0000 UTC m=+0.062365218 container kill a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:04:36 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:36.670 2 INFO neutron.agent.securitygroups_rpc [None req-260c5785-8a89-47b1-924c-143d50af86e5 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:36 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:36.686 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:37 np0005538513.localdomain ceph-mon[292954]: pgmap v254: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:37 np0005538513.localdomain dnsmasq[318354]: exiting on receipt of SIGTERM
Nov 28 10:04:37 np0005538513.localdomain systemd[1]: libpod-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b.scope: Deactivated successfully.
Nov 28 10:04:37 np0005538513.localdomain podman[318454]: 2025-11-28 10:04:37.173125959 +0000 UTC m=+0.065126168 container kill a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:37 np0005538513.localdomain podman[318467]: 2025-11-28 10:04:37.246682636 +0000 UTC m=+0.062910974 container died a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:37 np0005538513.localdomain systemd[1]: tmp-crun.L4kkfN.mount: Deactivated successfully.
Nov 28 10:04:37 np0005538513.localdomain podman[318467]: 2025-11-28 10:04:37.295434143 +0000 UTC m=+0.111662421 container cleanup a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:37 np0005538513.localdomain systemd[1]: libpod-conmon-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b.scope: Deactivated successfully.
Nov 28 10:04:37 np0005538513.localdomain podman[318474]: 2025-11-28 10:04:37.334108191 +0000 UTC m=+0.134362681 container remove a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:04:38 np0005538513.localdomain podman[318525]: 2025-11-28 10:04:38.104364192 +0000 UTC m=+0.084018798 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-27c3d356aba2ea0d98747d934ff4c0b639ec6e4c8c998430d9479d7fef6a3cf4-merged.mount: Deactivated successfully.
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:38 np0005538513.localdomain podman[318526]: 2025-11-28 10:04:38.172366821 +0000 UTC m=+0.145866771 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:04:38 np0005538513.localdomain podman[318525]: 2025-11-28 10:04:38.249144601 +0000 UTC m=+0.228799157 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:04:38 np0005538513.localdomain podman[318526]: 2025-11-28 10:04:38.306891335 +0000 UTC m=+0.280391265 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:04:38 np0005538513.localdomain podman[318588]: 
Nov 28 10:04:38 np0005538513.localdomain podman[318588]: 2025-11-28 10:04:38.335922408 +0000 UTC m=+0.099334868 container create 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: Started libpod-conmon-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223.scope.
Nov 28 10:04:38 np0005538513.localdomain podman[318588]: 2025-11-28 10:04:38.283520646 +0000 UTC m=+0.046933156 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:38 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/088e71771c3215ab6b11832e0cd5acfbf62a86028c4cb9bd53ee87603664aa23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:38 np0005538513.localdomain podman[318588]: 2025-11-28 10:04:38.412927354 +0000 UTC m=+0.176339804 container init 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:04:38 np0005538513.localdomain podman[318588]: 2025-11-28 10:04:38.42254482 +0000 UTC m=+0.185957260 container start 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:38 np0005538513.localdomain dnsmasq[318607]: started, version 2.85 cachesize 150
Nov 28 10:04:38 np0005538513.localdomain dnsmasq[318607]: DNS service limited to local subnets
Nov 28 10:04:38 np0005538513.localdomain dnsmasq[318607]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:38 np0005538513.localdomain dnsmasq[318607]: warning: no upstream servers configured
Nov 28 10:04:38 np0005538513.localdomain dnsmasq-dhcp[318607]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:38 np0005538513.localdomain dnsmasq[318607]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:38 np0005538513.localdomain dnsmasq-dhcp[318607]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:38 np0005538513.localdomain dnsmasq-dhcp[318607]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:38 np0005538513.localdomain dnsmasq[318607]: exiting on receipt of SIGTERM
Nov 28 10:04:38 np0005538513.localdomain podman[318625]: 2025-11-28 10:04:38.781970618 +0000 UTC m=+0.062987286 container kill 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: libpod-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223.scope: Deactivated successfully.
Nov 28 10:04:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:38.825 261084 INFO neutron.agent.dhcp.agent [None req-0bedb45d-e760-4fd5-877a-989c79e8bf6b - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed
Nov 28 10:04:38 np0005538513.localdomain podman[318637]: 2025-11-28 10:04:38.859582262 +0000 UTC m=+0.060452223 container died 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:38 np0005538513.localdomain podman[318637]: 2025-11-28 10:04:38.89547449 +0000 UTC m=+0.096344411 container cleanup 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:04:38 np0005538513.localdomain systemd[1]: libpod-conmon-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223.scope: Deactivated successfully.
Nov 28 10:04:38 np0005538513.localdomain podman[318639]: 2025-11-28 10:04:38.935840237 +0000 UTC m=+0.128037270 container remove 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:39Z|00269|binding|INFO|Releasing lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 from this chassis (sb_readonly=0)
Nov 28 10:04:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:39Z|00270|binding|INFO|Setting lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 down in Southbound
Nov 28 10:04:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:39.007 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:39 np0005538513.localdomain kernel: device tap5fa5dbd0-88 left promiscuous mode
Nov 28 10:04:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:39.016 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe5a:389/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=5fa5dbd0-889e-4133-9ba4-6f3810999535) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:39.018 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 5fa5dbd0-889e-4133-9ba4-6f3810999535 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:04:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:39.022 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:39.023 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[71752455-9ff2-42af-bdc4-fe84195c5469]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:39.031 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:39 np0005538513.localdomain ceph-mon[292954]: pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:39 np0005538513.localdomain systemd[1]: tmp-crun.TpxJiB.mount: Deactivated successfully.
Nov 28 10:04:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-088e71771c3215ab6b11832e0cd5acfbf62a86028c4cb9bd53ee87603664aa23-merged.mount: Deactivated successfully.
Nov 28 10:04:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:39 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:39.262 261084 INFO neutron.agent.dhcp.agent [None req-e36f3119-506c-475f-a669-5900c821c31a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:39 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:04:39 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:39.535 2 INFO neutron.agent.securitygroups_rpc [None req-cb662c54-c616-44d2-81c7-5ec9bf360652 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:39.998 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.000 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.004 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.005 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f6085299-67af-48de-ae80-0a054aa20982]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:04:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:04:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:04:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159326 "" "Go-http-client/1.1"
Nov 28 10:04:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:04:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20219 "" "Go-http-client/1.1"
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.585 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.777 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:40 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:40.890 261084 INFO neutron.agent.linux.ip_lib [None req-9a421c33-65be-46b0-adca-b28d3ed45bb5 - - - - - -] Device tap7ba9b167-5b cannot be used as it has no MAC address
Nov 28 10:04:40 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:40.902 2 INFO neutron.agent.securitygroups_rpc [None req-019d9e71-83ce-440c-bb58-c6c7df87e29f 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:40 np0005538513.localdomain kernel: device tap7ba9b167-5b entered promiscuous mode
Nov 28 10:04:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:40Z|00271|binding|INFO|Claiming lport 7ba9b167-5b98-46f1-8868-b05a091f3734 for this chassis.
Nov 28 10:04:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:40Z|00272|binding|INFO|7ba9b167-5b98-46f1-8868-b05a091f3734: Claiming unknown
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.917 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:40 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324280.9178] manager: (tap7ba9b167-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Nov 28 10:04:40 np0005538513.localdomain systemd-udevd[318680]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:40Z|00273|binding|INFO|Setting lport 7ba9b167-5b98-46f1-8868-b05a091f3734 ovn-installed in OVS
Nov 28 10:04:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:40Z|00274|binding|INFO|Setting lport 7ba9b167-5b98-46f1-8868-b05a091f3734 up in Southbound
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.929 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.929 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefb:5081/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=7ba9b167-5b98-46f1-8868-b05a091f3734) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.932 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ba9b167-5b98-46f1-8868-b05a091f3734 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.936 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4132ac1f-fd14-4d9d-a235-bdf90abef7f7 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.936 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:40.940 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e402a50c-cc47-463d-8aa7-b5033f48119f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 10:04:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:40.986 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:41.007 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:41 np0005538513.localdomain ceph-mon[292954]: pgmap v256: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:41 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:41.389 2 INFO neutron.agent.securitygroups_rpc [None req-bb6bd82f-3718-4e2c-b707-6af77a5385f7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:41 np0005538513.localdomain podman[318751]: 2025-11-28 10:04:41.700933578 +0000 UTC m=+0.066494596 container create eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:04:41 np0005538513.localdomain systemd[1]: Started libpod-conmon-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213.scope.
Nov 28 10:04:41 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:41 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b5412e2fb13d3391b4a358fa30deba9085121686461d41d600f84f99fbf2d44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:41 np0005538513.localdomain podman[318751]: 2025-11-28 10:04:41.747294146 +0000 UTC m=+0.112855164 container init eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:04:41 np0005538513.localdomain podman[318751]: 2025-11-28 10:04:41.754888214 +0000 UTC m=+0.120449222 container start eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:04:41 np0005538513.localdomain podman[318751]: 2025-11-28 10:04:41.667910922 +0000 UTC m=+0.033471980 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:41 np0005538513.localdomain dnsmasq[318769]: started, version 2.85 cachesize 150
Nov 28 10:04:41 np0005538513.localdomain dnsmasq[318769]: DNS service limited to local subnets
Nov 28 10:04:41 np0005538513.localdomain dnsmasq[318769]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:41 np0005538513.localdomain dnsmasq[318769]: warning: no upstream servers configured
Nov 28 10:04:41 np0005538513.localdomain dnsmasq[318769]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:41.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:41.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:41.803 261084 INFO neutron.agent.dhcp.agent [None req-9a421c33-65be-46b0-adca-b28d3ed45bb5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:40Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6626a30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6626580>], id=84c9c7ae-4b7f-4790-858d-fbc75245c737, ip_allocation=immediate, mac_address=fa:16:3e:60:5e:59, name=tempest-NetworksTestDHCPv6-1618658599, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['75d6b4c5-411d-4659-ad69-2f7df9b2832a'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:39Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1735, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:40Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:41.905 261084 INFO neutron.agent.dhcp.agent [None req-10b65453-8fd6-440c-9cdd-4bffc9aa6417 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:04:41 np0005538513.localdomain dnsmasq[318769]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:04:41 np0005538513.localdomain podman[318786]: 2025-11-28 10:04:41.937080264 +0000 UTC m=+0.038496274 container kill eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:04:42 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:42.121 261084 INFO neutron.agent.dhcp.agent [None req-43a8babd-8b8b-43e8-a67c-22b76e0db50c - - - - - -] DHCP configuration for ports {'84c9c7ae-4b7f-4790-858d-fbc75245c737'} is completed
Nov 28 10:04:42 np0005538513.localdomain dnsmasq[318769]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:42 np0005538513.localdomain podman[318824]: 2025-11-28 10:04:42.137124797 +0000 UTC m=+0.039068050 container kill eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:42 np0005538513.localdomain dnsmasq[318769]: exiting on receipt of SIGTERM
Nov 28 10:04:42 np0005538513.localdomain podman[318862]: 2025-11-28 10:04:42.531738254 +0000 UTC m=+0.061447402 container kill eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:04:42 np0005538513.localdomain systemd[1]: libpod-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213.scope: Deactivated successfully.
Nov 28 10:04:42 np0005538513.localdomain podman[318876]: 2025-11-28 10:04:42.609320947 +0000 UTC m=+0.060718441 container died eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:04:42 np0005538513.localdomain podman[318876]: 2025-11-28 10:04:42.640307145 +0000 UTC m=+0.091704609 container cleanup eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:04:42 np0005538513.localdomain systemd[1]: libpod-conmon-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213.scope: Deactivated successfully.
Nov 28 10:04:42 np0005538513.localdomain podman[318877]: 2025-11-28 10:04:42.678957362 +0000 UTC m=+0.126838615 container remove eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:42 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:42Z|00275|binding|INFO|Releasing lport 7ba9b167-5b98-46f1-8868-b05a091f3734 from this chassis (sb_readonly=0)
Nov 28 10:04:42 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:42Z|00276|binding|INFO|Setting lport 7ba9b167-5b98-46f1-8868-b05a091f3734 down in Southbound
Nov 28 10:04:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:42.692 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:42 np0005538513.localdomain kernel: device tap7ba9b167-5b left promiscuous mode
Nov 28 10:04:42 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:42.703 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefb:5081/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=7ba9b167-5b98-46f1-8868-b05a091f3734) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:42 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:42.706 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ba9b167-5b98-46f1-8868-b05a091f3734 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:04:42 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:42.712 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:42 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:42.713 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[040f5bc3-b86f-4461-90df-105db279ea18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:42.715 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1b5412e2fb13d3391b4a358fa30deba9085121686461d41d600f84f99fbf2d44-merged.mount: Deactivated successfully.
Nov 28 10:04:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:43.037 261084 INFO neutron.agent.dhcp.agent [None req-269b331e-c9a9-442b-b4be-4979b0d42bba - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:43 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:04:43 np0005538513.localdomain ceph-mon[292954]: pgmap v257: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:43.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:43 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:43.857 261084 INFO neutron.agent.linux.ip_lib [None req-36f2721d-0343-4138-a16d-75e529a647c3 - - - - - -] Device tap8f633916-5b cannot be used as it has no MAC address
Nov 28 10:04:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:43.909 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:43 np0005538513.localdomain kernel: device tap8f633916-5b entered promiscuous mode
Nov 28 10:04:43 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324283.9147] manager: (tap8f633916-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Nov 28 10:04:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:43Z|00277|binding|INFO|Claiming lport 8f633916-5bf0-443d-a81d-61c50c415e1a for this chassis.
Nov 28 10:04:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:43Z|00278|binding|INFO|8f633916-5bf0-443d-a81d-61c50c415e1a: Claiming unknown
Nov 28 10:04:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:43.917 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:43Z|00279|binding|INFO|Setting lport 8f633916-5bf0-443d-a81d-61c50c415e1a ovn-installed in OVS
Nov 28 10:04:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:43Z|00280|binding|INFO|Setting lport 8f633916-5bf0-443d-a81d-61c50c415e1a up in Southbound
Nov 28 10:04:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:43.925 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:6927/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=8f633916-5bf0-443d-a81d-61c50c415e1a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:43.926 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8f633916-5bf0-443d-a81d-61c50c415e1a in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:04:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:43.927 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:43.930 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1148230f-ceb2-4a9b-9c0f-9849b9ab7a86 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:43.930 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:43.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:43.931 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3942be-a134-4456-976a-1b0d99c4ea31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:43.956 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:43.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:44.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 do_prune osdmap full prune enabled
Nov 28 10:04:44 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:44.214 2 INFO neutron.agent.securitygroups_rpc [None req-364c5261-76b8-4bbe-a1a4-6cba1a17e718 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e149 e149: 6 total, 6 up, 6 in
Nov 28 10:04:44 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e149: 6 total, 6 up, 6 in
Nov 28 10:04:44 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:44.484 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:44 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:44.655 2 INFO neutron.agent.securitygroups_rpc [None req-32328951-8e5a-4539-abdc-e00522344c39 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:44.799 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:04:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:44.800 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:04:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:44.801 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:04:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:44.801 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:04:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:44.802 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:04:44 np0005538513.localdomain podman[318970]: 2025-11-28 10:04:44.837183873 +0000 UTC m=+0.082142074 container create a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:04:44 np0005538513.localdomain podman[318970]: 2025-11-28 10:04:44.787909292 +0000 UTC m=+0.032867523 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:44 np0005538513.localdomain systemd[1]: Started libpod-conmon-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796.scope.
Nov 28 10:04:44 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:44 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f8555d1fc38767bef1d7de04762368666c8c66ba5ae9285326628ef97b61012/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:44 np0005538513.localdomain podman[318970]: 2025-11-28 10:04:44.938655461 +0000 UTC m=+0.183613652 container init a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:04:44 np0005538513.localdomain podman[318970]: 2025-11-28 10:04:44.949059059 +0000 UTC m=+0.194017240 container start a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:44 np0005538513.localdomain dnsmasq[319005]: started, version 2.85 cachesize 150
Nov 28 10:04:44 np0005538513.localdomain dnsmasq[319005]: DNS service limited to local subnets
Nov 28 10:04:44 np0005538513.localdomain dnsmasq[319005]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:44 np0005538513.localdomain dnsmasq[319005]: warning: no upstream servers configured
Nov 28 10:04:44 np0005538513.localdomain dnsmasq-dhcp[319005]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:04:44 np0005538513.localdomain dnsmasq[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:44 np0005538513.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:44 np0005538513.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:45.010 261084 INFO neutron.agent.dhcp.agent [None req-36f2721d-0343-4138-a16d-75e529a647c3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:43Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b81f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b8760>], id=4f2aa82b-68f8-4b54-9ba9-fd01c33f0c08, ip_allocation=immediate, mac_address=fa:16:3e:23:50:5c, name=tempest-NetworksTestDHCPv6-1248319796, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['f23fa24d-eb1f-4c78-8443-1b19f0eaff59'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:42Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1756, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:44Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:45.146 261084 INFO neutron.agent.dhcp.agent [None req-81fe2799-3cb2-4c61-8245-0786de8e7908 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:04:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:45.197 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:45.198 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.207 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:45 np0005538513.localdomain ceph-mon[292954]: pgmap v258: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:45 np0005538513.localdomain ceph-mon[292954]: osdmap e149: 6 total, 6 up, 6 in
Nov 28 10:04:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e149 do_prune osdmap full prune enabled
Nov 28 10:04:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e150 e150: 6 total, 6 up, 6 in
Nov 28 10:04:45 np0005538513.localdomain dnsmasq[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:04:45 np0005538513.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:45 np0005538513.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:45 np0005538513.localdomain podman[319024]: 2025-11-28 10:04:45.247690226 +0000 UTC m=+0.107834712 container kill a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:04:45 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e150: 6 total, 6 up, 6 in
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.329 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.403 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.403 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:04:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:45.501 261084 INFO neutron.agent.dhcp.agent [None req-966550fc-4b5b-4ee5-84fb-5ee8b77fbe95 - - - - - -] DHCP configuration for ports {'4f2aa82b-68f8-4b54-9ba9-fd01c33f0c08'} is completed
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.592 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.626 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.627 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11199MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.732 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.732 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.733 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:04:45 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:45.746 2 INFO neutron.agent.securitygroups_rpc [None req-bac2ff8c-e0f4-436a-a2ef-9ffa5731fa74 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:45.781 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:04:45 np0005538513.localdomain dnsmasq[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:45 np0005538513.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:45 np0005538513.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:45 np0005538513.localdomain podman[319067]: 2025-11-28 10:04:45.984755216 +0000 UTC m=+0.072402116 container kill a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:04:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1907177804' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:46.234 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:04:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:46.242 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:04:46 np0005538513.localdomain ceph-mon[292954]: osdmap e150: 6 total, 6 up, 6 in
Nov 28 10:04:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2309871379' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1654213529' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1907177804' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:46.262 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:04:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:46.265 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:04:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:46.265 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:04:47 np0005538513.localdomain dnsmasq[319005]: exiting on receipt of SIGTERM
Nov 28 10:04:47 np0005538513.localdomain podman[319127]: 2025-11-28 10:04:47.227166286 +0000 UTC m=+0.070227694 container kill a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:47 np0005538513.localdomain systemd[1]: tmp-crun.j2wArk.mount: Deactivated successfully.
Nov 28 10:04:47 np0005538513.localdomain systemd[1]: libpod-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796.scope: Deactivated successfully.
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.266 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.267 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.267 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: pgmap v261: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2706611035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.300127) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287300172, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2391, "num_deletes": 268, "total_data_size": 2586776, "memory_usage": 2634144, "flush_reason": "Manual Compaction"}
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 28 10:04:47 np0005538513.localdomain podman[319139]: 2025-11-28 10:04:47.309126775 +0000 UTC m=+0.065300803 container died a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287315906, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2510994, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25621, "largest_seqno": 28011, "table_properties": {"data_size": 2500880, "index_size": 6491, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 22084, "raw_average_key_size": 21, "raw_value_size": 2480323, "raw_average_value_size": 2455, "num_data_blocks": 275, "num_entries": 1010, "num_filter_entries": 1010, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324137, "oldest_key_time": 1764324137, "file_creation_time": 1764324287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 15858 microseconds, and 7044 cpu microseconds.
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.315978) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2510994 bytes OK
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.316011) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.318321) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.318351) EVENT_LOG_v1 {"time_micros": 1764324287318343, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.318377) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2576671, prev total WAL file size 2576671, number of live WAL files 2.
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.319460) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2452KB)], [45(15MB)]
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287319500, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 18631561, "oldest_snapshot_seqno": -1}
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12508 keys, 16635212 bytes, temperature: kUnknown
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287406437, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 16635212, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16564008, "index_size": 38847, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31301, "raw_key_size": 334625, "raw_average_key_size": 26, "raw_value_size": 16351241, "raw_average_value_size": 1307, "num_data_blocks": 1476, "num_entries": 12508, "num_filter_entries": 12508, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:04:47 np0005538513.localdomain podman[319139]: 2025-11-28 10:04:47.408646126 +0000 UTC m=+0.164820164 container cleanup a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.407190) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 16635212 bytes
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.412135) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.1 rd, 191.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 15.4 +0.0 blob) out(15.9 +0.0 blob), read-write-amplify(14.0) write-amplify(6.6) OK, records in: 13055, records dropped: 547 output_compression: NoCompression
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.412182) EVENT_LOG_v1 {"time_micros": 1764324287412164, "job": 26, "event": "compaction_finished", "compaction_time_micros": 87038, "compaction_time_cpu_micros": 40633, "output_level": 6, "num_output_files": 1, "total_output_size": 16635212, "num_input_records": 13055, "num_output_records": 12508, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287412637, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 28 10:04:47 np0005538513.localdomain systemd[1]: libpod-conmon-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796.scope: Deactivated successfully.
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287414368, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.319332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.415 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.415 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.416 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.416 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:04:47 np0005538513.localdomain podman[319141]: 2025-11-28 10:04:47.433135998 +0000 UTC m=+0.184221490 container remove a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.452 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:47Z|00281|binding|INFO|Releasing lport 8f633916-5bf0-443d-a81d-61c50c415e1a from this chassis (sb_readonly=0)
Nov 28 10:04:47 np0005538513.localdomain kernel: device tap8f633916-5b left promiscuous mode
Nov 28 10:04:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:47Z|00282|binding|INFO|Setting lport 8f633916-5bf0-443d-a81d-61c50c415e1a down in Southbound
Nov 28 10:04:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:47.462 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:6927/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=8f633916-5bf0-443d-a81d-61c50c415e1a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:47.464 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8f633916-5bf0-443d-a81d-61c50c415e1a in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:04:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:47.467 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:47.468 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1401545f-128d-4253-b4d8-3ec921770db1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.479 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:47.483 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:47.755 261084 INFO neutron.agent.dhcp.agent [None req-fcee2cb3-8024-4735-b25a-a157dfa5c984 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:04:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:04:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:04:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:04:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:04:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:04:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:04:48 np0005538513.localdomain podman[319166]: 2025-11-28 10:04:48.143123971 +0000 UTC m=+0.131302983 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:04:48 np0005538513.localdomain podman[319166]: 2025-11-28 10:04:48.156432323 +0000 UTC m=+0.144611335 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Nov 28 10:04:48 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:04:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:48.199 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:04:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7f8555d1fc38767bef1d7de04762368666c8c66ba5ae9285326628ef97b61012-merged.mount: Deactivated successfully.
Nov 28 10:04:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:48 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:04:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1032672972' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:48.817 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:04:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:48.832 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:04:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:48.833 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:04:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:48.833 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:49Z|00283|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:04:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:49.202 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:49 np0005538513.localdomain ceph-mon[292954]: pgmap v262: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Nov 28 10:04:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2551675878' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:49.573 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e150 do_prune osdmap full prune enabled
Nov 28 10:04:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 e151: 6 total, 6 up, 6 in
Nov 28 10:04:50 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e151: 6 total, 6 up, 6 in
Nov 28 10:04:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:50.285 261084 INFO neutron.agent.linux.ip_lib [None req-8ae4241f-8dc9-4b9b-8146-7b59c1764f43 - - - - - -] Device tapd9d6658c-69 cannot be used as it has no MAC address
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.350 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain kernel: device tapd9d6658c-69 entered promiscuous mode
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00284|binding|INFO|Claiming lport d9d6658c-69f3-434a-a139-9146d8ddb475 for this chassis.
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00285|binding|INFO|d9d6658c-69f3-434a-a139-9146d8ddb475: Claiming unknown
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.359 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324290.3618] manager: (tapd9d6658c-69): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Nov 28 10:04:50 np0005538513.localdomain systemd-udevd[319211]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.373 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fc78fcb-5ea1-409a-899f-c137c1b47b0b, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d9d6658c-69f3-434a-a139-9146d8ddb475) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.377 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d9d6658c-69f3-434a-a139-9146d8ddb475 in datapath 02cd8163-742c-4849-a0f3-35dad7f4a404 bound to our chassis
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.382 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02cd8163-742c-4849-a0f3-35dad7f4a404 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.384 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8088318f-a440-4158-9fdd-97194e04564a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00286|binding|INFO|Setting lport d9d6658c-69f3-434a-a139-9146d8ddb475 ovn-installed in OVS
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00287|binding|INFO|Setting lport d9d6658c-69f3-434a-a139-9146d8ddb475 up in Southbound
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.392 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.424 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 0 addresses
Nov 28 10:04:50 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host
Nov 28 10:04:50 np0005538513.localdomain podman[319208]: 2025-11-28 10:04:50.456243952 +0000 UTC m=+0.083238746 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:04:50 np0005538513.localdomain dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.476 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.509 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:50.519 261084 INFO neutron.agent.linux.ip_lib [None req-5b788fcf-a1ed-4f86-9533-2f641daf2beb - - - - - -] Device tap4929710e-eb cannot be used as it has no MAC address
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.581 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain kernel: device tap4929710e-eb entered promiscuous mode
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.589 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324290.5908] manager: (tap4929710e-eb): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00288|binding|INFO|Claiming lport 4929710e-eb4c-4144-9bca-64efc297e299 for this chassis.
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00289|binding|INFO|4929710e-eb4c-4144-9bca-64efc297e299: Claiming unknown
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.600 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.607 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aa5be61eafca4d96976422f0e0103210', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40693dd3-cde5-4c50-9ed5-4dc8ef3313af, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=4929710e-eb4c-4144-9bca-64efc297e299) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.611 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 4929710e-eb4c-4144-9bca-64efc297e299 in datapath 553c7f35-d914-4af1-9846-a8cbe21f53f3 bound to our chassis
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.614 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 553c7f35-d914-4af1-9846-a8cbe21f53f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.616 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[41a2dfe4-7c23-43f2-8881-be684852fc4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.635 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00290|binding|INFO|Setting lport 4929710e-eb4c-4144-9bca-64efc297e299 ovn-installed in OVS
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00291|binding|INFO|Setting lport 4929710e-eb4c-4144-9bca-64efc297e299 up in Southbound
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.640 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.714 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.757 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00292|binding|INFO|Releasing lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 from this chassis (sb_readonly=1)
Nov 28 10:04:50 np0005538513.localdomain kernel: device tapaad9e073-ac left promiscuous mode
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00293|if_status|INFO|Not setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 down as sb is readonly
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00294|binding|INFO|Setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 down in Southbound
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.766 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.773 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7becd79-4f22-46be-87af-f81de3e971b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=aad9e073-acbe-49ad-8b8e-e03f91cd53c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.774 158130 INFO neutron.agent.ovn.metadata.agent [-] Port aad9e073-acbe-49ad-8b8e-e03f91cd53c9 in datapath f6bc7039-ebcb-4d5c-bff1-81be4c2607bb unbound from our chassis
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.776 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.776 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[09054bd0-c1ef-444d-93c1-d0a3c26b3a36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.784 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:50.841 261084 INFO neutron.agent.linux.ip_lib [None req-59b9ca3f-4ae0-4c11-9d2d-542918d9c063 - - - - - -] Device tap02d1d927-32 cannot be used as it has no MAC address
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.844 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.844 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.870 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain kernel: device tap02d1d927-32 entered promiscuous mode
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.875 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00295|binding|INFO|Claiming lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 for this chassis.
Nov 28 10:04:50 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324290.8764] manager: (tap02d1d927-32): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00296|binding|INFO|02d1d927-321d-4f5a-aba3-0b3dba5bfaf4: Claiming unknown
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.892 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed1:953f/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=02d1d927-321d-4f5a-aba3-0b3dba5bfaf4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.893 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.896 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 37b04545-d7a0-418e-bffa-b4f816d3d9d5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.896 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:50.897 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1bee9c27-c373-4f51-a323-782ce99f1144]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.906 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00297|binding|INFO|Setting lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 ovn-installed in OVS
Nov 28 10:04:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:50Z|00298|binding|INFO|Setting lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 up in Southbound
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.955 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:50.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:51 np0005538513.localdomain ceph-mon[292954]: pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Nov 28 10:04:51 np0005538513.localdomain ceph-mon[292954]: osdmap e151: 6 total, 6 up, 6 in
Nov 28 10:04:51 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:51.135 2 INFO neutron.agent.securitygroups_rpc [None req-54cf6d1c-df90-4434-9ef1-afe91707ca30 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:51 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:51Z|00299|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:04:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:51.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:51 np0005538513.localdomain dnsmasq[318149]: exiting on receipt of SIGTERM
Nov 28 10:04:51 np0005538513.localdomain podman[319359]: 2025-11-28 10:04:51.513841057 +0000 UTC m=+0.050974511 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:51 np0005538513.localdomain systemd[1]: libpod-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730.scope: Deactivated successfully.
Nov 28 10:04:51 np0005538513.localdomain podman[319388]: 2025-11-28 10:04:51.568840182 +0000 UTC m=+0.040478690 container died 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:51 np0005538513.localdomain systemd[1]: tmp-crun.rIHmMB.mount: Deactivated successfully.
Nov 28 10:04:51 np0005538513.localdomain podman[319388]: 2025-11-28 10:04:51.615468279 +0000 UTC m=+0.087106767 container cleanup 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:04:51 np0005538513.localdomain systemd[1]: libpod-conmon-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730.scope: Deactivated successfully.
Nov 28 10:04:51 np0005538513.localdomain podman[319414]: 
Nov 28 10:04:51 np0005538513.localdomain podman[319414]: 2025-11-28 10:04:51.644388617 +0000 UTC m=+0.084272446 container create 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:04:51 np0005538513.localdomain systemd[1]: Started libpod-conmon-4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0.scope.
Nov 28 10:04:51 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:51 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9304802cff54ff9919214d297ca735a4bd2139de3fddfd5405c9e013af228f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:51 np0005538513.localdomain podman[319390]: 2025-11-28 10:04:51.702138422 +0000 UTC m=+0.170077154 container remove 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:04:51 np0005538513.localdomain podman[319414]: 2025-11-28 10:04:51.61550121 +0000 UTC m=+0.055385069 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:51.729 261084 INFO neutron.agent.dhcp.agent [None req-23531d2a-0141-488a-84c1-8f0df7621e66 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:51 np0005538513.localdomain podman[319414]: 2025-11-28 10:04:51.739800521 +0000 UTC m=+0.179684380 container init 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:51 np0005538513.localdomain podman[319414]: 2025-11-28 10:04:51.747629015 +0000 UTC m=+0.187512884 container start 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:51 np0005538513.localdomain dnsmasq[319457]: started, version 2.85 cachesize 150
Nov 28 10:04:51 np0005538513.localdomain dnsmasq[319457]: DNS service limited to local subnets
Nov 28 10:04:51 np0005538513.localdomain dnsmasq[319457]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:51 np0005538513.localdomain dnsmasq[319457]: warning: no upstream servers configured
Nov 28 10:04:51 np0005538513.localdomain dnsmasq-dhcp[319457]: DHCP, static leases only on 10.103.0.0, lease time 1d
Nov 28 10:04:51 np0005538513.localdomain dnsmasq[319457]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 0 addresses
Nov 28 10:04:51 np0005538513.localdomain dnsmasq-dhcp[319457]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host
Nov 28 10:04:51 np0005538513.localdomain dnsmasq-dhcp[319457]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts
Nov 28 10:04:51 np0005538513.localdomain dnsmasq[319457]: exiting on receipt of SIGTERM
Nov 28 10:04:51 np0005538513.localdomain systemd[1]: libpod-4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0.scope: Deactivated successfully.
Nov 28 10:04:51 np0005538513.localdomain podman[319466]: 2025-11-28 10:04:51.901143164 +0000 UTC m=+0.127010130 container died 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:51 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:51.909 2 INFO neutron.agent.securitygroups_rpc [None req-263d211e-775e-47b3-9274-70437dee437e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:51 np0005538513.localdomain podman[319466]: 2025-11-28 10:04:51.935578981 +0000 UTC m=+0.161445896 container cleanup 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:51.939 261084 INFO neutron.agent.dhcp.agent [None req-4fbd5d11-3723-4a2f-89f6-06cd393da79f - - - - - -] DHCP configuration for ports {'7cbbd458-cac1-440b-b157-44ec4d7deea5'} is completed
Nov 28 10:04:51 np0005538513.localdomain podman[319507]: 2025-11-28 10:04:51.988871908 +0000 UTC m=+0.083663859 container cleanup 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:04:51 np0005538513.localdomain systemd[1]: libpod-conmon-4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0.scope: Deactivated successfully.
Nov 28 10:04:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:51.997 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:52 np0005538513.localdomain podman[319497]: 
Nov 28 10:04:52 np0005538513.localdomain podman[319497]: 2025-11-28 10:04:52.027225267 +0000 UTC m=+0.154854998 container create a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:52 np0005538513.localdomain podman[319524]: 2025-11-28 10:04:52.030313226 +0000 UTC m=+0.077017278 container remove 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: Started libpod-conmon-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409.scope.
Nov 28 10:04:52 np0005538513.localdomain podman[319497]: 2025-11-28 10:04:51.977665867 +0000 UTC m=+0.105295688 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:52 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b0456f1e21b6dc3596ed21a7e347b4c98e62d6679920882846a52b9c4ebe74b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:52 np0005538513.localdomain podman[319497]: 2025-11-28 10:04:52.100784175 +0000 UTC m=+0.228413926 container init a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:52 np0005538513.localdomain podman[319497]: 2025-11-28 10:04:52.109984249 +0000 UTC m=+0.237614000 container start a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319562]: started, version 2.85 cachesize 150
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319562]: DNS service limited to local subnets
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319562]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319562]: warning: no upstream servers configured
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319562]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:52 np0005538513.localdomain podman[319543]: 
Nov 28 10:04:52 np0005538513.localdomain podman[319543]: 2025-11-28 10:04:52.137013073 +0000 UTC m=+0.086716166 container create 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: Started libpod-conmon-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd.scope.
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:04:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.171 261084 INFO neutron.agent.dhcp.agent [None req-59b9ca3f-4ae0-4c11-9d2d-542918d9c063 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:50Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64b4580>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64b4430>], id=f63a2be6-1492-465e-b2eb-d9c3e14e96a2, ip_allocation=immediate, mac_address=fa:16:3e:36:5b:fc, name=tempest-NetworksTestDHCPv6-1872098773, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['5fc650ff-2fcc-48db-bb2f-2b0928cc0382'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:47Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1801, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:50Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:52 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bcc12744151ca7e94791368e387c8ab451571c56040070ed3ec7c54499fb7f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:52 np0005538513.localdomain podman[319543]: 2025-11-28 10:04:52.091294393 +0000 UTC m=+0.040997526 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:52 np0005538513.localdomain podman[319543]: 2025-11-28 10:04:52.193064219 +0000 UTC m=+0.142767312 container init 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319577]: started, version 2.85 cachesize 150
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319577]: DNS service limited to local subnets
Nov 28 10:04:52 np0005538513.localdomain podman[319543]: 2025-11-28 10:04:52.213835765 +0000 UTC m=+0.163538828 container start 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319577]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319577]: warning: no upstream servers configured
Nov 28 10:04:52 np0005538513.localdomain dnsmasq-dhcp[319577]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 0 addresses
Nov 28 10:04:52 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host
Nov 28 10:04:52 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts
Nov 28 10:04:52 np0005538513.localdomain podman[319566]: 2025-11-28 10:04:52.272299709 +0000 UTC m=+0.088007362 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:04:52 np0005538513.localdomain podman[319566]: 2025-11-28 10:04:52.278777515 +0000 UTC m=+0.094485198 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:04:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.293 261084 INFO neutron.agent.dhcp.agent [None req-b05cf7bc-5b05-478e-bd7a-cd72748cb9da - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319562]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:04:52 np0005538513.localdomain podman[319610]: 2025-11-28 10:04:52.419865018 +0000 UTC m=+0.067811264 container kill a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:04:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.431 261084 INFO neutron.agent.dhcp.agent [None req-fa590fc1-dfab-4180-a82b-49a69d1c0c8c - - - - - -] DHCP configuration for ports {'06954665-2137-4b8e-888c-1d5516ae6541'} is completed
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: tmp-crun.xsm86W.mount: Deactivated successfully.
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-578830e16dc254a672039252f1b4aca3106a0a33f8e3c5f9da3689144398b5d1-merged.mount: Deactivated successfully.
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2df6bc7039\x2debcb\x2d4d5c\x2dbff1\x2d81be4c2607bb.mount: Deactivated successfully.
Nov 28 10:04:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.646 261084 INFO neutron.agent.dhcp.agent [None req-c2992044-d336-44ca-b6d0-6ce8c8535f9a - - - - - -] DHCP configuration for ports {'f63a2be6-1492-465e-b2eb-d9c3e14e96a2'} is completed
Nov 28 10:04:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:52Z|00300|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:04:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.722 261084 ERROR neutron.agent.linux.external_process [-] dnsmasq for dhcp with uuid 02cd8163-742c-4849-a0f3-35dad7f4a404 not found. The process should not have died
Nov 28 10:04:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.722 261084 WARNING neutron.agent.linux.external_process [-] Respawning dnsmasq for uuid 02cd8163-742c-4849-a0f3-35dad7f4a404
Nov 28 10:04:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:52.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:52 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.784 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:52Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64586a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6458bb0>], id=ca552b8f-155d-47a1-be25-a7aeb0006de8, ip_allocation=immediate, mac_address=fa:16:3e:c1:e6:2e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:46Z, description=, dns_domain=, id=02cd8163-742c-4849-a0f3-35dad7f4a404, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-162114118, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25348, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1774, status=ACTIVE, subnets=['d24e0b52-5dd2-4d29-98da-71dd46882b44'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=02cd8163-742c-4849-a0f3-35dad7f4a404, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1810, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:52Z on network 02cd8163-742c-4849-a0f3-35dad7f4a404
Nov 28 10:04:52 np0005538513.localdomain systemd[1]: tmp-crun.wXtWlT.mount: Deactivated successfully.
Nov 28 10:04:52 np0005538513.localdomain dnsmasq[319562]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:52 np0005538513.localdomain podman[319648]: 2025-11-28 10:04:52.831721589 +0000 UTC m=+0.048748597 container kill a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:04:53 np0005538513.localdomain ceph-mon[292954]: pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 4.1 KiB/s wr, 53 op/s
Nov 28 10:04:53 np0005538513.localdomain podman[319716]: 
Nov 28 10:04:53 np0005538513.localdomain podman[319716]: 2025-11-28 10:04:53.236518778 +0000 UTC m=+0.137905393 container create 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:53 np0005538513.localdomain podman[319746]: 2025-11-28 10:04:53.224651108 +0000 UTC m=+0.047762200 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:53 np0005538513.localdomain podman[319716]: 2025-11-28 10:04:53.183661594 +0000 UTC m=+0.085048269 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:53 np0005538513.localdomain systemd[1]: Started libpod-conmon-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0.scope.
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.305 261084 ERROR neutron.agent.linux.utils [None req-1cdd1bc3-5801-4696-a775-dc6c954a1d08 - - - - - -] Exit code: 125; Cmd: ['ip', 'netns', 'exec', 'qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404', 'env', 'PROCESS_TAG=dnsmasq-02cd8163-742c-4849-a0f3-35dad7f4a404', 'dnsmasq', '--no-hosts', '--no-resolv', '--pid-file=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/pid', '--dhcp-hostsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host', '--addn-hosts=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts', '--dhcp-optsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts', '--dhcp-leasefile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/leases', '--dhcp-match=set:ipxe,175', '--dhcp-userclass=set:ipxe6,iPXE', '--local-service', '--bind-dynamic', '--dhcp-range=set:subnet-d24e0b52-5dd2-4d29-98da-71dd46882b44,10.103.0.0,static,255.255.255.240,86400s', '--dhcp-option-force=option:mtu,1442', '--dhcp-lease-max=16', '--conf-file=/dev/null', '--domain=openstacklocal']; Stdin: ; Stdout: Starting a new child container neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: ; Stderr: Error: creating container storage: the container name "neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404" is already in use by 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0. You have to remove that container to be able to reuse that name: that name is already in use
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent [None req-1cdd1bc3-5801-4696-a775-dc6c954a1d08 - - - - - -] Unable to reload_allocations dhcp for 02cd8163-742c-4849-a0f3-35dad7f4a404.: neutron_lib.exceptions.ProcessExecutionError: Exit code: 125; Cmd: ['ip', 'netns', 'exec', 'qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404', 'env', 'PROCESS_TAG=dnsmasq-02cd8163-742c-4849-a0f3-35dad7f4a404', 'dnsmasq', '--no-hosts', '--no-resolv', '--pid-file=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/pid', '--dhcp-hostsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host', '--addn-hosts=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts', '--dhcp-optsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts', '--dhcp-leasefile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/leases', '--dhcp-match=set:ipxe,175', '--dhcp-userclass=set:ipxe6,iPXE', '--local-service', '--bind-dynamic', '--dhcp-range=set:subnet-d24e0b52-5dd2-4d29-98da-71dd46882b44,10.103.0.0,static,255.255.255.240,86400s', '--dhcp-option-force=option:mtu,1442', '--dhcp-lease-max=16', '--conf-file=/dev/null', '--domain=openstacklocal']; Stdin: ; Stdout: Starting a new child container neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: ; Stderr: Error: creating container storage: the container name "neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404" is already in use by 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0. You have to remove that container to be able to reuse that name: that name is already in use
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 671, in reload_allocations
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent     self._spawn_or_reload_process(reload_with_HUP=True)
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 603, in _spawn_or_reload_process
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent     pm.enable(reload_cfg=reload_with_HUP, ensure_active=True)
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 105, in enable
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent     ip_wrapper.netns.execute(cmd, addl_env=self.cmd_addl_env,
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 775, in execute
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent     return utils.execute(cmd, check_exit_code=check_exit_code,
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py", line 156, in execute
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent     raise exceptions.ProcessExecutionError(msg,
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent neutron_lib.exceptions.ProcessExecutionError: Exit code: 125; Cmd: ['ip', 'netns', 'exec', 'qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404', 'env', 'PROCESS_TAG=dnsmasq-02cd8163-742c-4849-a0f3-35dad7f4a404', 'dnsmasq', '--no-hosts', '--no-resolv', '--pid-file=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/pid', '--dhcp-hostsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host', '--addn-hosts=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts', '--dhcp-optsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts', '--dhcp-leasefile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/leases', '--dhcp-match=set:ipxe,175', '--dhcp-userclass=set:ipxe6,iPXE', '--local-service', '--bind-dynamic', '--dhcp-range=set:subnet-d24e0b52-5dd2-4d29-98da-71dd46882b44,10.103.0.0,static,255.255.255.240,86400s', '--dhcp-option-force=option:mtu,1442', '--dhcp-lease-max=16', '--conf-file=/dev/null', '--domain=openstacklocal']; Stdin: ; Stdout: Starting a new child container neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent ; Stderr: Error: creating container storage: the container name "neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404" is already in use by 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0. You have to remove that container to be able to reuse that name: that name is already in use
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:04:53 np0005538513.localdomain dnsmasq[319562]: exiting on receipt of SIGTERM
Nov 28 10:04:53 np0005538513.localdomain podman[319759]: 2025-11-28 10:04:53.317634923 +0000 UTC m=+0.072109208 container kill a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:04:53 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:53 np0005538513.localdomain systemd[1]: libpod-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409.scope: Deactivated successfully.
Nov 28 10:04:53 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22ad0fbf2ca34ae47ded5d35885e553d449cf886f83c141d08fd858e674bbcc6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:53 np0005538513.localdomain podman[319716]: 2025-11-28 10:04:53.332713704 +0000 UTC m=+0.234100319 container init 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:04:53 np0005538513.localdomain podman[319716]: 2025-11-28 10:04:53.34236422 +0000 UTC m=+0.243750825 container start 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 10:04:53 np0005538513.localdomain dnsmasq[319793]: started, version 2.85 cachesize 150
Nov 28 10:04:53 np0005538513.localdomain dnsmasq[319793]: DNS service limited to local subnets
Nov 28 10:04:53 np0005538513.localdomain dnsmasq[319793]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:53 np0005538513.localdomain dnsmasq[319793]: warning: no upstream servers configured
Nov 28 10:04:53 np0005538513.localdomain dnsmasq-dhcp[319793]: DHCP, static leases only on 10.103.0.0, lease time 1d
Nov 28 10:04:53 np0005538513.localdomain dnsmasq[319793]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 1 addresses
Nov 28 10:04:53 np0005538513.localdomain dnsmasq-dhcp[319793]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host
Nov 28 10:04:53 np0005538513.localdomain dnsmasq-dhcp[319793]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts
Nov 28 10:04:53 np0005538513.localdomain podman[319786]: 2025-11-28 10:04:53.377742645 +0000 UTC m=+0.038966368 container died a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.421 261084 INFO neutron.agent.dhcp.agent [None req-4b96f896-8a23-4d2e-9b85-484ff0d7515a - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8'} is completed
Nov 28 10:04:53 np0005538513.localdomain podman[319786]: 2025-11-28 10:04:53.431774532 +0000 UTC m=+0.092998225 container remove a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:53 np0005538513.localdomain systemd[1]: libpod-conmon-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409.scope: Deactivated successfully.
Nov 28 10:04:53 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:53Z|00301|binding|INFO|Releasing lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 from this chassis (sb_readonly=0)
Nov 28 10:04:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:53.446 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:53 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:53Z|00302|binding|INFO|Setting lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 down in Southbound
Nov 28 10:04:53 np0005538513.localdomain kernel: device tap02d1d927-32 left promiscuous mode
Nov 28 10:04:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:53.455 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed1:953f/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=02d1d927-321d-4f5a-aba3-0b3dba5bfaf4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:53.457 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:04:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:53.459 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:53.460 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[69571c45-a761-439a-8724-50402ddb0d51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:53.463 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7b0456f1e21b6dc3596ed21a7e347b4c98e62d6679920882846a52b9c4ebe74b-merged.mount: Deactivated successfully.
Nov 28 10:04:53 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.726 261084 INFO neutron.agent.dhcp.agent [None req-3a4c8826-6eeb-4084-b6e5-81cd4ed1d4d8 - - - - - -] Synchronizing state
Nov 28 10:04:53 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.968 261084 INFO neutron.agent.dhcp.agent [None req-8059cd0d-68ea-4676-82eb-987dfc66d573 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.969 261084 INFO neutron.agent.dhcp.agent [-] Starting network 02cd8163-742c-4849-a0f3-35dad7f4a404 dhcp configuration
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.973 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.977 261084 INFO neutron.agent.dhcp.agent [-] Starting network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration
Nov 28 10:04:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.977 261084 INFO neutron.agent.dhcp.agent [-] Finished network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration
Nov 28 10:04:54 np0005538513.localdomain dnsmasq[319793]: exiting on receipt of SIGTERM
Nov 28 10:04:54 np0005538513.localdomain systemd[1]: tmp-crun.upphEw.mount: Deactivated successfully.
Nov 28 10:04:54 np0005538513.localdomain podman[319823]: 2025-11-28 10:04:54.156256832 +0000 UTC m=+0.066639681 container kill 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:04:54 np0005538513.localdomain systemd[1]: libpod-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0.scope: Deactivated successfully.
Nov 28 10:04:54 np0005538513.localdomain podman[319837]: 2025-11-28 10:04:54.236318726 +0000 UTC m=+0.066526647 container died 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:54 np0005538513.localdomain podman[319837]: 2025-11-28 10:04:54.320178308 +0000 UTC m=+0.150386209 container cleanup 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:04:54 np0005538513.localdomain systemd[1]: libpod-conmon-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0.scope: Deactivated successfully.
Nov 28 10:04:54 np0005538513.localdomain podman[319839]: 2025-11-28 10:04:54.342523828 +0000 UTC m=+0.161260781 container remove 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:04:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-22ad0fbf2ca34ae47ded5d35885e553d449cf886f83c141d08fd858e674bbcc6-merged.mount: Deactivated successfully.
Nov 28 10:04:54 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:54 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:54.713 261084 INFO neutron.agent.linux.ip_lib [None req-6b10c8c6-0c7b-4f9b-89b3-b17147669ef4 - - - - - -] Device tap46c16dfb-cb cannot be used as it has no MAC address
Nov 28 10:04:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:54.754 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:54 np0005538513.localdomain kernel: device tap46c16dfb-cb entered promiscuous mode
Nov 28 10:04:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:54Z|00303|binding|INFO|Claiming lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 for this chassis.
Nov 28 10:04:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:54.761 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:54Z|00304|binding|INFO|46c16dfb-cb51-4790-9470-74b6d8c3c674: Claiming unknown
Nov 28 10:04:54 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324294.7631] manager: (tap46c16dfb-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Nov 28 10:04:54 np0005538513.localdomain systemd-udevd[319897]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:54Z|00305|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 ovn-installed in OVS
Nov 28 10:04:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:54Z|00306|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 up in Southbound
Nov 28 10:04:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:54.777 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:54.776 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:54.784 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:04:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:54.789 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7b0638af-ec04-4eda-8f5f-a54aa07bc574 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:54.789 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:54.790 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e00d664e-c3b5-4ab7-9d82-4dcb511ab153]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:54.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:54.804 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:54.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:54.873 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:55 np0005538513.localdomain ceph-mon[292954]: pgmap v266: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 3.6 KiB/s wr, 47 op/s
Nov 28 10:04:55 np0005538513.localdomain podman[319956]: 
Nov 28 10:04:55 np0005538513.localdomain podman[319956]: 2025-11-28 10:04:55.447662885 +0000 UTC m=+0.088138237 container create 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: Started libpod-conmon-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9.scope.
Nov 28 10:04:55 np0005538513.localdomain podman[319956]: 2025-11-28 10:04:55.404755686 +0000 UTC m=+0.045231078 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:55 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c235e185164438599917caaee605008003fd8fff25d4dac6b9b93fcc8e24479/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:55 np0005538513.localdomain podman[319956]: 2025-11-28 10:04:55.538072476 +0000 UTC m=+0.178547838 container init 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:04:55 np0005538513.localdomain podman[319956]: 2025-11-28 10:04:55.548264758 +0000 UTC m=+0.188740110 container start 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320009]: started, version 2.85 cachesize 150
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320009]: DNS service limited to local subnets
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320009]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320009]: warning: no upstream servers configured
Nov 28 10:04:55 np0005538513.localdomain dnsmasq-dhcp[320009]: DHCP, static leases only on 10.103.0.0, lease time 1d
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 1 addresses
Nov 28 10:04:55 np0005538513.localdomain dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host
Nov 28 10:04:55 np0005538513.localdomain dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts
Nov 28 10:04:55 np0005538513.localdomain podman[319970]: 2025-11-28 10:04:55.589556931 +0000 UTC m=+0.100802589 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:04:55 np0005538513.localdomain podman[319970]: 2025-11-28 10:04:55.594097691 +0000 UTC m=+0.105343329 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:55.605 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:04:55 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:55.615 261084 INFO neutron.agent.dhcp.agent [None req-180a39a0-3291-4909-9e5b-bc144ab088b8 - - - - - -] Finished network 02cd8163-742c-4849-a0f3-35dad7f4a404 dhcp configuration
Nov 28 10:04:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:55.669 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:55 np0005538513.localdomain podman[319969]: 2025-11-28 10:04:55.686865969 +0000 UTC m=+0.200646650 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:04:55 np0005538513.localdomain podman[319969]: 2025-11-28 10:04:55.771426303 +0000 UTC m=+0.285206984 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:04:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:55.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:55 np0005538513.localdomain podman[320038]: 
Nov 28 10:04:55 np0005538513.localdomain podman[320038]: 2025-11-28 10:04:55.871423957 +0000 UTC m=+0.145069368 container create 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: Started libpod-conmon-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad.scope.
Nov 28 10:04:55 np0005538513.localdomain podman[320038]: 2025-11-28 10:04:55.827999084 +0000 UTC m=+0.101644535 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:55 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:55 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:55.930 261084 INFO neutron.agent.dhcp.agent [None req-96514ed1-b1f9-4fe3-955b-4da9a2ab1882 - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8', '7cbbd458-cac1-440b-b157-44ec4d7deea5', 'd9d6658c-69f3-434a-a139-9146d8ddb475'} is completed
Nov 28 10:04:55 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27bf9817321185a883a9c43ffe6b9dfc96f140d8976c2f56aa39ee4eb45f7b55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:55 np0005538513.localdomain podman[320038]: 2025-11-28 10:04:55.941940628 +0000 UTC m=+0.215586029 container init 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:55 np0005538513.localdomain podman[320038]: 2025-11-28 10:04:55.951848842 +0000 UTC m=+0.225494243 container start 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320057]: started, version 2.85 cachesize 150
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320057]: DNS service limited to local subnets
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320057]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320057]: warning: no upstream servers configured
Nov 28 10:04:55 np0005538513.localdomain dnsmasq-dhcp[320057]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:04:55 np0005538513.localdomain dnsmasq[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:55 np0005538513.localdomain dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:55 np0005538513.localdomain dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.011 261084 INFO neutron.agent.dhcp.agent [None req-6b10c8c6-0c7b-4f9b-89b3-b17147669ef4 - - - - - -] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.012 261084 INFO neutron.agent.dhcp.agent [None req-8059cd0d-68ea-4676-82eb-987dfc66d573 - - - - - -] Synchronizing state complete
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.019 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:52Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64a6ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64a68e0>], id=ca552b8f-155d-47a1-be25-a7aeb0006de8, ip_allocation=immediate, mac_address=fa:16:3e:c1:e6:2e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:46Z, description=, dns_domain=, id=02cd8163-742c-4849-a0f3-35dad7f4a404, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-162114118, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25348, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1774, status=ACTIVE, subnets=['d24e0b52-5dd2-4d29-98da-71dd46882b44'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=02cd8163-742c-4849-a0f3-35dad7f4a404, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1810, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:52Z on network 02cd8163-742c-4849-a0f3-35dad7f4a404
Nov 28 10:04:56 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:56Z|00307|binding|INFO|Releasing lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 from this chassis (sb_readonly=0)
Nov 28 10:04:56 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:56Z|00308|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 down in Southbound
Nov 28 10:04:56 np0005538513.localdomain kernel: device tap46c16dfb-cb left promiscuous mode
Nov 28 10:04:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:56.055 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:56.062 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:56.065 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:04:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:56.069 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:56.071 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[741e55f9-e2b5-48fb-84e2-dcfd2a9a573f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:56.073 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:56 np0005538513.localdomain dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 0 addresses
Nov 28 10:04:56 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host
Nov 28 10:04:56 np0005538513.localdomain dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts
Nov 28 10:04:56 np0005538513.localdomain podman[320089]: 2025-11-28 10:04:56.213395376 +0000 UTC m=+0.068962976 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:56 np0005538513.localdomain dnsmasq[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 1 addresses
Nov 28 10:04:56 np0005538513.localdomain dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host
Nov 28 10:04:56 np0005538513.localdomain podman[320105]: 2025-11-28 10:04:56.275797295 +0000 UTC m=+0.074899347 container kill 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:56 np0005538513.localdomain dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.355 261084 INFO neutron.agent.dhcp.agent [None req-b160f751-2d82-4274-8c9c-feef1c9f442c - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8', 'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '7cbbd458-cac1-440b-b157-44ec4d7deea5', 'd9d6658c-69f3-434a-a139-9146d8ddb475'} is completed
Nov 28 10:04:56 np0005538513.localdomain dnsmasq[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:04:56 np0005538513.localdomain dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:56 np0005538513.localdomain podman[320144]: 2025-11-28 10:04:56.433394991 +0000 UTC m=+0.074169477 container kill 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:56 np0005538513.localdomain dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent [None req-a9d75015-13fd-417c-ada1-435f3853c34d - - - - - -] Unable to reload_allocations dhcp for 8642adde-54ae-4fc2-b997-bf1962c6c7f1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap46c16dfb-cb not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1.
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap46c16dfb-cb not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1.
Nov 28 10:04:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:04:56 np0005538513.localdomain systemd[1]: tmp-crun.PPVIQv.mount: Deactivated successfully.
Nov 28 10:04:56 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:56Z|00309|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:04:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:57.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:57.025 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:57.057 2 INFO neutron.agent.securitygroups_rpc [None req-1b122d45-69b6-44ae-9f44-6255649c2a99 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.082 261084 INFO neutron.agent.dhcp.agent [None req-b4f5f919-ee1a-4a3b-8b11-6d4d21b4686b - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8', 'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '46c16dfb-cb51-4790-9470-74b6d8c3c674'} is completed
Nov 28 10:04:57 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:57Z|00310|binding|INFO|Removing iface tap404598dd-47 ovn-installed in OVS
Nov 28 10:04:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:57.098 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8dc10fc1-17cf-4461-b700-c1adb49acf0d with type ""
Nov 28 10:04:57 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:57Z|00311|binding|INFO|Removing lport 404598dd-4706-4aa3-a857-56207d0fd483 ovn-installed in OVS
Nov 28 10:04:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:57.100 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:57.102 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d9cc17b-2c39-4130-a2f2-9d12894eaf52, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=404598dd-4706-4aa3-a857-56207d0fd483) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:57.104 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:57.107 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 404598dd-4706-4aa3-a857-56207d0fd483 in datapath 1a246530-be70-4846-9202-8f9cd6d862ae unbound from our chassis
Nov 28 10:04:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:57.109 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1a246530-be70-4846-9202-8f9cd6d862ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:57.110 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0b615385-1609-435a-8639-1bcc1e8791dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:57 np0005538513.localdomain ceph-mon[292954]: pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.1 KiB/s wr, 39 op/s
Nov 28 10:04:57 np0005538513.localdomain dnsmasq[317792]: exiting on receipt of SIGTERM
Nov 28 10:04:57 np0005538513.localdomain podman[320180]: 2025-11-28 10:04:57.219961279 +0000 UTC m=+0.048373308 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: libpod-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5.scope: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:04:57 np0005538513.localdomain podman[320194]: 2025-11-28 10:04:57.298102267 +0000 UTC m=+0.061272206 container died 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:57 np0005538513.localdomain podman[320194]: 2025-11-28 10:04:57.335133449 +0000 UTC m=+0.098303338 container cleanup 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: libpod-conmon-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5.scope: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain podman[320195]: 2025-11-28 10:04:57.37739737 +0000 UTC m=+0.137979884 container remove 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:57.390 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538513.localdomain kernel: device tap404598dd-47 left promiscuous mode
Nov 28 10:04:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:57.406 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.422 261084 INFO neutron.agent.dhcp.agent [None req-8059cd0d-68ea-4676-82eb-987dfc66d573 - - - - - -] Synchronizing state
Nov 28 10:04:57 np0005538513.localdomain podman[320206]: 2025-11-28 10:04:57.444491263 +0000 UTC m=+0.191993223 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:04:57 np0005538513.localdomain podman[320206]: 2025-11-28 10:04:57.459483062 +0000 UTC m=+0.206985022 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f57866dc5c6162f1b2474906fd9f4e8b712b44716f63d14a9891a2098561e5e3-merged.mount: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d1a246530\x2dbe70\x2d4846\x2d9202\x2d8f9cd6d862ae.mount: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.642 261084 INFO neutron.agent.dhcp.agent [None req-ade9d9ba-7777-4bec-957e-af24ecb3c5ae - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.643 261084 INFO neutron.agent.dhcp.agent [-] Starting network 1a246530-be70-4846-9202-8f9cd6d862ae dhcp configuration
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.644 261084 INFO neutron.agent.dhcp.agent [-] Finished network 1a246530-be70-4846-9202-8f9cd6d862ae dhcp configuration
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.644 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.648 261084 INFO neutron.agent.dhcp.agent [-] Starting network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration
Nov 28 10:04:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.649 261084 INFO neutron.agent.dhcp.agent [-] Finished network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration
Nov 28 10:04:57 np0005538513.localdomain dnsmasq[320057]: exiting on receipt of SIGTERM
Nov 28 10:04:57 np0005538513.localdomain podman[320257]: 2025-11-28 10:04:57.826437587 +0000 UTC m=+0.064435848 container kill 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: libpod-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad.scope: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain podman[320270]: 2025-11-28 10:04:57.902901747 +0000 UTC m=+0.057655392 container died 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:57 np0005538513.localdomain podman[320270]: 2025-11-28 10:04:57.937946332 +0000 UTC m=+0.092699947 container cleanup 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:57 np0005538513.localdomain systemd[1]: libpod-conmon-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad.scope: Deactivated successfully.
Nov 28 10:04:57 np0005538513.localdomain podman[320271]: 2025-11-28 10:04:57.979861953 +0000 UTC m=+0.128087371 container remove 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:58 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:58.028 261084 INFO neutron.agent.linux.ip_lib [-] Device tap46c16dfb-cb cannot be used as it has no MAC address
Nov 28 10:04:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:58.086 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538513.localdomain kernel: device tap46c16dfb-cb entered promiscuous mode
Nov 28 10:04:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:58.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324298.0979] manager: (tap46c16dfb-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Nov 28 10:04:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:58Z|00312|binding|INFO|Claiming lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 for this chassis.
Nov 28 10:04:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:58Z|00313|binding|INFO|46c16dfb-cb51-4790-9470-74b6d8c3c674: Claiming unknown
Nov 28 10:04:58 np0005538513.localdomain systemd-udevd[320303]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:58.111 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:58.113 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:04:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:58.116 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7b0638af-ec04-4eda-8f5f-a54aa07bc574 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:04:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:58.117 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:04:58.117 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d44c8b46-168f-4009-8f50-ff43eb3b543c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:58.135 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:58Z|00314|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 ovn-installed in OVS
Nov 28 10:04:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:58Z|00315|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 up in Southbound
Nov 28 10:04:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:58.140 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device
Nov 28 10:04:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:58.182 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:58.217 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:04:58Z|00316|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:04:58 np0005538513.localdomain systemd[1]: tmp-crun.yVnVTR.mount: Deactivated successfully.
Nov 28 10:04:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-27bf9817321185a883a9c43ffe6b9dfc96f140d8976c2f56aa39ee4eb45f7b55-merged.mount: Deactivated successfully.
Nov 28 10:04:58 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:04:58.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538513.localdomain snmpd[66832]: empty variable list in _query
Nov 28 10:04:58 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:04:58.754 2 INFO neutron.agent.securitygroups_rpc [None req-d1d37070-b21e-47b1-9333-9a0acdf29e79 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:59 np0005538513.localdomain podman[320373]: 
Nov 28 10:04:59 np0005538513.localdomain podman[320373]: 2025-11-28 10:04:59.083804545 +0000 UTC m=+0.095131757 container create 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:59 np0005538513.localdomain systemd[1]: Started libpod-conmon-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc.scope.
Nov 28 10:04:59 np0005538513.localdomain podman[320373]: 2025-11-28 10:04:59.03722524 +0000 UTC m=+0.048552472 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:59 np0005538513.localdomain systemd[1]: tmp-crun.q1Abty.mount: Deactivated successfully.
Nov 28 10:04:59 np0005538513.localdomain ceph-mon[292954]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:59 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:59 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a859774efe97f31726e746b14ef1f2253fd05145ded52792ece98c18205a30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:59 np0005538513.localdomain podman[320373]: 2025-11-28 10:04:59.178625022 +0000 UTC m=+0.189952244 container init 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:59 np0005538513.localdomain podman[320373]: 2025-11-28 10:04:59.187330442 +0000 UTC m=+0.198657654 container start 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:59 np0005538513.localdomain dnsmasq[320391]: started, version 2.85 cachesize 150
Nov 28 10:04:59 np0005538513.localdomain dnsmasq[320391]: DNS service limited to local subnets
Nov 28 10:04:59 np0005538513.localdomain dnsmasq[320391]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:59 np0005538513.localdomain dnsmasq[320391]: warning: no upstream servers configured
Nov 28 10:04:59 np0005538513.localdomain dnsmasq-dhcp[320391]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:04:59 np0005538513.localdomain dnsmasq[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:04:59 np0005538513.localdomain dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:59 np0005538513.localdomain dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.257 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.257 261084 INFO neutron.agent.dhcp.agent [None req-ade9d9ba-7777-4bec-957e-af24ecb3c5ae - - - - - -] Synchronizing state complete
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.259 261084 INFO neutron.agent.dhcp.agent [None req-d0f1f13b-6b00-42f2-b13c-f49ab1a924af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.260 261084 INFO neutron.agent.dhcp.agent [None req-d0f1f13b-6b00-42f2-b13c-f49ab1a924af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.260 261084 INFO neutron.agent.dhcp.agent [None req-d0f1f13b-6b00-42f2-b13c-f49ab1a924af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.261 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:56Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64d6c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6eb50a0>], id=584ff574-1899-4b39-a9e3-57a01ba01ba1, ip_allocation=immediate, mac_address=fa:16:3e:0b:06:78, name=tempest-NetworksTestDHCPv6-1476856072, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=44, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['848b098f-2f4e-4153-b931-bae96c03b751'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:53Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1828, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:56Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.274 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:57Z, description=, device_id=0f485132-8e02-47a3-b2f5-564423c3ef9b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b0460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65b05e0>], id=8d532c0c-347a-4459-8005-d390d68f5b23, ip_allocation=immediate, mac_address=fa:16:3e:b7:1c:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:47Z, description=, dns_domain=, id=553c7f35-d914-4af1-9846-a8cbe21f53f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1486257592-network, port_security_enabled=True, project_id=aa5be61eafca4d96976422f0e0103210, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1778, status=ACTIVE, subnets=['c1ca5641-5630-4c33-a102-f6b9f86bd61c'], tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=553c7f35-d914-4af1-9846-a8cbe21f53f3, port_security_enabled=False, project_id=aa5be61eafca4d96976422f0e0103210, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1834, status=DOWN, tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:57Z on network 553c7f35-d914-4af1-9846-a8cbe21f53f3
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.354 261084 INFO neutron.agent.dhcp.agent [None req-9edbe1b1-8a4d-493b-877d-766168ff95e6 - - - - - -] DHCP configuration for ports {'584ff574-1899-4b39-a9e3-57a01ba01ba1', 'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '46c16dfb-cb51-4790-9470-74b6d8c3c674'} is completed
Nov 28 10:04:59 np0005538513.localdomain dnsmasq[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:04:59 np0005538513.localdomain dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:04:59 np0005538513.localdomain dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:04:59 np0005538513.localdomain podman[320425]: 2025-11-28 10:04:59.493679659 +0000 UTC m=+0.064108248 container kill 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:04:59 np0005538513.localdomain podman[320433]: 2025-11-28 10:04:59.516459542 +0000 UTC m=+0.060495354 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:04:59 np0005538513.localdomain dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 1 addresses
Nov 28 10:04:59 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host
Nov 28 10:04:59 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.825 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:57Z, description=, device_id=0f485132-8e02-47a3-b2f5-564423c3ef9b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd67c3af0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66593a0>], id=8d532c0c-347a-4459-8005-d390d68f5b23, ip_allocation=immediate, mac_address=fa:16:3e:b7:1c:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:47Z, description=, dns_domain=, id=553c7f35-d914-4af1-9846-a8cbe21f53f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1486257592-network, port_security_enabled=True, project_id=aa5be61eafca4d96976422f0e0103210, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1778, status=ACTIVE, subnets=['c1ca5641-5630-4c33-a102-f6b9f86bd61c'], tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=553c7f35-d914-4af1-9846-a8cbe21f53f3, port_security_enabled=False, project_id=aa5be61eafca4d96976422f0e0103210, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1834, status=DOWN, tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:57Z on network 553c7f35-d914-4af1-9846-a8cbe21f53f3
Nov 28 10:04:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.836 261084 INFO neutron.agent.dhcp.agent [None req-25fa700c-a8a5-4f94-8fd2-934793928fac - - - - - -] DHCP configuration for ports {'584ff574-1899-4b39-a9e3-57a01ba01ba1', '8d532c0c-347a-4459-8005-d390d68f5b23'} is completed
Nov 28 10:05:00 np0005538513.localdomain dnsmasq[320391]: exiting on receipt of SIGTERM
Nov 28 10:05:00 np0005538513.localdomain systemd[1]: libpod-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc.scope: Deactivated successfully.
Nov 28 10:05:00 np0005538513.localdomain podman[320493]: 2025-11-28 10:05:00.008135301 +0000 UTC m=+0.075693560 container kill 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:05:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:00 np0005538513.localdomain podman[320512]: 2025-11-28 10:05:00.073255947 +0000 UTC m=+0.073304692 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:05:00 np0005538513.localdomain dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 1 addresses
Nov 28 10:05:00 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host
Nov 28 10:05:00 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts
Nov 28 10:05:00 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:00.091 2 INFO neutron.agent.securitygroups_rpc [None req-f8f0dbe9-4862-4719-841c-e92cc8d478e0 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:00 np0005538513.localdomain podman[320519]: 2025-11-28 10:05:00.0970976 +0000 UTC m=+0.072246801 container died 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:05:00 np0005538513.localdomain podman[320519]: 2025-11-28 10:05:00.182483036 +0000 UTC m=+0.157632207 container cleanup 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:05:00 np0005538513.localdomain systemd[1]: libpod-conmon-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc.scope: Deactivated successfully.
Nov 28 10:05:00 np0005538513.localdomain podman[320524]: 2025-11-28 10:05:00.212121736 +0000 UTC m=+0.177596410 container remove 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:05:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:00Z|00317|binding|INFO|Releasing lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 from this chassis (sb_readonly=0)
Nov 28 10:05:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:00.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:00 np0005538513.localdomain kernel: device tap46c16dfb-cb left promiscuous mode
Nov 28 10:05:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:00Z|00318|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 down in Southbound
Nov 28 10:05:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:00.279 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:00.281 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:05:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:00.284 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:00.285 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b66fbf-2eda-4992-9239-1571b910e761]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:00.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:00.452 261084 INFO neutron.agent.dhcp.agent [None req-cf608d3b-1a14-4e77-9c42-108b9264096c - - - - - -] DHCP configuration for ports {'8d532c0c-347a-4459-8005-d390d68f5b23'} is completed
Nov 28 10:05:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f1a859774efe97f31726e746b14ef1f2253fd05145ded52792ece98c18205a30-merged.mount: Deactivated successfully.
Nov 28 10:05:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:00 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:05:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:00.607 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:00.672 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:01 np0005538513.localdomain ceph-mon[292954]: pgmap v269: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:01 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:01.175 261084 INFO neutron.agent.linux.ip_lib [None req-b7029a3b-7f98-4652-bc32-843c78e0ba8d - - - - - -] Device tapca3c90c8-d0 cannot be used as it has no MAC address
Nov 28 10:05:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:01.201 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:01 np0005538513.localdomain kernel: device tapca3c90c8-d0 entered promiscuous mode
Nov 28 10:05:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:01Z|00319|binding|INFO|Claiming lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 for this chassis.
Nov 28 10:05:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:01.208 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:01Z|00320|binding|INFO|ca3c90c8-d07c-44d0-a7ec-af787d805dd0: Claiming unknown
Nov 28 10:05:01 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324301.2123] manager: (tapca3c90c8-d0): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Nov 28 10:05:01 np0005538513.localdomain systemd-udevd[320573]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:01.216 2 INFO neutron.agent.securitygroups_rpc [None req-8e254d5a-cb3d-4d3c-8c34-ca57b16025b1 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:01Z|00321|binding|INFO|Setting lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 ovn-installed in OVS
Nov 28 10:05:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:01Z|00322|binding|INFO|Setting lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 up in Southbound
Nov 28 10:05:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:01.220 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:01.217 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=ca3c90c8-d07c-44d0-a7ec-af787d805dd0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:01.219 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ca3c90c8-d07c-44d0-a7ec-af787d805dd0 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:05:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:01.223 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port fff83133-53ad-4b5f-9ec4-5a9c4ae5262d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:05:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:01.223 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:01.225 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:01.224 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9b720b-f842-4f76-a283-897b95302510]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:01.257 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:01.340 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:01.964 2 INFO neutron.agent.securitygroups_rpc [None req-70ec53ac-b8b3-4633-a977-c5aaaab920ff 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.175 261084 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpdn9o7yfg/privsep.sock']
Nov 28 10:05:02 np0005538513.localdomain podman[320628]: 2025-11-28 10:05:02.240157098 +0000 UTC m=+0.101765128 container create 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:05:02 np0005538513.localdomain systemd[1]: Started libpod-conmon-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3.scope.
Nov 28 10:05:02 np0005538513.localdomain podman[320628]: 2025-11-28 10:05:02.190735651 +0000 UTC m=+0.052343721 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/263de5eac5bcb424e9045572c27154229aa4a2d3b22ed18705a18fbb1cd475fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:02 np0005538513.localdomain podman[320628]: 2025-11-28 10:05:02.347297278 +0000 UTC m=+0.208905318 container init 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:05:02 np0005538513.localdomain podman[320628]: 2025-11-28 10:05:02.360970529 +0000 UTC m=+0.222578559 container start 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:02 np0005538513.localdomain dnsmasq[320649]: started, version 2.85 cachesize 150
Nov 28 10:05:02 np0005538513.localdomain dnsmasq[320649]: DNS service limited to local subnets
Nov 28 10:05:02 np0005538513.localdomain dnsmasq[320649]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:02 np0005538513.localdomain dnsmasq[320649]: warning: no upstream servers configured
Nov 28 10:05:02 np0005538513.localdomain dnsmasq-dhcp[320649]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:02 np0005538513.localdomain dnsmasq[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:02 np0005538513.localdomain dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:02 np0005538513.localdomain dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.429 261084 INFO neutron.agent.dhcp.agent [None req-b7029a3b-7f98-4652-bc32-843c78e0ba8d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:01Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6649be0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd67d38b0>], id=25b54e74-3d8f-4eda-9085-11cfc4264ad0, ip_allocation=immediate, mac_address=fa:16:3e:91:7b:bc, name=tempest-NetworksTestDHCPv6-1477056337, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['c2382461-c6a6-404b-bdb2-c92b918cec3f'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:00Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1841, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:01Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.504 261084 INFO neutron.agent.dhcp.agent [None req-07be24c6-1979-4309-8cfa-faced2519064 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:05:02 np0005538513.localdomain podman[320669]: 2025-11-28 10:05:02.629976097 +0000 UTC m=+0.067261548 container kill 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:05:02 np0005538513.localdomain dnsmasq[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 10:05:02 np0005538513.localdomain dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:02 np0005538513.localdomain dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:02 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:02.746 2 INFO neutron.agent.securitygroups_rpc [None req-472369ad-637e-463f-8142-3dff7a706106 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.851 261084 INFO neutron.agent.dhcp.agent [None req-488aa661-66d8-441d-8869-df90f7f4f979 - - - - - -] DHCP configuration for ports {'25b54e74-3d8f-4eda-9085-11cfc4264ad0'} is completed
Nov 28 10:05:02 np0005538513.localdomain kernel: device tapca3c90c8-d0 left promiscuous mode
Nov 28 10:05:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:02.876 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:02 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:02Z|00323|binding|INFO|Releasing lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 from this chassis (sb_readonly=0)
Nov 28 10:05:02 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:02Z|00324|binding|INFO|Setting lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 down in Southbound
Nov 28 10:05:02 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:02.888 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=ca3c90c8-d07c-44d0-a7ec-af787d805dd0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:02 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:02.890 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ca3c90c8-d07c-44d0-a7ec-af787d805dd0 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:05:02 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:02.893 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:02 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:02.895 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5350c6fa-94c5-4051-a214-df074ff967b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:02.901 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:02.902 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.919 261084 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.795 320689 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.800 320689 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.804 320689 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 10:05:02 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.804 320689 INFO oslo.privsep.daemon [-] privsep daemon running as pid 320689
Nov 28 10:05:02 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:02.967 2 INFO neutron.agent.securitygroups_rpc [None req-a8642c38-ef8d-4c54-aca2-47d1d691b2fe 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:03 np0005538513.localdomain dnsmasq-dhcp[320009]: DHCPRELEASE(tapd9d6658c-69) 10.103.0.1 fa:16:3e:c1:e6:2e
Nov 28 10:05:03 np0005538513.localdomain ceph-mon[292954]: pgmap v270: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:03 np0005538513.localdomain dnsmasq[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:03 np0005538513.localdomain dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:03 np0005538513.localdomain dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:03 np0005538513.localdomain podman[320713]: 2025-11-28 10:05:03.215537986 +0000 UTC m=+0.069123862 container kill 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 8642adde-54ae-4fc2-b997-bf1962c6c7f1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapca3c90c8-d0 not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1.
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:05:03 np0005538513.localdomain systemd[1]: tmp-crun.L5x7lN.mount: Deactivated successfully.
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapca3c90c8-d0 not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1.
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:05:03 np0005538513.localdomain dnsmasq[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 0 addresses
Nov 28 10:05:03 np0005538513.localdomain dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host
Nov 28 10:05:03 np0005538513.localdomain dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts
Nov 28 10:05:03 np0005538513.localdomain systemd[1]: tmp-crun.8WSddx.mount: Deactivated successfully.
Nov 28 10:05:03 np0005538513.localdomain podman[320745]: 2025-11-28 10:05:03.699152943 +0000 UTC m=+0.071901991 container kill 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:03.766 2 INFO neutron.agent.securitygroups_rpc [None req-551a0a6f-26c0-4f59-8a71-37fd214c141c 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.852 261084 INFO neutron.agent.dhcp.agent [None req-ade9d9ba-7777-4bec-957e-af24ecb3c5ae - - - - - -] Synchronizing state
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.216 261084 INFO neutron.agent.dhcp.agent [None req-6b0cd380-e608-4081-8967-4a3f74b64491 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.218 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.219 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.219 261084 INFO neutron.agent.dhcp.agent [-] Starting network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.220 261084 INFO neutron.agent.dhcp.agent [-] Finished network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.221 261084 INFO neutron.agent.dhcp.agent [None req-6b0cd380-e608-4081-8967-4a3f74b64491 - - - - - -] Synchronizing state complete
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.321 261084 INFO neutron.agent.dhcp.agent [None req-cd78bbba-0ba2-4bd9-aa83-ffdb71baf0d8 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:05:04 np0005538513.localdomain dnsmasq[320649]: exiting on receipt of SIGTERM
Nov 28 10:05:04 np0005538513.localdomain systemd[1]: libpod-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3.scope: Deactivated successfully.
Nov 28 10:05:04 np0005538513.localdomain podman[320794]: 2025-11-28 10:05:04.516986797 +0000 UTC m=+0.072735985 container kill 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:05:04 np0005538513.localdomain dnsmasq[320009]: exiting on receipt of SIGTERM
Nov 28 10:05:04 np0005538513.localdomain podman[320808]: 2025-11-28 10:05:04.563959483 +0000 UTC m=+0.067178716 container kill 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:04 np0005538513.localdomain systemd[1]: libpod-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9.scope: Deactivated successfully.
Nov 28 10:05:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:04Z|00325|binding|INFO|Removing iface tapd9d6658c-69 ovn-installed in OVS
Nov 28 10:05:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:04.593 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 01ac1c86-3a21-4ed2-98b0-bdb2d1c9caf2 with type ""
Nov 28 10:05:04 np0005538513.localdomain podman[320822]: 2025-11-28 10:05:04.594363904 +0000 UTC m=+0.050295842 container died 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:05:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:04.595 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fc78fcb-5ea1-409a-899f-c137c1b47b0b, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=d9d6658c-69f3-434a-a139-9146d8ddb475) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:04.597 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d9d6658c-69f3-434a-a139-9146d8ddb475 in datapath 02cd8163-742c-4849-a0f3-35dad7f4a404 unbound from our chassis
Nov 28 10:05:04 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:04Z|00326|binding|INFO|Removing lport d9d6658c-69f3-434a-a139-9146d8ddb475 ovn-installed in OVS
Nov 28 10:05:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:04.601 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02cd8163-742c-4849-a0f3-35dad7f4a404, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:04 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:04.627 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[deab7a6e-3026-4761-8c1b-6511280ecdec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:04.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:04 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:04 np0005538513.localdomain podman[320822]: 2025-11-28 10:05:04.671033621 +0000 UTC m=+0.126965559 container remove 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:05:04 np0005538513.localdomain podman[320846]: 2025-11-28 10:05:04.680795311 +0000 UTC m=+0.099061989 container died 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:05:04 np0005538513.localdomain systemd[1]: libpod-conmon-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3.scope: Deactivated successfully.
Nov 28 10:05:04 np0005538513.localdomain podman[320846]: 2025-11-28 10:05:04.723470044 +0000 UTC m=+0.141736682 container remove 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:05:04 np0005538513.localdomain systemd[1]: libpod-conmon-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9.scope: Deactivated successfully.
Nov 28 10:05:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:04.736 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:04 np0005538513.localdomain kernel: device tapd9d6658c-69 left promiscuous mode
Nov 28 10:05:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:04.759 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.783 261084 INFO neutron.agent.dhcp.agent [None req-b449a3b4-435b-49ad-a640-df2d3f2d856c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.784 261084 INFO neutron.agent.dhcp.agent [None req-b449a3b4-435b-49ad-a640-df2d3f2d856c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.900 261084 INFO neutron.agent.dhcp.agent [None req-6db7547d-570c-4aa1-acb2-96d3bc174ee7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:05 np0005538513.localdomain ceph-mon[292954]: pgmap v271: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:05 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:05.223 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-263de5eac5bcb424e9045572c27154229aa4a2d3b22ed18705a18fbb1cd475fa-merged.mount: Deactivated successfully.
Nov 28 10:05:05 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:05:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4c235e185164438599917caaee605008003fd8fff25d4dac6b9b93fcc8e24479-merged.mount: Deactivated successfully.
Nov 28 10:05:05 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:05 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d02cd8163\x2d742c\x2d4849\x2da0f3\x2d35dad7f4a404.mount: Deactivated successfully.
Nov 28 10:05:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:05.609 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:05 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:05Z|00327|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:05:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:05.708 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:05.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538513.localdomain ceph-mon[292954]: pgmap v272: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:07 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:07.308 261084 INFO neutron.agent.linux.ip_lib [None req-400d09d0-dbfd-4c26-b2f8-8d9d865d4979 - - - - - -] Device tap56bccdb5-6f cannot be used as it has no MAC address
Nov 28 10:05:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:07.341 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538513.localdomain kernel: device tap56bccdb5-6f entered promiscuous mode
Nov 28 10:05:07 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324307.3503] manager: (tap56bccdb5-6f): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Nov 28 10:05:07 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:07Z|00328|binding|INFO|Claiming lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 for this chassis.
Nov 28 10:05:07 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:07Z|00329|binding|INFO|56bccdb5-6fe9-44a9-92c9-93e6e7a30192: Claiming unknown
Nov 28 10:05:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:07.353 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538513.localdomain systemd-udevd[320887]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:07.367 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3d:5072/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=56bccdb5-6fe9-44a9-92c9-93e6e7a30192) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:07.370 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:05:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:07.373 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 980ba5c9-99ee-41f5-8394-98a27123bb4d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:05:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:07.373 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:07.374 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8318cc-3444-42a4-91f4-515908b5a752]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:07Z|00330|binding|INFO|Setting lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 ovn-installed in OVS
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:07Z|00331|binding|INFO|Setting lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 up in Southbound
Nov 28 10:05:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:07.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device
Nov 28 10:05:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:07.457 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:07.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:08 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:08.235 2 INFO neutron.agent.securitygroups_rpc [None req-687de3fc-aa2f-4499-970c-3ba0a56c0388 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:08 np0005538513.localdomain podman[320958]: 
Nov 28 10:05:08 np0005538513.localdomain podman[320958]: 2025-11-28 10:05:08.379190523 +0000 UTC m=+0.088496897 container create 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: Started libpod-conmon-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f.scope.
Nov 28 10:05:08 np0005538513.localdomain podman[320958]: 2025-11-28 10:05:08.336310605 +0000 UTC m=+0.045617029 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: tmp-crun.M8Dvxf.mount: Deactivated successfully.
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:08 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6069e3726f8d077e724937be22b6056e6a37f5666120f5627dd7e6f3d4cdecf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:08 np0005538513.localdomain podman[320958]: 2025-11-28 10:05:08.464643113 +0000 UTC m=+0.173949497 container init 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:08 np0005538513.localdomain podman[320958]: 2025-11-28 10:05:08.475310887 +0000 UTC m=+0.184617271 container start 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:05:08 np0005538513.localdomain dnsmasq[320998]: started, version 2.85 cachesize 150
Nov 28 10:05:08 np0005538513.localdomain dnsmasq[320998]: DNS service limited to local subnets
Nov 28 10:05:08 np0005538513.localdomain dnsmasq[320998]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:08 np0005538513.localdomain dnsmasq[320998]: warning: no upstream servers configured
Nov 28 10:05:08 np0005538513.localdomain dnsmasq[320998]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:08 np0005538513.localdomain podman[320972]: 2025-11-28 10:05:08.515730436 +0000 UTC m=+0.087770927 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:05:08 np0005538513.localdomain podman[320972]: 2025-11-28 10:05:08.524729634 +0000 UTC m=+0.096770195 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:05:08 np0005538513.localdomain podman[320973]: 2025-11-28 10:05:08.569399784 +0000 UTC m=+0.138748467 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd)
Nov 28 10:05:08 np0005538513.localdomain podman[320973]: 2025-11-28 10:05:08.579172373 +0000 UTC m=+0.148521076 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 28 10:05:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:08.586 261084 INFO neutron.agent.dhcp.agent [None req-5d88e603-6df3-4f8d-8add-470612fe3b7f - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:05:08 np0005538513.localdomain dnsmasq[320998]: exiting on receipt of SIGTERM
Nov 28 10:05:08 np0005538513.localdomain podman[321036]: 2025-11-28 10:05:08.891877514 +0000 UTC m=+0.065700433 container kill 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:05:08 np0005538513.localdomain systemd[1]: libpod-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f.scope: Deactivated successfully.
Nov 28 10:05:08 np0005538513.localdomain podman[321050]: 2025-11-28 10:05:08.967086049 +0000 UTC m=+0.057006634 container died 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:05:09 np0005538513.localdomain podman[321050]: 2025-11-28 10:05:09.000936839 +0000 UTC m=+0.090857454 container cleanup 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:09 np0005538513.localdomain systemd[1]: libpod-conmon-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f.scope: Deactivated successfully.
Nov 28 10:05:09 np0005538513.localdomain podman[321051]: 2025-11-28 10:05:09.045930588 +0000 UTC m=+0.131722775 container remove 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:05:09 np0005538513.localdomain ceph-mon[292954]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b6069e3726f8d077e724937be22b6056e6a37f5666120f5627dd7e6f3d4cdecf-merged.mount: Deactivated successfully.
Nov 28 10:05:09 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:05:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:05:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:05:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:05:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:05:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19744 "" "Go-http-client/1.1"
Nov 28 10:05:10 np0005538513.localdomain podman[321127]: 
Nov 28 10:05:10 np0005538513.localdomain podman[321127]: 2025-11-28 10:05:10.331808814 +0000 UTC m=+0.095732744 container create d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:05:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:10.357 2 INFO neutron.agent.securitygroups_rpc [None req-20bc108e-ba14-415c-a7d5-38bda2943a27 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:10 np0005538513.localdomain systemd[1]: Started libpod-conmon-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4.scope.
Nov 28 10:05:10 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:10 np0005538513.localdomain podman[321127]: 2025-11-28 10:05:10.286056013 +0000 UTC m=+0.049979973 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:10 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd81c9126a3e041d11bddaaea13e7a044ce378612bdfad66b863e26327980a74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:10 np0005538513.localdomain podman[321127]: 2025-11-28 10:05:10.399917875 +0000 UTC m=+0.163841815 container init d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:10 np0005538513.localdomain podman[321127]: 2025-11-28 10:05:10.410511179 +0000 UTC m=+0.174435099 container start d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:05:10 np0005538513.localdomain dnsmasq[321145]: started, version 2.85 cachesize 150
Nov 28 10:05:10 np0005538513.localdomain dnsmasq[321145]: DNS service limited to local subnets
Nov 28 10:05:10 np0005538513.localdomain dnsmasq[321145]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:10 np0005538513.localdomain dnsmasq[321145]: warning: no upstream servers configured
Nov 28 10:05:10 np0005538513.localdomain dnsmasq-dhcp[321145]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:05:10 np0005538513.localdomain dnsmasq[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:10 np0005538513.localdomain dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:10 np0005538513.localdomain dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:10.455 2 INFO neutron.agent.securitygroups_rpc [None req-dee3de84-6e3e-4d2c-b4d4-a44f07424a62 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:10 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:10.538 261084 INFO neutron.agent.dhcp.agent [None req-7cdb70de-6dff-4cbc-b711-8dce3cd0e260 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '56bccdb5-6fe9-44a9-92c9-93e6e7a30192'} is completed
Nov 28 10:05:10 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:10.569 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:09Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd644fa30>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd644f790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd644f160>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd644fbe0>], id=ef68a835-f86a-4989-8b1b-ac699035eeaa, ip_allocation=immediate, mac_address=fa:16:3e:24:c7:20, name=tempest-NetworksTestDHCPv6-452102375, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['4a85d5d0-7333-4fd9-9755-4bb7ab639c94', 'c845e4fa-0400-41ee-a338-1ba8f32c337c'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:08Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1873, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:10Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:05:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:10.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:10.759 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:10 np0005538513.localdomain dnsmasq[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:05:10 np0005538513.localdomain dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:10 np0005538513.localdomain dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:10 np0005538513.localdomain podman[321165]: 2025-11-28 10:05:10.799873986 +0000 UTC m=+0.068321209 container kill d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:05:11 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:11.073 261084 INFO neutron.agent.dhcp.agent [None req-b7d10e26-9fc9-45e3-94dc-7691847df9c0 - - - - - -] DHCP configuration for ports {'ef68a835-f86a-4989-8b1b-ac699035eeaa'} is completed
Nov 28 10:05:11 np0005538513.localdomain ceph-mon[292954]: pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:11.361 2 INFO neutron.agent.securitygroups_rpc [None req-dada4afd-3538-47ee-91fc-3d547e9c2d44 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:11.648 2 INFO neutron.agent.securitygroups_rpc [None req-3c61a2a4-46f5-45d3-8f70-86c422c843ed 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:11.704 2 INFO neutron.agent.securitygroups_rpc [None req-bb3cde64-53ac-43b6-992c-b149fee8302c 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:11.830 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:11 np0005538513.localdomain podman[321202]: 2025-11-28 10:05:11.942113224 +0000 UTC m=+0.052223797 container kill d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:11 np0005538513.localdomain dnsmasq[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:11 np0005538513.localdomain dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:11 np0005538513.localdomain dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:11.965 2 INFO neutron.agent.securitygroups_rpc [None req-2c04ab47-4041-48cd-a577-3ad3edc1e57b a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:12 np0005538513.localdomain dnsmasq[321145]: exiting on receipt of SIGTERM
Nov 28 10:05:12 np0005538513.localdomain podman[321240]: 2025-11-28 10:05:12.990140644 +0000 UTC m=+0.060983409 container kill d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:12 np0005538513.localdomain systemd[1]: libpod-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4.scope: Deactivated successfully.
Nov 28 10:05:13 np0005538513.localdomain podman[321254]: 2025-11-28 10:05:13.07340909 +0000 UTC m=+0.064874670 container died d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:05:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:13 np0005538513.localdomain podman[321254]: 2025-11-28 10:05:13.105415917 +0000 UTC m=+0.096881457 container cleanup d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:13 np0005538513.localdomain systemd[1]: libpod-conmon-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4.scope: Deactivated successfully.
Nov 28 10:05:13 np0005538513.localdomain podman[321255]: 2025-11-28 10:05:13.145207647 +0000 UTC m=+0.131010734 container remove d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:13.285 2 INFO neutron.agent.securitygroups_rpc [None req-865841f9-78fc-4d59-bed2-9c90df3d7ecb a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:13 np0005538513.localdomain ceph-mon[292954]: pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:13 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bd81c9126a3e041d11bddaaea13e7a044ce378612bdfad66b863e26327980a74-merged.mount: Deactivated successfully.
Nov 28 10:05:14 np0005538513.localdomain podman[321333]: 
Nov 28 10:05:14 np0005538513.localdomain podman[321333]: 2025-11-28 10:05:14.028135457 +0000 UTC m=+0.087248962 container create 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:14 np0005538513.localdomain systemd[1]: Started libpod-conmon-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca.scope.
Nov 28 10:05:14 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:14 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3804d8dee1d9232b07d6f91fedd0b9049c42d7f50f3153a52c55a3ee897b276c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:14 np0005538513.localdomain podman[321333]: 2025-11-28 10:05:13.987537563 +0000 UTC m=+0.046651118 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:14 np0005538513.localdomain podman[321333]: 2025-11-28 10:05:14.12491447 +0000 UTC m=+0.184027975 container init 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:05:14 np0005538513.localdomain podman[321333]: 2025-11-28 10:05:14.141249208 +0000 UTC m=+0.200362723 container start 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:05:14 np0005538513.localdomain dnsmasq[321352]: started, version 2.85 cachesize 150
Nov 28 10:05:14 np0005538513.localdomain dnsmasq[321352]: DNS service limited to local subnets
Nov 28 10:05:14 np0005538513.localdomain dnsmasq[321352]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:14 np0005538513.localdomain dnsmasq[321352]: warning: no upstream servers configured
Nov 28 10:05:14 np0005538513.localdomain dnsmasq[321352]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2371469984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2371469984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:14.378 261084 INFO neutron.agent.dhcp.agent [None req-b20126a7-5ad8-4f4a-8aea-f023fd1f289c - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '56bccdb5-6fe9-44a9-92c9-93e6e7a30192'} is completed
Nov 28 10:05:14 np0005538513.localdomain dnsmasq[321352]: exiting on receipt of SIGTERM
Nov 28 10:05:14 np0005538513.localdomain podman[321370]: 2025-11-28 10:05:14.493528552 +0000 UTC m=+0.064468828 container kill 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:14 np0005538513.localdomain systemd[1]: libpod-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca.scope: Deactivated successfully.
Nov 28 10:05:14 np0005538513.localdomain podman[321385]: 2025-11-28 10:05:14.576585012 +0000 UTC m=+0.066761775 container died 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:05:14 np0005538513.localdomain podman[321385]: 2025-11-28 10:05:14.616282829 +0000 UTC m=+0.106459552 container cleanup 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:14 np0005538513.localdomain systemd[1]: libpod-conmon-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca.scope: Deactivated successfully.
Nov 28 10:05:14 np0005538513.localdomain podman[321387]: 2025-11-28 10:05:14.671317546 +0000 UTC m=+0.155602059 container remove 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:14.686 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:14 np0005538513.localdomain kernel: device tap56bccdb5-6f left promiscuous mode
Nov 28 10:05:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:14Z|00332|binding|INFO|Releasing lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 from this chassis (sb_readonly=0)
Nov 28 10:05:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:14Z|00333|binding|INFO|Setting lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 down in Southbound
Nov 28 10:05:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:14.696 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fe3d:5072/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=56bccdb5-6fe9-44a9-92c9-93e6e7a30192) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:14.698 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:05:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:14.702 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:14.703 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5f7241-ecc9-4b0e-a681-f4217f3bff0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:14.705 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:14.707 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:14.919 261084 INFO neutron.agent.dhcp.agent [None req-7495b5dd-61b5-4d0e-b866-d4f43ab0d53d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-3804d8dee1d9232b07d6f91fedd0b9049c42d7f50f3153a52c55a3ee897b276c-merged.mount: Deactivated successfully.
Nov 28 10:05:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:14 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:05:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:15 np0005538513.localdomain ceph-mon[292954]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:15 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:15.474 2 INFO neutron.agent.securitygroups_rpc [None req-95a3885e-d25b-4b35-9841-10ba9cd222bb a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:15.620 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:15.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:16 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:16.732 2 INFO neutron.agent.securitygroups_rpc [None req-8e314c79-26fe-4adc-939f-e587c4e62ad2 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:17 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:17.030 261084 INFO neutron.agent.linux.ip_lib [None req-c8307bfa-67b4-44b4-bd88-df6a413c4b68 - - - - - -] Device tapeb25319d-f0 cannot be used as it has no MAC address
Nov 28 10:05:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:17.099 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538513.localdomain kernel: device tapeb25319d-f0 entered promiscuous mode
Nov 28 10:05:17 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324317.1082] manager: (tapeb25319d-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Nov 28 10:05:17 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:17Z|00334|binding|INFO|Claiming lport eb25319d-f07e-4bef-a7f6-ca024599d184 for this chassis.
Nov 28 10:05:17 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:17Z|00335|binding|INFO|eb25319d-f07e-4bef-a7f6-ca024599d184: Claiming unknown
Nov 28 10:05:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:17.112 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538513.localdomain systemd-udevd[321424]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:17 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:17Z|00336|binding|INFO|Setting lport eb25319d-f07e-4bef-a7f6-ca024599d184 ovn-installed in OVS
Nov 28 10:05:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:17.118 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:17Z|00337|binding|INFO|Setting lport eb25319d-f07e-4bef-a7f6-ca024599d184 up in Southbound
Nov 28 10:05:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:17.122 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:17.121 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:febd:20cb/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=eb25319d-f07e-4bef-a7f6-ca024599d184) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:17.123 158130 INFO neutron.agent.ovn.metadata.agent [-] Port eb25319d-f07e-4bef-a7f6-ca024599d184 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 10:05:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:17.126 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 034720ca-968c-4c23-b6b3-fb448c9725c8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:05:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:17.126 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:17 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:17.127 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd157a1-2d35-4f5c-b80c-7718e668f3ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:17.151 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapeb25319d-f0: No such device
Nov 28 10:05:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:17.195 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:17.231 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538513.localdomain ceph-mon[292954]: pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:18 np0005538513.localdomain podman[321495]: 
Nov 28 10:05:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:05:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:05:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:05:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:05:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:05:18 np0005538513.localdomain podman[321495]: 2025-11-28 10:05:18.103625925 +0000 UTC m=+0.092348617 container create 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:05:18 np0005538513.localdomain podman[321495]: 2025-11-28 10:05:18.056111624 +0000 UTC m=+0.044834316 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:18 np0005538513.localdomain systemd[1]: Started libpod-conmon-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2.scope.
Nov 28 10:05:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:05:18 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:18 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9dd9c406a4caf236febbbc1bd2d17248727709a1c53c4b98aa26be3cd147678/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:18 np0005538513.localdomain podman[321495]: 2025-11-28 10:05:18.214488042 +0000 UTC m=+0.203210714 container init 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:05:18 np0005538513.localdomain podman[321495]: 2025-11-28 10:05:18.223308165 +0000 UTC m=+0.212030837 container start 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:05:18 np0005538513.localdomain dnsmasq[321520]: started, version 2.85 cachesize 150
Nov 28 10:05:18 np0005538513.localdomain dnsmasq[321520]: DNS service limited to local subnets
Nov 28 10:05:18 np0005538513.localdomain dnsmasq[321520]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:18 np0005538513.localdomain dnsmasq[321520]: warning: no upstream servers configured
Nov 28 10:05:18 np0005538513.localdomain dnsmasq-dhcp[321520]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:18 np0005538513.localdomain dnsmasq[321520]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:18 np0005538513.localdomain dnsmasq-dhcp[321520]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:18 np0005538513.localdomain dnsmasq-dhcp[321520]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:18 np0005538513.localdomain podman[321513]: 2025-11-28 10:05:18.297266554 +0000 UTC m=+0.092338816 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Nov 28 10:05:18 np0005538513.localdomain podman[321513]: 2025-11-28 10:05:18.31353001 +0000 UTC m=+0.108602252 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Nov 28 10:05:18 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:05:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:18.334 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:18.411 261084 INFO neutron.agent.dhcp.agent [None req-57221227-0637-48ff-a646-1518f824b861 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 10:05:18 np0005538513.localdomain dnsmasq[321520]: exiting on receipt of SIGTERM
Nov 28 10:05:18 np0005538513.localdomain systemd[1]: libpod-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2.scope: Deactivated successfully.
Nov 28 10:05:18 np0005538513.localdomain podman[321553]: 2025-11-28 10:05:18.700251541 +0000 UTC m=+0.084967965 container kill 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:05:18 np0005538513.localdomain podman[321566]: 2025-11-28 10:05:18.777459294 +0000 UTC m=+0.063492851 container died 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:05:18 np0005538513.localdomain podman[321566]: 2025-11-28 10:05:18.819166139 +0000 UTC m=+0.105199646 container cleanup 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:18 np0005538513.localdomain systemd[1]: libpod-conmon-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2.scope: Deactivated successfully.
Nov 28 10:05:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:18.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:18 np0005538513.localdomain podman[321571]: 2025-11-28 10:05:18.868336417 +0000 UTC m=+0.137093399 container remove 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f9dd9c406a4caf236febbbc1bd2d17248727709a1c53c4b98aa26be3cd147678-merged.mount: Deactivated successfully.
Nov 28 10:05:19 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:19 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:19.181 2 INFO neutron.agent.securitygroups_rpc [None req-8fd6935d-936a-4ee4-abda-f9675ff09d60 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:19 np0005538513.localdomain ceph-mon[292954]: pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:19 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:19.799 2 INFO neutron.agent.securitygroups_rpc [None req-5f6fc136-bb5c-4f44-93cc-8b8331d1b8c7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:20 np0005538513.localdomain podman[321647]: 
Nov 28 10:05:20 np0005538513.localdomain podman[321647]: 2025-11-28 10:05:20.370282213 +0000 UTC m=+0.094963871 container create 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:05:20 np0005538513.localdomain systemd[1]: Started libpod-conmon-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4.scope.
Nov 28 10:05:20 np0005538513.localdomain podman[321647]: 2025-11-28 10:05:20.326260623 +0000 UTC m=+0.050942321 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:20 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:20 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:20.445 2 INFO neutron.agent.securitygroups_rpc [None req-08674235-5e89-42a7-aefb-bb66d7c4db90 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:20 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b03c89dcf0faf8109c5c785fa3ed5032c1fb0ecf301a75e745f5f4a7b8eb2d71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:20 np0005538513.localdomain podman[321647]: 2025-11-28 10:05:20.460057187 +0000 UTC m=+0.184738845 container init 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:05:20 np0005538513.localdomain podman[321647]: 2025-11-28 10:05:20.470379522 +0000 UTC m=+0.195061180 container start 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:05:20 np0005538513.localdomain dnsmasq[321665]: started, version 2.85 cachesize 150
Nov 28 10:05:20 np0005538513.localdomain dnsmasq[321665]: DNS service limited to local subnets
Nov 28 10:05:20 np0005538513.localdomain dnsmasq[321665]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:20 np0005538513.localdomain dnsmasq[321665]: warning: no upstream servers configured
Nov 28 10:05:20 np0005538513.localdomain dnsmasq-dhcp[321665]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:05:20 np0005538513.localdomain dnsmasq-dhcp[321665]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:20 np0005538513.localdomain dnsmasq[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:20 np0005538513.localdomain dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:20 np0005538513.localdomain dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:20.543 261084 INFO neutron.agent.dhcp.agent [None req-2b6e1a79-43d2-4680-90fa-c549dc6edda9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:19Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6407d60>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6407820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64077f0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6407f10>], id=a1a7bb08-d189-4970-b6b2-630de75a7603, ip_allocation=immediate, mac_address=fa:16:3e:6a:01:fa, name=tempest-NetworksTestDHCPv6-636257890, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['58dd9090-d868-4db4-bc1b-d85af9f4c5b9', '5a9525ff-24a1-498c-a69f-7eb8d2a6b3f0'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:18Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], 
standard_attr_id=1911, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:19Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:05:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:20.650 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:20 np0005538513.localdomain dnsmasq[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:05:20 np0005538513.localdomain podman[321684]: 2025-11-28 10:05:20.753758932 +0000 UTC m=+0.052368531 container kill 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:05:20 np0005538513.localdomain dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:20 np0005538513.localdomain dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:20.777 261084 INFO neutron.agent.dhcp.agent [None req-6103797e-56b6-4637-9222-0c100f3c856f - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed
Nov 28 10:05:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:20.807 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:20 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:20.934 2 INFO neutron.agent.securitygroups_rpc [None req-ff8acaea-6e1e-4ede-81e1-a7305639837a 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:21.000 261084 INFO neutron.agent.dhcp.agent [None req-08f71709-211c-40a5-a39d-ca60eb6b9f7a - - - - - -] DHCP configuration for ports {'a1a7bb08-d189-4970-b6b2-630de75a7603'} is completed
Nov 28 10:05:21 np0005538513.localdomain dnsmasq[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:21 np0005538513.localdomain dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:21 np0005538513.localdomain podman[321722]: 2025-11-28 10:05:21.179303155 +0000 UTC m=+0.068164824 container kill 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:21 np0005538513.localdomain dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:21 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:21.291 2 INFO neutron.agent.securitygroups_rpc [None req-f5c2db0c-7ccc-47c0-8119-58c735104780 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:21 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:21.317 2 INFO neutron.agent.securitygroups_rpc [None req-baf56349-7def-4de1-b8e1-b12f76677166 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:21.350 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:21 np0005538513.localdomain ceph-mon[292954]: pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:21 np0005538513.localdomain sudo[321739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:05:21 np0005538513.localdomain sudo[321739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:05:21 np0005538513.localdomain sudo[321739]: pam_unix(sudo:session): session closed for user root
Nov 28 10:05:21 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:21.420 2 INFO neutron.agent.securitygroups_rpc [None req-1ffc7eeb-b258-4bda-bcd4-9d4ef92a7b1f 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:21 np0005538513.localdomain sudo[321760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:05:21 np0005538513.localdomain sudo[321760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:05:21 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:21.532 2 INFO neutron.agent.securitygroups_rpc [None req-1ffc7eeb-b258-4bda-bcd4-9d4ef92a7b1f 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:21 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:21 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:22 np0005538513.localdomain dnsmasq[321665]: exiting on receipt of SIGTERM
Nov 28 10:05:22 np0005538513.localdomain systemd[1]: libpod-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4.scope: Deactivated successfully.
Nov 28 10:05:22 np0005538513.localdomain podman[321809]: 2025-11-28 10:05:22.00971203 +0000 UTC m=+0.075011860 container kill 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:22 np0005538513.localdomain podman[321826]: 2025-11-28 10:05:22.087901981 +0000 UTC m=+0.061391930 container died 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:22 np0005538513.localdomain systemd[1]: tmp-crun.UNUyEQ.mount: Deactivated successfully.
Nov 28 10:05:22 np0005538513.localdomain sudo[321760]: pam_unix(sudo:session): session closed for user root
Nov 28 10:05:22 np0005538513.localdomain podman[321826]: 2025-11-28 10:05:22.220081558 +0000 UTC m=+0.193571527 container cleanup 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:22 np0005538513.localdomain systemd[1]: libpod-conmon-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4.scope: Deactivated successfully.
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:05:22 np0005538513.localdomain podman[321828]: 2025-11-28 10:05:22.244947041 +0000 UTC m=+0.206603911 container remove 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:05:22 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:22.270 2 INFO neutron.agent.securitygroups_rpc [None req-a89ecc9f-78c7-44a1-98c4-c8700080c0c0 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:22.299 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:22 np0005538513.localdomain sudo[321869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:05:22 np0005538513.localdomain sudo[321869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:05:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:05:22 np0005538513.localdomain sudo[321869]: pam_unix(sudo:session): session closed for user root
Nov 28 10:05:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-b03c89dcf0faf8109c5c785fa3ed5032c1fb0ecf301a75e745f5f4a7b8eb2d71-merged.mount: Deactivated successfully.
Nov 28 10:05:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:05:22 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:05:22 np0005538513.localdomain podman[321892]: 2025-11-28 10:05:22.459671133 +0000 UTC m=+0.081600909 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:05:22 np0005538513.localdomain podman[321892]: 2025-11-28 10:05:22.494681657 +0000 UTC m=+0.116611443 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:05:22 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:05:22 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:22.551 2 INFO neutron.agent.securitygroups_rpc [None req-d94caea8-60f5-4f9f-b771-ee8db26744a5 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:22 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:22.949 2 INFO neutron.agent.securitygroups_rpc [None req-e4fadf30-c514-41d2-a5af-51c101b5ad33 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:23 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:23.171 2 INFO neutron.agent.securitygroups_rpc [None req-e4e98f94-fcd2-4011-84de-9dbf12942472 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:23 np0005538513.localdomain podman[321960]: 
Nov 28 10:05:23 np0005538513.localdomain podman[321960]: 2025-11-28 10:05:23.218446826 +0000 UTC m=+0.091077611 container create 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:23 np0005538513.localdomain systemd[1]: Started libpod-conmon-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430.scope.
Nov 28 10:05:23 np0005538513.localdomain podman[321960]: 2025-11-28 10:05:23.176851624 +0000 UTC m=+0.049482509 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:23 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:23 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb9a48c4685e36add903d5c9ab17afdcaa556ea1a03aa31fa67d0b96b3f2a184/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:23 np0005538513.localdomain podman[321960]: 2025-11-28 10:05:23.298065477 +0000 UTC m=+0.170696273 container init 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:05:23 np0005538513.localdomain podman[321960]: 2025-11-28 10:05:23.306761887 +0000 UTC m=+0.179392672 container start 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:23 np0005538513.localdomain dnsmasq[321979]: started, version 2.85 cachesize 150
Nov 28 10:05:23 np0005538513.localdomain dnsmasq[321979]: DNS service limited to local subnets
Nov 28 10:05:23 np0005538513.localdomain dnsmasq[321979]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:23 np0005538513.localdomain dnsmasq[321979]: warning: no upstream servers configured
Nov 28 10:05:23 np0005538513.localdomain dnsmasq-dhcp[321979]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:05:23 np0005538513.localdomain dnsmasq[321979]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:23 np0005538513.localdomain dnsmasq-dhcp[321979]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:23 np0005538513.localdomain dnsmasq-dhcp[321979]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:23 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:23.412 2 INFO neutron.agent.securitygroups_rpc [None req-ab356ba5-c3b5-409b-9ecc-7c688fb94166 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:23 np0005538513.localdomain ceph-mon[292954]: pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:23 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:23.539 261084 INFO neutron.agent.dhcp.agent [None req-32f4d057-3903-4553-9801-7d3c217fd3ad - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed
Nov 28 10:05:23 np0005538513.localdomain dnsmasq[321979]: exiting on receipt of SIGTERM
Nov 28 10:05:23 np0005538513.localdomain podman[321995]: 2025-11-28 10:05:23.661733987 +0000 UTC m=+0.068146903 container kill 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:05:23 np0005538513.localdomain systemd[1]: libpod-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430.scope: Deactivated successfully.
Nov 28 10:05:23 np0005538513.localdomain podman[322007]: 2025-11-28 10:05:23.737434796 +0000 UTC m=+0.058942869 container died 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:05:23 np0005538513.localdomain podman[322007]: 2025-11-28 10:05:23.77105573 +0000 UTC m=+0.092563773 container cleanup 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:05:23 np0005538513.localdomain systemd[1]: libpod-conmon-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430.scope: Deactivated successfully.
Nov 28 10:05:23 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:23.800 2 INFO neutron.agent.securitygroups_rpc [None req-237dd594-9b00-4d69-9190-5464bb1ce820 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:23 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:23.816 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:23 np0005538513.localdomain podman[322009]: 2025-11-28 10:05:23.822791593 +0000 UTC m=+0.136710569 container remove 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:05:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:24.163 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bb9a48c4685e36add903d5c9ab17afdcaa556ea1a03aa31fa67d0b96b3f2a184-merged.mount: Deactivated successfully.
Nov 28 10:05:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:24.436 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8:0:1:f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:24.440 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:05:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:24.443 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 034720ca-968c-4c23-b6b3-fb448c9725c8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:05:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:24.443 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:24.445 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bf7ce151-258d-4e89-8d48-62be3ed95d3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:25 np0005538513.localdomain podman[322089]: 
Nov 28 10:05:25 np0005538513.localdomain podman[322089]: 2025-11-28 10:05:25.338539085 +0000 UTC m=+0.089140106 container create 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:05:25 np0005538513.localdomain systemd[1]: Started libpod-conmon-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2.scope.
Nov 28 10:05:25 np0005538513.localdomain systemd[1]: tmp-crun.1aL7Bk.mount: Deactivated successfully.
Nov 28 10:05:25 np0005538513.localdomain podman[322089]: 2025-11-28 10:05:25.3010042 +0000 UTC m=+0.051605231 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:25 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:25 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d5138906d6fc1bdee0a9b155d7b868d3e377bc1927eaf0edf6a9cc67c712c01/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:25 np0005538513.localdomain podman[322089]: 2025-11-28 10:05:25.438321244 +0000 UTC m=+0.188922235 container init 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:25 np0005538513.localdomain podman[322089]: 2025-11-28 10:05:25.447923069 +0000 UTC m=+0.198524060 container start 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:25 np0005538513.localdomain dnsmasq[322107]: started, version 2.85 cachesize 150
Nov 28 10:05:25 np0005538513.localdomain dnsmasq[322107]: DNS service limited to local subnets
Nov 28 10:05:25 np0005538513.localdomain dnsmasq[322107]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:25 np0005538513.localdomain dnsmasq[322107]: warning: no upstream servers configured
Nov 28 10:05:25 np0005538513.localdomain dnsmasq-dhcp[322107]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:25 np0005538513.localdomain dnsmasq[322107]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:25 np0005538513.localdomain dnsmasq-dhcp[322107]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:25 np0005538513.localdomain dnsmasq-dhcp[322107]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:25 np0005538513.localdomain ceph-mon[292954]: pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:25 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:25.474 2 INFO neutron.agent.securitygroups_rpc [None req-1ee68db6-3f9b-4e96-8e85-23a114c9c60e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:25.653 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:25.713 261084 INFO neutron.agent.dhcp.agent [None req-0e9b5ee0-b490-441e-ab59-983aa56ee460 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed
Nov 28 10:05:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:05:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:05:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:25.850 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:25 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:05:25 np0005538513.localdomain dnsmasq[322107]: exiting on receipt of SIGTERM
Nov 28 10:05:25 np0005538513.localdomain podman[322130]: 2025-11-28 10:05:25.897420678 +0000 UTC m=+0.108605722 container kill 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:05:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:05:25 np0005538513.localdomain systemd[1]: libpod-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2.scope: Deactivated successfully.
Nov 28 10:05:25 np0005538513.localdomain podman[322123]: 2025-11-28 10:05:25.902188975 +0000 UTC m=+0.135642797 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 10:05:25 np0005538513.localdomain podman[322123]: 2025-11-28 10:05:25.908725673 +0000 UTC m=+0.142179455 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 10:05:25 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:05:26 np0005538513.localdomain podman[322150]: 2025-11-28 10:05:26.032622313 +0000 UTC m=+0.121381170 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:05:26 np0005538513.localdomain podman[322152]: 2025-11-28 10:05:26.066907225 +0000 UTC m=+0.092126893 container died 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:26 np0005538513.localdomain podman[322152]: 2025-11-28 10:05:26.091380196 +0000 UTC m=+0.116599834 container cleanup 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:26 np0005538513.localdomain systemd[1]: libpod-conmon-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2.scope: Deactivated successfully.
Nov 28 10:05:26 np0005538513.localdomain podman[322150]: 2025-11-28 10:05:26.11490806 +0000 UTC m=+0.203666937 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 10:05:26 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:05:26 np0005538513.localdomain podman[322159]: 2025-11-28 10:05:26.197809566 +0000 UTC m=+0.214322454 container remove 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-8d5138906d6fc1bdee0a9b155d7b868d3e377bc1927eaf0edf6a9cc67c712c01-merged.mount: Deactivated successfully.
Nov 28 10:05:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:26 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:26 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:26.901 2 INFO neutron.agent.securitygroups_rpc [None req-b7b686b0-df55-421e-8492-2a2961409c1a 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:27 np0005538513.localdomain podman[322254]: 
Nov 28 10:05:27 np0005538513.localdomain podman[322254]: 2025-11-28 10:05:27.148426825 +0000 UTC m=+0.092809581 container create 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:05:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.190 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:27 np0005538513.localdomain podman[322254]: 2025-11-28 10:05:27.104256399 +0000 UTC m=+0.048639185 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:27 np0005538513.localdomain systemd[1]: Started libpod-conmon-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e.scope.
Nov 28 10:05:27 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:27 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e3f66133a2e22df0c7d7cab94349954b58c472860335b1bc1ff6e5aba2628c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:27 np0005538513.localdomain podman[322254]: 2025-11-28 10:05:27.237706683 +0000 UTC m=+0.182089449 container init 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:05:27 np0005538513.localdomain podman[322254]: 2025-11-28 10:05:27.247634537 +0000 UTC m=+0.192017293 container start 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:27 np0005538513.localdomain dnsmasq[322273]: started, version 2.85 cachesize 150
Nov 28 10:05:27 np0005538513.localdomain dnsmasq[322273]: DNS service limited to local subnets
Nov 28 10:05:27 np0005538513.localdomain dnsmasq[322273]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:27 np0005538513.localdomain dnsmasq[322273]: warning: no upstream servers configured
Nov 28 10:05:27 np0005538513.localdomain dnsmasq-dhcp[322273]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:27 np0005538513.localdomain dnsmasq[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:05:27 np0005538513.localdomain dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:27 np0005538513.localdomain dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.310 261084 INFO neutron.agent.dhcp.agent [None req-27a81b29-092f-412d-b9e4-496af2536f6c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:24Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64e9c70>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64e9e80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6426c10>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64261c0>], id=073bafeb-50ac-4313-8f33-745fd94b95e2, ip_allocation=immediate, mac_address=fa:16:3e:60:05:4c, name=tempest-NetworksTestDHCPv6-71055213, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['0d5a9a0b-5034-49a5-ab15-87280961b993', '17bf8ba1-a681-4323-80ea-a68195dff493'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:23Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], 
standard_attr_id=1934, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:25Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 10:05:27 np0005538513.localdomain dnsmasq[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses
Nov 28 10:05:27 np0005538513.localdomain dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:27 np0005538513.localdomain dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:27 np0005538513.localdomain podman[322291]: 2025-11-28 10:05:27.530295126 +0000 UTC m=+0.071034166 container kill 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:05:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.607 261084 INFO neutron.agent.dhcp.agent [None req-a198de0c-d6b6-4922-9dfc-1a9ad0102467 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184', '073bafeb-50ac-4313-8f33-745fd94b95e2'} is completed
Nov 28 10:05:27 np0005538513.localdomain podman[322306]: 2025-11-28 10:05:27.659809338 +0000 UTC m=+0.100320696 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:27 np0005538513.localdomain podman[322306]: 2025-11-28 10:05:27.67731263 +0000 UTC m=+0.117823998 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:05:27 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:05:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.821 261084 INFO neutron.agent.dhcp.agent [None req-84d785b6-9e1d-42bc-a4aa-eb6338dd664b - - - - - -] DHCP configuration for ports {'073bafeb-50ac-4313-8f33-745fd94b95e2'} is completed
Nov 28 10:05:28 np0005538513.localdomain dnsmasq[322273]: exiting on receipt of SIGTERM
Nov 28 10:05:28 np0005538513.localdomain podman[322347]: 2025-11-28 10:05:28.035344329 +0000 UTC m=+0.061217786 container kill 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:05:28 np0005538513.localdomain systemd[1]: libpod-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e.scope: Deactivated successfully.
Nov 28 10:05:28 np0005538513.localdomain podman[322359]: 2025-11-28 10:05:28.118310555 +0000 UTC m=+0.071450647 container died 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:05:28 np0005538513.localdomain podman[322359]: 2025-11-28 10:05:28.152956139 +0000 UTC m=+0.106096191 container cleanup 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:05:28 np0005538513.localdomain systemd[1]: libpod-conmon-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e.scope: Deactivated successfully.
Nov 28 10:05:28 np0005538513.localdomain podman[322366]: 2025-11-28 10:05:28.196174287 +0000 UTC m=+0.133118225 container remove 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:28 np0005538513.localdomain systemd[1]: tmp-crun.TBBEQ8.mount: Deactivated successfully.
Nov 28 10:05:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-46e3f66133a2e22df0c7d7cab94349954b58c472860335b1bc1ff6e5aba2628c-merged.mount: Deactivated successfully.
Nov 28 10:05:28 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:28.615 261084 INFO neutron.agent.dhcp.agent [None req-f04b2642-6f28-4e46-8ac4-5c450b0aadb3 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed
Nov 28 10:05:28 np0005538513.localdomain ceph-mon[292954]: pgmap v283: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Nov 28 10:05:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:30 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:30.339 261084 INFO neutron.agent.linux.ip_lib [None req-b986e48b-5bb6-415c-96f7-ed26d4af6dae - - - - - -] Device tap59628025-67 cannot be used as it has no MAC address
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.409 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain kernel: device tap59628025-67 entered promiscuous mode
Nov 28 10:05:30 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324330.4181] manager: (tap59628025-67): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.418 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:30Z|00338|binding|INFO|Claiming lport 59628025-6745-4930-8b9e-1db836e05f1d for this chassis.
Nov 28 10:05:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:30Z|00339|binding|INFO|59628025-6745-4930-8b9e-1db836e05f1d: Claiming unknown
Nov 28 10:05:30 np0005538513.localdomain systemd-udevd[322412]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.431 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '45fdbe27569f45449de58f1d1899ceea', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4f36a67-65f2-4a54-bf93-a2e1db5ebe54, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=59628025-6745-4930-8b9e-1db836e05f1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.434 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 59628025-6745-4930-8b9e-1db836e05f1d in datapath 3008d273-ce4e-482f-9951-930717f7a6f1 bound to our chassis
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.435 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3008d273-ce4e-482f-9951-930717f7a6f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.438 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4939faf7-0af9-403c-a3f3-a15c514ad0db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:30Z|00340|binding|INFO|Setting lport 59628025-6745-4930-8b9e-1db836e05f1d ovn-installed in OVS
Nov 28 10:05:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:30Z|00341|binding|INFO|Setting lport 59628025-6745-4930-8b9e-1db836e05f1d up in Southbound
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.462 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap59628025-67: No such device
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:30.543 2 INFO neutron.agent.securitygroups_rpc [None req-3bea7d0a-d883-4475-b5cf-e7689fbb3c41 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.562 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.655 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.851 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:30Z|00342|binding|INFO|Removing iface tap59628025-67 ovn-installed in OVS
Nov 28 10:05:30 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:30Z|00343|binding|INFO|Removing lport 59628025-6745-4930-8b9e-1db836e05f1d ovn-installed in OVS
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.926 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c4634f90-1cfe-436f-9c5d-e163574d13dd with type ""
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.928 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '45fdbe27569f45449de58f1d1899ceea', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4f36a67-65f2-4a54-bf93-a2e1db5ebe54, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=59628025-6745-4930-8b9e-1db836e05f1d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.932 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 59628025-6745-4930-8b9e-1db836e05f1d in datapath 3008d273-ce4e-482f-9951-930717f7a6f1 unbound from our chassis
Nov 28 10:05:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:30.935 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.935 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3008d273-ce4e-482f-9951-930717f7a6f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:30.937 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3a5c6f-18b8-479b-966e-cafffefdae87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:31 np0005538513.localdomain ceph-mon[292954]: pgmap v284: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Nov 28 10:05:31 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:31.191 2 INFO neutron.agent.securitygroups_rpc [None req-de2b1d93-259b-466a-9f3c-b35e66480eb7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:31 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:31Z|00344|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:05:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:31.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:31 np0005538513.localdomain podman[322499]: 
Nov 28 10:05:31 np0005538513.localdomain podman[322499]: 2025-11-28 10:05:31.302061842 +0000 UTC m=+0.106869303 container create d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:05:31 np0005538513.localdomain systemd[1]: Started libpod-conmon-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476.scope.
Nov 28 10:05:31 np0005538513.localdomain systemd[1]: tmp-crun.sCCq4W.mount: Deactivated successfully.
Nov 28 10:05:31 np0005538513.localdomain podman[322499]: 2025-11-28 10:05:31.262829968 +0000 UTC m=+0.067637479 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:31 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/954bd3ba3f117d5d300db4abb70c466bc9dc08030da6ad9d8da5e12e09bf209f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:31 np0005538513.localdomain podman[322499]: 2025-11-28 10:05:31.397909338 +0000 UTC m=+0.202716829 container init d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:05:31 np0005538513.localdomain podman[322499]: 2025-11-28 10:05:31.407294677 +0000 UTC m=+0.212102178 container start d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322532]: started, version 2.85 cachesize 150
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322532]: DNS service limited to local subnets
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322532]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322532]: warning: no upstream servers configured
Nov 28 10:05:31 np0005538513.localdomain dnsmasq-dhcp[322532]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322532]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:31 np0005538513.localdomain dnsmasq-dhcp[322532]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:31 np0005538513.localdomain dnsmasq-dhcp[322532]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322532]: exiting on receipt of SIGTERM
Nov 28 10:05:31 np0005538513.localdomain systemd[1]: libpod-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476.scope: Deactivated successfully.
Nov 28 10:05:31 np0005538513.localdomain podman[322539]: 2025-11-28 10:05:31.520558553 +0000 UTC m=+0.077836452 container died d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:05:31 np0005538513.localdomain podman[322539]: 2025-11-28 10:05:31.552634411 +0000 UTC m=+0.109912310 container cleanup d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:05:31 np0005538513.localdomain podman[322551]: 2025-11-28 10:05:31.590718033 +0000 UTC m=+0.064814928 container cleanup d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:05:31 np0005538513.localdomain systemd[1]: libpod-conmon-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476.scope: Deactivated successfully.
Nov 28 10:05:31 np0005538513.localdomain podman[322564]: 2025-11-28 10:05:31.651120453 +0000 UTC m=+0.081172597 container remove d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:05:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:31.699 261084 INFO neutron.agent.dhcp.agent [None req-9210b8f7-0116-4762-b89d-166ebab3df0d - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed
Nov 28 10:05:31 np0005538513.localdomain podman[322584]: 
Nov 28 10:05:31 np0005538513.localdomain podman[322584]: 2025-11-28 10:05:31.757282465 +0000 UTC m=+0.082337220 container create ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:31 np0005538513.localdomain systemd[1]: Started libpod-conmon-ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb.scope.
Nov 28 10:05:31 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:31 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5913fdca5af5a7fae83727499d970b246170438bd3f265b9b7404fabb91d105a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:31 np0005538513.localdomain podman[322584]: 2025-11-28 10:05:31.716004643 +0000 UTC m=+0.041059428 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:31 np0005538513.localdomain podman[322584]: 2025-11-28 10:05:31.819928651 +0000 UTC m=+0.144983386 container init ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:31 np0005538513.localdomain podman[322584]: 2025-11-28 10:05:31.829300899 +0000 UTC m=+0.154355634 container start ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322610]: started, version 2.85 cachesize 150
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322610]: DNS service limited to local subnets
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322610]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322610]: warning: no upstream servers configured
Nov 28 10:05:31 np0005538513.localdomain dnsmasq-dhcp[322610]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:05:31 np0005538513.localdomain dnsmasq[322610]: read /var/lib/neutron/dhcp/3008d273-ce4e-482f-9951-930717f7a6f1/addn_hosts - 0 addresses
Nov 28 10:05:31 np0005538513.localdomain dnsmasq-dhcp[322610]: read /var/lib/neutron/dhcp/3008d273-ce4e-482f-9951-930717f7a6f1/host
Nov 28 10:05:31 np0005538513.localdomain dnsmasq-dhcp[322610]: read /var/lib/neutron/dhcp/3008d273-ce4e-482f-9951-930717f7a6f1/opts
Nov 28 10:05:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:31.935 261084 INFO neutron.agent.dhcp.agent [None req-eaaef033-ae68-45f1-8bed-59bae5543439 - - - - - -] DHCP configuration for ports {'576302a1-e256-4653-ae72-a049f1fcfc76'} is completed
Nov 28 10:05:32 np0005538513.localdomain dnsmasq[322610]: exiting on receipt of SIGTERM
Nov 28 10:05:32 np0005538513.localdomain systemd[1]: libpod-ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb.scope: Deactivated successfully.
Nov 28 10:05:32 np0005538513.localdomain podman[322637]: 2025-11-28 10:05:32.073506216 +0000 UTC m=+0.065307722 container kill ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:05:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:32 np0005538513.localdomain podman[322652]: 2025-11-28 10:05:32.153124998 +0000 UTC m=+0.061434331 container died ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:05:32 np0005538513.localdomain podman[322652]: 2025-11-28 10:05:32.189431739 +0000 UTC m=+0.097741032 container cleanup ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:05:32 np0005538513.localdomain systemd[1]: libpod-conmon-ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb.scope: Deactivated successfully.
Nov 28 10:05:32 np0005538513.localdomain podman[322655]: 2025-11-28 10:05:32.246848223 +0000 UTC m=+0.139753845 container remove ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:05:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:32.261 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:32 np0005538513.localdomain kernel: device tap59628025-67 left promiscuous mode
Nov 28 10:05:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:32.273 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:32.304 261084 INFO neutron.agent.dhcp.agent [None req-1e0f9bb0-f757-475c-a870-1380354c0e5b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:32.305 261084 INFO neutron.agent.dhcp.agent [None req-1e0f9bb0-f757-475c-a870-1380354c0e5b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-954bd3ba3f117d5d300db4abb70c466bc9dc08030da6ad9d8da5e12e09bf209f-merged.mount: Deactivated successfully.
Nov 28 10:05:32 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:32 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d3008d273\x2dce4e\x2d482f\x2d9951\x2d930717f7a6f1.mount: Deactivated successfully.
Nov 28 10:05:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:32.453 261084 INFO neutron.agent.linux.ip_lib [None req-d9378e2c-0c7c-4a60-9b25-298141c4d17a - - - - - -] Device tapbbebc9e7-db cannot be used as it has no MAC address
Nov 28 10:05:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:32.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:32 np0005538513.localdomain kernel: device tapbbebc9e7-db entered promiscuous mode
Nov 28 10:05:32 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324332.4940] manager: (tapbbebc9e7-db): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Nov 28 10:05:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:32.493 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:32Z|00345|binding|INFO|Claiming lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 for this chassis.
Nov 28 10:05:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:32Z|00346|binding|INFO|bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9: Claiming unknown
Nov 28 10:05:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:32.506 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcca6890-1675-46ad-9260-7f267479c535, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:32.511 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 in datapath fa28040d-639a-454c-9515-60af86f8624b bound to our chassis
Nov 28 10:05:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:32.512 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fa28040d-639a-454c-9515-60af86f8624b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:32.514 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fe50ceb5-8c08-4188-9a0b-a5346437d98b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:32Z|00347|binding|INFO|Setting lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 ovn-installed in OVS
Nov 28 10:05:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:32Z|00348|binding|INFO|Setting lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 up in Southbound
Nov 28 10:05:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:32.532 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:32.583 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:32.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:32 np0005538513.localdomain podman[322724]: 
Nov 28 10:05:32 np0005538513.localdomain podman[322724]: 2025-11-28 10:05:32.703427776 +0000 UTC m=+0.104120674 container create f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:05:32 np0005538513.localdomain podman[322724]: 2025-11-28 10:05:32.655252756 +0000 UTC m=+0.055945694 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:32 np0005538513.localdomain systemd[1]: Started libpod-conmon-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8.scope.
Nov 28 10:05:32 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:32 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34f6008d0476cd6ca17b69b0fcbfd3916be3191c61a21acea58861d8aa11abd9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:32 np0005538513.localdomain podman[322724]: 2025-11-28 10:05:32.793871457 +0000 UTC m=+0.194564345 container init f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:05:32 np0005538513.localdomain podman[322724]: 2025-11-28 10:05:32.801768394 +0000 UTC m=+0.202461282 container start f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:32 np0005538513.localdomain dnsmasq[322750]: started, version 2.85 cachesize 150
Nov 28 10:05:32 np0005538513.localdomain dnsmasq[322750]: DNS service limited to local subnets
Nov 28 10:05:32 np0005538513.localdomain dnsmasq[322750]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:32 np0005538513.localdomain dnsmasq[322750]: warning: no upstream servers configured
Nov 28 10:05:32 np0005538513.localdomain dnsmasq-dhcp[322750]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:05:32 np0005538513.localdomain dnsmasq-dhcp[322750]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:32 np0005538513.localdomain dnsmasq[322750]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 10:05:32 np0005538513.localdomain dnsmasq-dhcp[322750]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 10:05:32 np0005538513.localdomain dnsmasq-dhcp[322750]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.062 261084 INFO neutron.agent.dhcp.agent [None req-dce4de1e-8c7e-4fa3-8c8b-9cbb8cb1c36c - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.0 KiB/s wr, 59 op/s
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538513.localdomain dnsmasq[322750]: exiting on receipt of SIGTERM
Nov 28 10:05:33 np0005538513.localdomain podman[322782]: 2025-11-28 10:05:33.198572124 +0000 UTC m=+0.072002834 container kill f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:33 np0005538513.localdomain systemd[1]: libpod-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8.scope: Deactivated successfully.
Nov 28 10:05:33 np0005538513.localdomain podman[322797]: 2025-11-28 10:05:33.274366716 +0000 UTC m=+0.060067953 container died f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-34f6008d0476cd6ca17b69b0fcbfd3916be3191c61a21acea58861d8aa11abd9-merged.mount: Deactivated successfully.
Nov 28 10:05:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:33 np0005538513.localdomain podman[322797]: 2025-11-28 10:05:33.316798782 +0000 UTC m=+0.102499969 container cleanup f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:05:33 np0005538513.localdomain systemd[1]: libpod-conmon-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8.scope: Deactivated successfully.
Nov 28 10:05:33 np0005538513.localdomain podman[322799]: 2025-11-28 10:05:33.366834025 +0000 UTC m=+0.144287325 container remove f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:33.420 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:33 np0005538513.localdomain kernel: device tapeb25319d-f0 left promiscuous mode
Nov 28 10:05:33 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:33Z|00349|binding|INFO|Releasing lport eb25319d-f07e-4bef-a7f6-ca024599d184 from this chassis (sb_readonly=0)
Nov 28 10:05:33 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:33Z|00350|binding|INFO|Setting lport eb25319d-f07e-4bef-a7f6-ca024599d184 down in Southbound
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.431 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:febd:20cb/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '12', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=eb25319d-f07e-4bef-a7f6-ca024599d184) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.433 158130 INFO neutron.agent.ovn.metadata.agent [-] Port eb25319d-f07e-4bef-a7f6-ca024599d184 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 10:05:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:33.440 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.440 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.444 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5813df-241a-4625-ac4f-219bdefddc42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:33 np0005538513.localdomain podman[322862]: 
Nov 28 10:05:33 np0005538513.localdomain podman[322862]: 2025-11-28 10:05:33.653146939 +0000 UTC m=+0.129008517 container create d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:05:33 np0005538513.localdomain dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 0 addresses
Nov 28 10:05:33 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host
Nov 28 10:05:33 np0005538513.localdomain dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts
Nov 28 10:05:33 np0005538513.localdomain podman[322877]: 2025-11-28 10:05:33.653758677 +0000 UTC m=+0.071958543 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.656 261084 INFO neutron.agent.dhcp.agent [None req-b7f762bf-1f61-4981-b1c0-944291fec871 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.656 261084 INFO neutron.agent.dhcp.agent [None req-b7f762bf-1f61-4981-b1c0-944291fec871 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:33 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 10:05:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.657 261084 INFO neutron.agent.dhcp.agent [None req-b7f762bf-1f61-4981-b1c0-944291fec871 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:33 np0005538513.localdomain systemd[1]: Started libpod-conmon-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683.scope.
Nov 28 10:05:33 np0005538513.localdomain podman[322862]: 2025-11-28 10:05:33.610375764 +0000 UTC m=+0.086237382 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:33 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:33 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14c3eefe9cc85a0964db7e342e45ff322ab308fd13a7f19aa1847f356aa5bafb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:33 np0005538513.localdomain podman[322862]: 2025-11-28 10:05:33.73764333 +0000 UTC m=+0.213504888 container init d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:33 np0005538513.localdomain podman[322862]: 2025-11-28 10:05:33.7561256 +0000 UTC m=+0.231987158 container start d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:05:33 np0005538513.localdomain dnsmasq[322901]: started, version 2.85 cachesize 150
Nov 28 10:05:33 np0005538513.localdomain dnsmasq[322901]: DNS service limited to local subnets
Nov 28 10:05:33 np0005538513.localdomain dnsmasq[322901]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:33 np0005538513.localdomain dnsmasq[322901]: warning: no upstream servers configured
Nov 28 10:05:33 np0005538513.localdomain dnsmasq-dhcp[322901]: DHCP, static leases only on 10.100.255.240, lease time 1d
Nov 28 10:05:33 np0005538513.localdomain dnsmasq[322901]: read /var/lib/neutron/dhcp/fa28040d-639a-454c-9515-60af86f8624b/addn_hosts - 0 addresses
Nov 28 10:05:33 np0005538513.localdomain dnsmasq-dhcp[322901]: read /var/lib/neutron/dhcp/fa28040d-639a-454c-9515-60af86f8624b/host
Nov 28 10:05:33 np0005538513.localdomain dnsmasq-dhcp[322901]: read /var/lib/neutron/dhcp/fa28040d-639a-454c-9515-60af86f8624b/opts
Nov 28 10:05:33 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:33.769 2 INFO neutron.agent.securitygroups_rpc [None req-04197ece-4fb3-43df-90bb-b0e309825a8e cb58c56533984a129050414c9b160b63 de1aeac8abd545fcb83eb3ee06f16689 - - default default] Security group member updated ['805fa77a-da24-42d8-9154-db9402b01c3e']
Nov 28 10:05:33 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:33Z|00351|binding|INFO|Releasing lport 4929710e-eb4c-4144-9bca-64efc297e299 from this chassis (sb_readonly=0)
Nov 28 10:05:33 np0005538513.localdomain kernel: device tap4929710e-eb left promiscuous mode
Nov 28 10:05:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:33.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:33 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:33Z|00352|binding|INFO|Setting lport 4929710e-eb4c-4144-9bca-64efc297e299 down in Southbound
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.864 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aa5be61eafca4d96976422f0e0103210', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40693dd3-cde5-4c50-9ed5-4dc8ef3313af, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=4929710e-eb4c-4144-9bca-64efc297e299) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.866 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 4929710e-eb4c-4144-9bca-64efc297e299 in datapath 553c7f35-d914-4af1-9846-a8cbe21f53f3 unbound from our chassis
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.869 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 553c7f35-d914-4af1-9846-a8cbe21f53f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:33 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:33.870 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec09597-b513-430f-af27-17ae94054241]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:33.874 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:33 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.918 261084 INFO neutron.agent.dhcp.agent [None req-2815f33c-0284-42ca-bce0-2081b7192bc7 - - - - - -] DHCP configuration for ports {'e2bd862e-905b-4769-9404-fb8c9861c0f4'} is completed
Nov 28 10:05:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:35 np0005538513.localdomain ceph-mon[292954]: pgmap v286: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 2.4 KiB/s wr, 45 op/s
Nov 28 10:05:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:35.659 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:35.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:05:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:35.787 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:05:35 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:35.877 2 INFO neutron.agent.securitygroups_rpc [None req-9c55fb28-c01b-40d6-8fec-099f9b722777 cb58c56533984a129050414c9b160b63 de1aeac8abd545fcb83eb3ee06f16689 - - default default] Security group member updated ['805fa77a-da24-42d8-9154-db9402b01c3e']
Nov 28 10:05:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:35.894 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:35.907 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:36Z|00353|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:05:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 do_prune osdmap full prune enabled
Nov 28 10:05:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 e152: 6 total, 6 up, 6 in
Nov 28 10:05:36 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e152: 6 total, 6 up, 6 in
Nov 28 10:05:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:36.133 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:37Z|00354|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:05:37 np0005538513.localdomain ceph-mon[292954]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 2.4 KiB/s wr, 45 op/s
Nov 28 10:05:37 np0005538513.localdomain ceph-mon[292954]: osdmap e152: 6 total, 6 up, 6 in
Nov 28 10:05:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:37.134 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538513.localdomain podman[322924]: 2025-11-28 10:05:37.548094335 +0000 UTC m=+0.061486113 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:05:37 np0005538513.localdomain dnsmasq[319577]: exiting on receipt of SIGTERM
Nov 28 10:05:37 np0005538513.localdomain systemd[1]: libpod-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd.scope: Deactivated successfully.
Nov 28 10:05:37 np0005538513.localdomain podman[322938]: 2025-11-28 10:05:37.627202732 +0000 UTC m=+0.060925777 container died 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:37 np0005538513.localdomain podman[322938]: 2025-11-28 10:05:37.665647383 +0000 UTC m=+0.099370388 container cleanup 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:05:37 np0005538513.localdomain systemd[1]: libpod-conmon-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd.scope: Deactivated successfully.
Nov 28 10:05:37 np0005538513.localdomain podman[322939]: 2025-11-28 10:05:37.701269284 +0000 UTC m=+0.128465752 container remove 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:05:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:37.810 261084 INFO neutron.agent.dhcp.agent [None req-4bb69635-28b9-4340-89b7-b560ae1ec3aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:37.810 261084 INFO neutron.agent.dhcp.agent [None req-4bb69635-28b9-4340-89b7-b560ae1ec3aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4bcc12744151ca7e94791368e387c8ab451571c56040070ed3ec7c54499fb7f3-merged.mount: Deactivated successfully.
Nov 28 10:05:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:38 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d553c7f35\x2dd914\x2d4af1\x2d9846\x2da8cbe21f53f3.mount: Deactivated successfully.
Nov 28 10:05:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:05:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:05:38 np0005538513.localdomain podman[322967]: 2025-11-28 10:05:38.671258078 +0000 UTC m=+0.097308249 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:05:38 np0005538513.localdomain podman[322967]: 2025-11-28 10:05:38.68145276 +0000 UTC m=+0.107502911 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:05:38 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:05:38 np0005538513.localdomain podman[322985]: 2025-11-28 10:05:38.777931054 +0000 UTC m=+0.095161617 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:38 np0005538513.localdomain podman[322985]: 2025-11-28 10:05:38.791359829 +0000 UTC m=+0.108590382 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:05:38 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:05:39 np0005538513.localdomain ceph-mon[292954]: pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.5 KiB/s wr, 76 op/s
Nov 28 10:05:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/483552978' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/483552978' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:05:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:05:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:05:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157516 "" "Go-http-client/1.1"
Nov 28 10:05:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:05:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19745 "" "Go-http-client/1.1"
Nov 28 10:05:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3980002561' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3980002561' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:40.678 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:40.787 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:40.898 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:41 np0005538513.localdomain ceph-mon[292954]: pgmap v290: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.5 KiB/s wr, 76 op/s
Nov 28 10:05:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:41Z|00355|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:05:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:41.250 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:41.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:42 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:42.447 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:42.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:05:43 np0005538513.localdomain ceph-mon[292954]: pgmap v291: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:43 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:43.755 2 INFO neutron.agent.securitygroups_rpc [None req-1640c1ae-d5d6-4903-a020-69a5c84dc198 e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.788 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.789 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:05:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:44.790 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:05:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:45 np0005538513.localdomain ceph-mon[292954]: pgmap v292: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:05:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1626658574' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.240 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.316 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.316 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:05:45 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:45.529 2 INFO neutron.agent.securitygroups_rpc [None req-6e211784-e61c-41e0-bb4d-96a8d1625a59 e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.552 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.554 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11156MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.554 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.555 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:05:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:45.692 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:45.694 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.720 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.807 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.808 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.809 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.874 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.899 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.956 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.956 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:05:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:45.970 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:05:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:46.060 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:05:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:46.100 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:05:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1626658574' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:05:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2969744802' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:46.559 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:05:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:46.567 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:05:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:46.586 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:05:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:46.589 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:05:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:46.589 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:05:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:46.678 2 INFO neutron.agent.securitygroups_rpc [None req-fda551bc-bdc3-478a-93b4-a620175c516e 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:47 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:47.118 2 INFO neutron.agent.securitygroups_rpc [None req-e42eed40-5a16-457a-8c1b-352e3dbcff3e e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:47 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:47.157 2 INFO neutron.agent.securitygroups_rpc [None req-fda551bc-bdc3-478a-93b4-a620175c516e 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:47.194 261084 INFO neutron.agent.linux.ip_lib [None req-9d118a07-77e2-46d6-a384-cc654cdf0e1b - - - - - -] Device tap87ef7272-14 cannot be used as it has no MAC address
Nov 28 10:05:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 do_prune osdmap full prune enabled
Nov 28 10:05:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e153 e153: 6 total, 6 up, 6 in
Nov 28 10:05:47 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e153: 6 total, 6 up, 6 in
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.257 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:47 np0005538513.localdomain ceph-mon[292954]: pgmap v293: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2969744802' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:47 np0005538513.localdomain kernel: device tap87ef7272-14 entered promiscuous mode
Nov 28 10:05:47 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324347.2700] manager: (tap87ef7272-14): new Generic device (/org/freedesktop/NetworkManager/Devices/59)
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.274 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:47 np0005538513.localdomain systemd-udevd[323063]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:47Z|00356|binding|INFO|Claiming lport 87ef7272-14f7-4162-a8a9-b13090f8924f for this chassis.
Nov 28 10:05:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:47Z|00357|binding|INFO|87ef7272-14f7-4162-a8a9-b13090f8924f: Claiming unknown
Nov 28 10:05:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:47.284 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a58c096-9217-4d0b-a64c-715683dae905, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=87ef7272-14f7-4162-a8a9-b13090f8924f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:47.286 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 87ef7272-14f7-4162-a8a9-b13090f8924f in datapath 3f532ea4-a0de-4113-8993-33f982144ec8 bound to our chassis
Nov 28 10:05:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:47.287 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f532ea4-a0de-4113-8993-33f982144ec8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:47.288 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[dfefbd10-37c5-42e7-860d-84b0f137b2c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:47Z|00358|binding|INFO|Setting lport 87ef7272-14f7-4162-a8a9-b13090f8924f ovn-installed in OVS
Nov 28 10:05:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:47Z|00359|binding|INFO|Setting lport 87ef7272-14f7-4162-a8a9-b13090f8924f up in Southbound
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.320 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap87ef7272-14: No such device
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.354 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.390 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.590 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:47 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:47.695 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.936 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.937 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.937 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:05:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:47.937 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:05:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:05:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:05:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:05:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:05:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:05:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:05:48 np0005538513.localdomain ceph-mon[292954]: osdmap e153: 6 total, 6 up, 6 in
Nov 28 10:05:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/692503189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3353686040' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:48 np0005538513.localdomain podman[323134]: 
Nov 28 10:05:48 np0005538513.localdomain podman[323134]: 2025-11-28 10:05:48.291535747 +0000 UTC m=+0.090948616 container create c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:05:48 np0005538513.localdomain systemd[1]: Started libpod-conmon-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb.scope.
Nov 28 10:05:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:05:48 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:48.345 2 INFO neutron.agent.securitygroups_rpc [None req-7a28cb1b-8d30-40dd-8ac7-551ea3c1b87c e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:48 np0005538513.localdomain podman[323134]: 2025-11-28 10:05:48.248251378 +0000 UTC m=+0.047664287 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:48 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:48 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a09506b8e900e354d47418961d4d79668634f1edb0c1cb9bb0d739ec90a1c16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:48 np0005538513.localdomain podman[323134]: 2025-11-28 10:05:48.366379932 +0000 UTC m=+0.165792801 container init c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:05:48 np0005538513.localdomain podman[323134]: 2025-11-28 10:05:48.379958552 +0000 UTC m=+0.179371411 container start c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 10:05:48 np0005538513.localdomain dnsmasq[323161]: started, version 2.85 cachesize 150
Nov 28 10:05:48 np0005538513.localdomain dnsmasq[323161]: DNS service limited to local subnets
Nov 28 10:05:48 np0005538513.localdomain dnsmasq[323161]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:48 np0005538513.localdomain dnsmasq[323161]: warning: no upstream servers configured
Nov 28 10:05:48 np0005538513.localdomain dnsmasq-dhcp[323161]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Nov 28 10:05:48 np0005538513.localdomain dnsmasq[323161]: read /var/lib/neutron/dhcp/3f532ea4-a0de-4113-8993-33f982144ec8/addn_hosts - 0 addresses
Nov 28 10:05:48 np0005538513.localdomain dnsmasq-dhcp[323161]: read /var/lib/neutron/dhcp/3f532ea4-a0de-4113-8993-33f982144ec8/host
Nov 28 10:05:48 np0005538513.localdomain dnsmasq-dhcp[323161]: read /var/lib/neutron/dhcp/3f532ea4-a0de-4113-8993-33f982144ec8/opts
Nov 28 10:05:48 np0005538513.localdomain podman[323150]: 2025-11-28 10:05:48.450103681 +0000 UTC m=+0.095879668 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:05:48 np0005538513.localdomain podman[323150]: 2025-11-28 10:05:48.487726019 +0000 UTC m=+0.133501996 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Nov 28 10:05:48 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:05:48 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:48.666 261084 INFO neutron.agent.dhcp.agent [None req-978e41c3-87f3-432c-afc5-db2f2950207a - - - - - -] DHCP configuration for ports {'a1556cab-1b8b-43ba-b3a0-dfbacf198240'} is completed
Nov 28 10:05:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:48.921 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:05:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:48.938 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:05:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:48.938 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:05:48 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:48.994 2 INFO neutron.agent.securitygroups_rpc [None req-0fb2cb74-ebe1-4595-9145-49dcf60353ff 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e153 do_prune osdmap full prune enabled
Nov 28 10:05:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e154 e154: 6 total, 6 up, 6 in
Nov 28 10:05:49 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e154: 6 total, 6 up, 6 in
Nov 28 10:05:49 np0005538513.localdomain ceph-mon[292954]: pgmap v295: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 921 B/s wr, 29 op/s
Nov 28 10:05:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1333398748' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:49 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:49.395 2 INFO neutron.agent.securitygroups_rpc [None req-a6f5628a-dc81-40c9-a579-68393376dce2 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:49 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:49.485 261084 INFO neutron.agent.linux.ip_lib [None req-140c8cb8-d4ff-4fd0-803b-53ad9954a08e - - - - - -] Device tap3be78940-7b cannot be used as it has no MAC address
Nov 28 10:05:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:49.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:49 np0005538513.localdomain kernel: device tap3be78940-7b entered promiscuous mode
Nov 28 10:05:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:49Z|00360|binding|INFO|Claiming lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 for this chassis.
Nov 28 10:05:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:49Z|00361|binding|INFO|3be78940-7b85-4f58-98c9-0b59e055c9b7: Claiming unknown
Nov 28 10:05:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:49.522 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:49 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324349.5237] manager: (tap3be78940-7b): new Generic device (/org/freedesktop/NetworkManager/Devices/60)
Nov 28 10:05:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:49.536 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e31ec5-87c8-4c59-84a8-d6708f5a124b, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=3be78940-7b85-4f58-98c9-0b59e055c9b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:49.538 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3be78940-7b85-4f58-98c9-0b59e055c9b7 in datapath 7d1af60e-6636-42cc-a949-e5df247a624f bound to our chassis
Nov 28 10:05:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:49.540 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d1af60e-6636-42cc-a949-e5df247a624f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:49.541 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8d717c0f-5a38-491b-b01f-92ee930894c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:49Z|00362|binding|INFO|Setting lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 ovn-installed in OVS
Nov 28 10:05:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:49Z|00363|binding|INFO|Setting lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 up in Southbound
Nov 28 10:05:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:49.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:49.610 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:49.638 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:50 np0005538513.localdomain ceph-mon[292954]: osdmap e154: 6 total, 6 up, 6 in
Nov 28 10:05:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3995934598' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:50 np0005538513.localdomain podman[323234]: 
Nov 28 10:05:50 np0005538513.localdomain podman[323234]: 2025-11-28 10:05:50.575681988 +0000 UTC m=+0.096597910 container create 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:05:50 np0005538513.localdomain podman[323234]: 2025-11-28 10:05:50.526626791 +0000 UTC m=+0.047542703 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:50 np0005538513.localdomain systemd[1]: Started libpod-conmon-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b.scope.
Nov 28 10:05:50 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:50 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02a1d6e207201b1173b4f89a5144ddbc385ac151e3437841e7a59d19a4acb788/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:50 np0005538513.localdomain podman[323234]: 2025-11-28 10:05:50.664462801 +0000 UTC m=+0.185378643 container init 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:05:50 np0005538513.localdomain podman[323234]: 2025-11-28 10:05:50.676245989 +0000 UTC m=+0.197161841 container start 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:05:50 np0005538513.localdomain dnsmasq[323252]: started, version 2.85 cachesize 150
Nov 28 10:05:50 np0005538513.localdomain dnsmasq[323252]: DNS service limited to local subnets
Nov 28 10:05:50 np0005538513.localdomain dnsmasq[323252]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:50 np0005538513.localdomain dnsmasq[323252]: warning: no upstream servers configured
Nov 28 10:05:50 np0005538513.localdomain dnsmasq-dhcp[323252]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:05:50 np0005538513.localdomain dnsmasq[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/addn_hosts - 0 addresses
Nov 28 10:05:50 np0005538513.localdomain dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/host
Nov 28 10:05:50 np0005538513.localdomain dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/opts
Nov 28 10:05:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:50.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:50.844 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:05:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:50.844 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:05:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:05:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:50.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:50.908 261084 INFO neutron.agent.dhcp.agent [None req-abf16529-5afa-4f6f-8b54-d755a04cfef7 - - - - - -] DHCP configuration for ports {'6a3df323-6735-4de7-800b-c26cf8d05b74'} is completed
Nov 28 10:05:51 np0005538513.localdomain ceph-mon[292954]: pgmap v297: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 511 B/s wr, 1 op/s
Nov 28 10:05:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1247585138' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1247585138' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:51.792 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:05:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:52Z|00364|binding|INFO|Releasing lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 from this chassis (sb_readonly=0)
Nov 28 10:05:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:52Z|00365|binding|INFO|Setting lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 down in Southbound
Nov 28 10:05:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:52.814 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:52 np0005538513.localdomain kernel: device tap3be78940-7b left promiscuous mode
Nov 28 10:05:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:52.824 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 5c0b54aa-cfb1-4d71-9aea-bd9c3487eadc with type ""
Nov 28 10:05:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:52.826 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e31ec5-87c8-4c59-84a8-d6708f5a124b, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=3be78940-7b85-4f58-98c9-0b59e055c9b7) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:52.827 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3be78940-7b85-4f58-98c9-0b59e055c9b7 in datapath 7d1af60e-6636-42cc-a949-e5df247a624f unbound from our chassis
Nov 28 10:05:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:52.830 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d1af60e-6636-42cc-a949-e5df247a624f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:52 np0005538513.localdomain systemd[1]: tmp-crun.3XmyZs.mount: Deactivated successfully.
Nov 28 10:05:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:52.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:52 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:52.838 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[23d8df19-03ee-4b63-adfd-baa4ec6885fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:52 np0005538513.localdomain podman[323253]: 2025-11-28 10:05:52.839810483 +0000 UTC m=+0.068335268 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:05:52 np0005538513.localdomain podman[323253]: 2025-11-28 10:05:52.850067318 +0000 UTC m=+0.078592113 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:05:52 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:05:53 np0005538513.localdomain dnsmasq[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/addn_hosts - 0 addresses
Nov 28 10:05:53 np0005538513.localdomain dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/host
Nov 28 10:05:53 np0005538513.localdomain dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/opts
Nov 28 10:05:53 np0005538513.localdomain podman[323296]: 2025-11-28 10:05:53.365479766 +0000 UTC m=+0.074573788 container kill 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:05:53 np0005538513.localdomain ceph-mon[292954]: pgmap v298: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.2 KiB/s wr, 55 op/s
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent [None req-382043f7-0b96-4c26-ae87-5eeb4cb6913b - - - - - -] Unable to reload_allocations dhcp for 7d1af60e-6636-42cc-a949-e5df247a624f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap3be78940-7b not found in namespace qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f.
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap3be78940-7b not found in namespace qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f.
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.394 261084 INFO neutron.agent.dhcp.agent [None req-6b0cd380-e608-4081-8967-4a3f74b64491 - - - - - -] Synchronizing state
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.657 261084 INFO neutron.agent.dhcp.agent [None req-3892baa5-d454-46db-a6f6-03a5e0466c91 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.658 261084 INFO neutron.agent.dhcp.agent [-] Starting network 7a711d7c-ff53-41f7-b3e7-fa55f4315988 dhcp configuration
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.662 261084 INFO neutron.agent.dhcp.agent [-] Starting network 7d1af60e-6636-42cc-a949-e5df247a624f dhcp configuration
Nov 28 10:05:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.663 261084 INFO neutron.agent.dhcp.agent [-] Finished network 7d1af60e-6636-42cc-a949-e5df247a624f dhcp configuration
Nov 28 10:05:53 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:53Z|00366|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:05:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:53.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:53.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:05:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:53.829 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:53 np0005538513.localdomain systemd[1]: tmp-crun.iFIAak.mount: Deactivated successfully.
Nov 28 10:05:54 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:54.820 261084 INFO neutron.agent.linux.ip_lib [None req-ff7760ed-79c9-4fef-bbfc-d4d324a1e55d - - - - - -] Device tap66bb996f-a9 cannot be used as it has no MAC address
Nov 28 10:05:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:54.879 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:54 np0005538513.localdomain kernel: device tap66bb996f-a9 entered promiscuous mode
Nov 28 10:05:54 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324354.8864] manager: (tap66bb996f-a9): new Generic device (/org/freedesktop/NetworkManager/Devices/61)
Nov 28 10:05:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:54Z|00367|binding|INFO|Claiming lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb for this chassis.
Nov 28 10:05:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:54Z|00368|binding|INFO|66bb996f-a921-4ae6-b26d-1be8fa01c3bb: Claiming unknown
Nov 28 10:05:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:54.889 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:54 np0005538513.localdomain systemd-udevd[323319]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:54.925 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:54.927 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:54Z|00369|binding|INFO|Setting lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb ovn-installed in OVS
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap66bb996f-a9: No such device
Nov 28 10:05:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:54Z|00370|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:05:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:54.957 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a67d7f32f5e49c3aed3e09278dd6c95', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8ed49a-5d05-48e5-b507-755a90a6ebc7, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=66bb996f-a921-4ae6-b26d-1be8fa01c3bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:54.959 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 66bb996f-a921-4ae6-b26d-1be8fa01c3bb in datapath 7a711d7c-ff53-41f7-b3e7-fa55f4315988 bound to our chassis
Nov 28 10:05:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:54.960 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7a711d7c-ff53-41f7-b3e7-fa55f4315988 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:54 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:54.961 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1125f8f2-855d-4654-a223-aaace0c59bde]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:54 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:54Z|00371|binding|INFO|Setting lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb up in Southbound
Nov 28 10:05:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:54.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:55.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e154 do_prune osdmap full prune enabled
Nov 28 10:05:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e155 e155: 6 total, 6 up, 6 in
Nov 28 10:05:55 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e155: 6 total, 6 up, 6 in
Nov 28 10:05:55 np0005538513.localdomain ceph-mon[292954]: pgmap v299: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.2 KiB/s wr, 55 op/s
Nov 28 10:05:55 np0005538513.localdomain ceph-mon[292954]: osdmap e155: 6 total, 6 up, 6 in
Nov 28 10:05:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:55.769 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:55.935 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538513.localdomain podman[323390]: 
Nov 28 10:05:55 np0005538513.localdomain podman[323390]: 2025-11-28 10:05:55.956854038 +0000 UTC m=+0.106853112 container create a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:55 np0005538513.localdomain podman[323390]: 2025-11-28 10:05:55.898501996 +0000 UTC m=+0.048501110 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: Started libpod-conmon-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33.scope.
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:56 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60013bec6e27ad1923a5b5d469c77907300db4f1db841f191084d98ab2e16893/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:56 np0005538513.localdomain podman[323390]: 2025-11-28 10:05:56.05150597 +0000 UTC m=+0.201505054 container init a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:56 np0005538513.localdomain podman[323390]: 2025-11-28 10:05:56.061191438 +0000 UTC m=+0.211190532 container start a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:05:56 np0005538513.localdomain dnsmasq[323421]: started, version 2.85 cachesize 150
Nov 28 10:05:56 np0005538513.localdomain dnsmasq[323421]: DNS service limited to local subnets
Nov 28 10:05:56 np0005538513.localdomain dnsmasq[323421]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:56 np0005538513.localdomain dnsmasq[323421]: warning: no upstream servers configured
Nov 28 10:05:56 np0005538513.localdomain dnsmasq-dhcp[323421]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 28 10:05:56 np0005538513.localdomain dnsmasq[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/addn_hosts - 0 addresses
Nov 28 10:05:56 np0005538513.localdomain dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/host
Nov 28 10:05:56 np0005538513.localdomain dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/opts
Nov 28 10:05:56 np0005538513.localdomain podman[323407]: 2025-11-28 10:05:56.106894227 +0000 UTC m=+0.084119401 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:56 np0005538513.localdomain podman[323407]: 2025-11-28 10:05:56.114577597 +0000 UTC m=+0.091802881 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:05:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:56.125 261084 INFO neutron.agent.dhcp.agent [None req-08eec445-ad07-4d1d-afd9-ae6effb40357 - - - - - -] Finished network 7a711d7c-ff53-41f7-b3e7-fa55f4315988 dhcp configuration
Nov 28 10:05:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:56.125 261084 INFO neutron.agent.dhcp.agent [None req-3892baa5-d454-46db-a6f6-03a5e0466c91 - - - - - -] Synchronizing state complete
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:05:56 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:56.580 261084 INFO neutron.agent.dhcp.agent [None req-b7526cda-febb-4dfc-a15f-2e223030fd8b - - - - - -] DHCP configuration for ports {'cbfe47f5-af35-45dd-ae4e-d52ae43817f1'} is completed
Nov 28 10:05:56 np0005538513.localdomain dnsmasq[323252]: exiting on receipt of SIGTERM
Nov 28 10:05:56 np0005538513.localdomain podman[323442]: 2025-11-28 10:05:56.596131396 +0000 UTC m=+0.060237988 container kill 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: libpod-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b.scope: Deactivated successfully.
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:05:56 np0005538513.localdomain podman[323456]: 2025-11-28 10:05:56.68074898 +0000 UTC m=+0.063745127 container died 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:05:56 np0005538513.localdomain podman[323456]: 2025-11-28 10:05:56.737315621 +0000 UTC m=+0.120311728 container remove 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: libpod-conmon-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b.scope: Deactivated successfully.
Nov 28 10:05:56 np0005538513.localdomain podman[323460]: 2025-11-28 10:05:56.779489529 +0000 UTC m=+0.154235090 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:05:56 np0005538513.localdomain podman[323460]: 2025-11-28 10:05:56.850610388 +0000 UTC m=+0.225355929 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:05:56 np0005538513.localdomain dnsmasq[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/addn_hosts - 0 addresses
Nov 28 10:05:56 np0005538513.localdomain dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/host
Nov 28 10:05:56 np0005538513.localdomain dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/opts
Nov 28 10:05:56 np0005538513.localdomain podman[323517]: 2025-11-28 10:05:56.872647949 +0000 UTC m=+0.045300359 container kill a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-02a1d6e207201b1173b4f89a5144ddbc385ac151e3437841e7a59d19a4acb788-merged.mount: Deactivated successfully.
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:56 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d7d1af60e\x2d6636\x2d42cc\x2da949\x2de5df247a624f.mount: Deactivated successfully.
Nov 28 10:05:57 np0005538513.localdomain ceph-mon[292954]: pgmap v301: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 1.7 KiB/s wr, 54 op/s
Nov 28 10:05:57 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:57.601 261084 INFO neutron.agent.dhcp.agent [None req-a64e4494-7ff3-4570-a9a3-5f3560fa9b29 - - - - - -] DHCP configuration for ports {'cbfe47f5-af35-45dd-ae4e-d52ae43817f1', '66bb996f-a921-4ae6-b26d-1be8fa01c3bb'} is completed
Nov 28 10:05:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:05:57 np0005538513.localdomain systemd[1]: tmp-crun.yTagFs.mount: Deactivated successfully.
Nov 28 10:05:57 np0005538513.localdomain podman[323539]: 2025-11-28 10:05:57.88297835 +0000 UTC m=+0.110606530 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:05:57 np0005538513.localdomain podman[323539]: 2025-11-28 10:05:57.894478819 +0000 UTC m=+0.122107019 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:57 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:05:59 np0005538513.localdomain dnsmasq[323421]: exiting on receipt of SIGTERM
Nov 28 10:05:59 np0005538513.localdomain podman[323576]: 2025-11-28 10:05:59.052127842 +0000 UTC m=+0.067736332 container kill a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:05:59 np0005538513.localdomain systemd[1]: libpod-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33.scope: Deactivated successfully.
Nov 28 10:05:59 np0005538513.localdomain podman[323588]: 2025-11-28 10:05:59.113670116 +0000 UTC m=+0.051157948 container died a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:05:59 np0005538513.localdomain systemd[1]: tmp-crun.fLLzXW.mount: Deactivated successfully.
Nov 28 10:05:59 np0005538513.localdomain podman[323588]: 2025-11-28 10:05:59.163323428 +0000 UTC m=+0.100811200 container cleanup a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:05:59 np0005538513.localdomain systemd[1]: libpod-conmon-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33.scope: Deactivated successfully.
Nov 28 10:05:59 np0005538513.localdomain podman[323595]: 2025-11-28 10:05:59.189505979 +0000 UTC m=+0.113949097 container remove a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:05:59 np0005538513.localdomain kernel: device tap66bb996f-a9 left promiscuous mode
Nov 28 10:05:59 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:59Z|00372|binding|INFO|Releasing lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb from this chassis (sb_readonly=0)
Nov 28 10:05:59 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:05:59Z|00373|binding|INFO|Setting lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb down in Southbound
Nov 28 10:05:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:59.248 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:59.263 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a67d7f32f5e49c3aed3e09278dd6c95', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8ed49a-5d05-48e5-b507-755a90a6ebc7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=66bb996f-a921-4ae6-b26d-1be8fa01c3bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:59.265 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 66bb996f-a921-4ae6-b26d-1be8fa01c3bb in datapath 7a711d7c-ff53-41f7-b3e7-fa55f4315988 unbound from our chassis
Nov 28 10:05:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:59.267 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a711d7c-ff53-41f7-b3e7-fa55f4315988, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:59 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:05:59.269 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[acc81291-13c4-4112-9f07-0dedd6bb64e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:05:59.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:59 np0005538513.localdomain ceph-mon[292954]: pgmap v302: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 2.1 KiB/s wr, 52 op/s
Nov 28 10:05:59 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1783542772' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:59 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1783542772' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:59.513 261084 INFO neutron.agent.dhcp.agent [None req-fb132a90-9367-460c-abde-fcbdc48eefd9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:59 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:05:59.711 2 INFO neutron.agent.securitygroups_rpc [None req-eb7cbd19-2bf0-4a29-9cd5-8cfb0b13af7b 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:05:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:05:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/753870557' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:05:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:05:59.767 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-60013bec6e27ad1923a5b5d469c77907300db4f1db841f191084d98ab2e16893-merged.mount: Deactivated successfully.
Nov 28 10:06:00 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:00 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d7a711d7c\x2dff53\x2d41f7\x2db3e7\x2dfa55f4315988.mount: Deactivated successfully.
Nov 28 10:06:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:00.253 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e155 do_prune osdmap full prune enabled
Nov 28 10:06:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e156 e156: 6 total, 6 up, 6 in
Nov 28 10:06:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/753870557' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2237297014' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2237297014' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:00 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e156: 6 total, 6 up, 6 in
Nov 28 10:06:00 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:00Z|00374|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.676 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:00.737 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 17020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd845317-f579-404d-96e0-6d703e41e21d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17020000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:06:00.676394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd7690280-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.912291188, 'message_signature': '1844171b68b2ae6855ac910be045638da2f2754db42f228ebb574fdebf01bbdc'}]}, 'timestamp': '2025-11-28 10:06:00.741242', '_unique_id': 'f3479e4248ba43f2869d91279067b75e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.744 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '879335fb-2cfc-4e52-9033-4339135b3d7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.744211', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd76a0ef0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '4fecac80c08f67fb6277b0811d4dadb1217cc5804b80e7f0427d67c353ebed38'}]}, 'timestamp': '2025-11-28 10:06:00.748076', '_unique_id': '66fb038e4b634e939f7af736678db402'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:00.771 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a6c8efe-e1f2-40cc-bc24-50ed781294e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.750642', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd76f2ebc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '81cc3aca4720f5d3541a2d5da9bfd1e7bdebb1cb7918e026d3299ff2ddc00513'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.750642', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd76f4078-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '2b6c56439d5e2f3135258ee0072d4a4124468f0e35bcb08b44bdc3910e5e2880'}]}, 'timestamp': '2025-11-28 10:06:00.782121', '_unique_id': '186081e5f5fa490db5b4cc07c4d3cb81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4124fe0d-c212-4775-9be8-c4a7329fa4a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.784585', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd76fb580-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '2c54d53b0decc992e22f523c114c8e0860f32a2664e110d3485c123ac924e057'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.784585', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd76fc778-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '37968bb3d9e549caaf0510195022c1368e06b96c1e6322546bddf3302475a5b6'}]}, 'timestamp': '2025-11-28 10:06:00.785504', '_unique_id': 'a95b5535056240bb961c7cc9a658ebc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.787 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3917536-1da7-4cc2-990f-94742f2fcd29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.787811', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7703500-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '7cd043d6ffce00bdd548c43bcbf66ae1208774d2820273da2b2f8758fefee3f1'}]}, 'timestamp': '2025-11-28 10:06:00.788335', '_unique_id': '9cee908666bc45fd92f70e87f919339e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02a17a44-ea63-4357-9fdc-20ec71a48895', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.790530', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7709dd8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'fcb4c4ce24fe48d9d23c21a3b2dad1bda7ee8b9db960c2c4a9f79bbcefb957f5'}]}, 'timestamp': '2025-11-28 10:06:00.791048', '_unique_id': 'f3a347efd73c4552ae9a2ab3c27cd371'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.793 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.793 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.793 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34cbd06f-8184-4e5e-868f-b305dd3bbc1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.793455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd77112e0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'bb493ea1c13d67268098da1445e732b8253e9ef8e0d83636ea8ad0dbb0fe888e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.793455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd77126a4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'eb5f79782ff529fda20c477ad9da771f6c950152e3d91cfe47d49aaf320ee386'}]}, 'timestamp': '2025-11-28 10:06:00.794496', '_unique_id': '9f59f84c3eba4fe8a1280b472db9c17b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3667fa21-f424-434b-a928-d9c206dba987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.796929', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7719bc0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '9a00c65c12601944aab5e12d471a78931066d9a9054ce4b8e92f7593a4d90776'}]}, 'timestamp': '2025-11-28 10:06:00.797519', '_unique_id': '10b32e8f57b6426aa1b8347047b276da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9339392-0219-474a-b3c3-e6a4c4b7ff2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.799702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd77203b2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '79021b3188ca7d4bffb13b98fca39f2bbd4d0de2a0ea9566fab8e6c2d0f6e8b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.799702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd77215a0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '10ce026b81590cacab029d55c2933fa0afcc8a115e108aa36ba24deb3722ea5c'}]}, 'timestamp': '2025-11-28 10:06:00.800608', '_unique_id': '506a3e2be93c43aaa87557d1d3061f9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db16ea7b-b393-4168-a16c-d5a2d9b22b4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.802819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7727ebe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'ca09f32a0e63b6ef48143297756aef36c636f2d05b3e40127d3dc02d6c479a0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.802819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd7728f58-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'f314a57bb2db18c69b5adb2d8f4ce13e7c4e14fbf5bc03622674a45c1bbed950'}]}, 'timestamp': '2025-11-28 10:06:00.803722', '_unique_id': 'fd012c33038242d5bf66d338b09a3e3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.805 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a27944d1-8f9f-4f7a-81b4-bd49d28ce295', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:06:00.805928', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd772f880-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.912291188, 'message_signature': '1b9b2ccabe1ee71474fbfed0814a4af03caccc3a3a638a8e44ff7932bbc520a2'}]}, 'timestamp': '2025-11-28 10:06:00.806429', '_unique_id': '51aaa6b58a1a4c48871fbc276671d9af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.821 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.821 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '679f0caf-dc09-4356-8178-655023860816', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.808609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7755008-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '91f9b9bc58a7044ff39de526d589933619f2db5afad0fd47ade18099e8b54590'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.808609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd775639a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '372aad769ccde2111fb794dffb932292136b322662b3b64ebedc0e74dc58cb08'}]}, 'timestamp': '2025-11-28 10:06:00.822265', '_unique_id': 'f29a5b98d4dd49dfb45a7be981ec5433'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.824 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.824 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.825 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb767f03-c0aa-402f-b296-6646cc6cfb4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.824687', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd775d4b0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '6678bab04d9988a803c86f6dace1dcfd4f3b48cb9bec625a81e79335722fad96'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.824687', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd775e798-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '2a216f0d2707095498965e1c00c8b5c2d5596b20da5dbd6189e5be37658be4e7'}]}, 'timestamp': '2025-11-28 10:06:00.825646', '_unique_id': '3e8e5f7ace014151b692c42be2e9c0f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.828 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.828 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a1d45ca-54ed-4ea9-b4c3-663d49e8229c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.828381', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7766434-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '642771d5b258c5d8000e553764658be0ea14e0d4102a84ebc22bf87bb87f85d0'}]}, 'timestamp': '2025-11-28 10:06:00.828952', '_unique_id': 'a16cdb7b60ae482dbe17a43b9526f7aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.831 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.831 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4d6ba9f-2c45-4cc5-b67e-fc1e7702d4cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.831302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd776d61c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '141e6e9a77e9f23b4a77f0436858162518aaa0e13eec7c6c98340611a6d7b2f6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.831302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd776eae4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '21c6222ea854d17feed872af83386dd4c114557b3f1aff5239c8199a6b7c298b'}]}, 'timestamp': '2025-11-28 10:06:00.832298', '_unique_id': 'a400443d65df406c8e15ef1f012bb4a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.834 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.834 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.835 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b39f1238-cf13-4112-a0e9-fe4eab580e56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.834730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7775c22-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'b47d928d11de7813fa671747d80f464be49bc0ff691eafd299166be088bdf9ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.834730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd77772de-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '133cd7f471866d255a6398dfdf6960f12af71f08d0046c3c6f32b64bdff02ecb'}]}, 'timestamp': '2025-11-28 10:06:00.835766', '_unique_id': '0996e0723681437d81ad40e78d60ef16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.838 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.838 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '246dadc9-cd9d-4bec-bb6e-b498b59141c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.838577', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd777f286-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '44d2fc2ec72d85890369e61684cf34e4b58d2d0ede1c10f65ddd3e873d3c2cb3'}]}, 'timestamp': '2025-11-28 10:06:00.839093', '_unique_id': 'f52ab51d20164396a1cb36daad93516c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.841 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.841 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94520f3b-d4e1-443e-9a7a-61e5d4b39dbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.841715', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7786d9c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '3ea228459018d04d3883cd2b3d7ed38d3cbdab2dde345b11221bda67c96c3c33'}]}, 'timestamp': '2025-11-28 10:06:00.842249', '_unique_id': 'cff874e6bffc4be9824f4c50760e0ea3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging 
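The innermost frame of the traceback above fails in `amqp/transport.py` at `self.sock.connect(sa)` with `[Errno 111] Connection refused`: nothing is accepting TCP connections on the broker's host and port, so the failure is below the AMQP protocol layer (service down or a firewall REJECT), not an authentication or vhost problem. A minimal standard-library sketch, independent of the deployment, showing how that errno surfaces from a plain socket connect — the host and port below are placeholders (5672 is RabbitMQ's default AMQP port); in a real check, substitute the host from `transport_url` in ceilometer.conf:

```python
import errno
import socket

def probe(host: str, port: int, timeout: float = 1.0) -> int:
    """Return 0 if the TCP connect succeeds, else the socket errno."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(timeout)
    try:
        sock.connect((host, port))
        return 0
    except OSError as exc:
        # ConnectionRefusedError carries errno 111 (ECONNREFUSED) on Linux,
        # the same value reported in the log's traceback; a plain timeout
        # has errno None, mapped to -1 here.
        return exc.errno if exc.errno is not None else -1
    finally:
        sock.close()

if __name__ == "__main__":
    # Placeholder target: localhost on the default AMQP port.
    result = probe("127.0.0.1", 5672)
    print("connect errno:", result, "(ECONNREFUSED =", errno.ECONNREFUSED, ")")
```

If this probe run on the compute node against the broker host returns 111, the next step is on the broker side (is the rabbitmq service listening, and is the port reachable through the firewall), since the client-side retry loop visible in the traceback cannot succeed until the TCP connect does.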
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.844 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.844 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84fa6121-79d8-44da-a042-bd1ee67ae122', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.844109', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd778c6fc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'e074c583572d1b447682ccc3dfa7afa5b0e218e855e70577e97aa103bc57fd50'}]}, 'timestamp': '2025-11-28 10:06:00.844410', '_unique_id': '71ac992c4d7a422e87498b35f95f4403'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba20dd06-eaf6-427d-b8c5-0db2475cc954', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.845891', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7790f04-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'a2a416539b94a62b0c9519f1bd6decec056f666f60a5ac0f054e87695987aa1e'}]}, 'timestamp': '2025-11-28 10:06:00.846257', '_unique_id': 'a5e4255df49d4dd8a8599a5129bf263f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.847 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.847 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8fc0bdf-fc7b-4f98-a66a-c5ddd9b14ce2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.847636', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd779536a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'c2de92a97b06ef51055fca4425cea2c31f720475b6fdb5453814c128803565c2'}]}, 'timestamp': '2025-11-28 10:06:00.848006', '_unique_id': '6116905a2f814d6093545617bda6776e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:06:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:06:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:00.936 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e156 do_prune osdmap full prune enabled
Nov 28 10:06:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e157 e157: 6 total, 6 up, 6 in
Nov 28 10:06:01 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e157: 6 total, 6 up, 6 in
Nov 28 10:06:01 np0005538513.localdomain ceph-mon[292954]: pgmap v303: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 1.8 KiB/s wr, 44 op/s
Nov 28 10:06:01 np0005538513.localdomain ceph-mon[292954]: osdmap e156: 6 total, 6 up, 6 in
Nov 28 10:06:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e157 do_prune osdmap full prune enabled
Nov 28 10:06:02 np0005538513.localdomain ceph-mon[292954]: osdmap e157: 6 total, 6 up, 6 in
Nov 28 10:06:02 np0005538513.localdomain ceph-mon[292954]: pgmap v306: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 3.2 MiB/s wr, 125 op/s
Nov 28 10:06:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e158 e158: 6 total, 6 up, 6 in
Nov 28 10:06:02 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e158: 6 total, 6 up, 6 in
Nov 28 10:06:02 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:02.857 2 INFO neutron.agent.securitygroups_rpc [None req-e67102ba-fe9d-44cc-8c94-469bc136f71c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:03.015 2 INFO neutron.agent.securitygroups_rpc [None req-e67102ba-fe9d-44cc-8c94-469bc136f71c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:03 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e158 do_prune osdmap full prune enabled
Nov 28 10:06:03 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e159 e159: 6 total, 6 up, 6 in
Nov 28 10:06:03 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e159: 6 total, 6 up, 6 in
Nov 28 10:06:03 np0005538513.localdomain ceph-mon[292954]: osdmap e158: 6 total, 6 up, 6 in
Nov 28 10:06:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:03.622 2 INFO neutron.agent.securitygroups_rpc [None req-06fb4107-5237-41cb-aedf-7ec837b7d0c6 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:04 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:04.288 2 INFO neutron.agent.securitygroups_rpc [None req-f2f37d61-bf24-49e1-9d4d-20b98269f559 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:04 np0005538513.localdomain ceph-mon[292954]: osdmap e159: 6 total, 6 up, 6 in
Nov 28 10:06:04 np0005538513.localdomain ceph-mon[292954]: pgmap v309: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 5.3 MiB/s wr, 206 op/s
Nov 28 10:06:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e159 do_prune osdmap full prune enabled
Nov 28 10:06:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e160 e160: 6 total, 6 up, 6 in
Nov 28 10:06:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in
Nov 28 10:06:05 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:05.147 261084 INFO neutron.agent.linux.ip_lib [None req-50215d9c-a361-42fb-8188-be02379e7e7d - - - - - -] Device tap6b9af304-88 cannot be used as it has no MAC address
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:05 np0005538513.localdomain kernel: device tap6b9af304-88 entered promiscuous mode
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.185 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:05 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:05Z|00375|binding|INFO|Claiming lport 6b9af304-88e0-4384-8907-09d0654dd558 for this chassis.
Nov 28 10:06:05 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324365.1861] manager: (tap6b9af304-88): new Generic device (/org/freedesktop/NetworkManager/Devices/62)
Nov 28 10:06:05 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:05Z|00376|binding|INFO|6b9af304-88e0-4384-8907-09d0654dd558: Claiming unknown
Nov 28 10:06:05 np0005538513.localdomain systemd-udevd[323626]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:05.199 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2c47060-f3f3-4904-844c-d551d6391359, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=6b9af304-88e0-4384-8907-09d0654dd558) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:05.200 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 6b9af304-88e0-4384-8907-09d0654dd558 in datapath 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3 bound to our chassis
Nov 28 10:06:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:05.202 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:05.203 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[63718553-f4b8-440b-baf0-fc1e9f6155bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:05 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:05Z|00377|binding|INFO|Setting lport 6b9af304-88e0-4384-8907-09d0654dd558 ovn-installed in OVS
Nov 28 10:06:05 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:05Z|00378|binding|INFO|Setting lport 6b9af304-88e0-4384-8907-09d0654dd558 up in Southbound
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.237 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.290 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.376 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.774 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:05 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1537040644' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:05 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1537040644' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:05 np0005538513.localdomain ceph-mon[292954]: osdmap e160: 6 total, 6 up, 6 in
Nov 28 10:06:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:05.939 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:06 np0005538513.localdomain podman[323681]: 
Nov 28 10:06:06 np0005538513.localdomain podman[323681]: 2025-11-28 10:06:06.306576 +0000 UTC m=+0.092891902 container create e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:06:06 np0005538513.localdomain systemd[1]: Started libpod-conmon-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6.scope.
Nov 28 10:06:06 np0005538513.localdomain podman[323681]: 2025-11-28 10:06:06.262814947 +0000 UTC m=+0.049130889 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:06 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:06 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7af7ba26fc5497d5193fe60b439f0ab812c8b19ba53610d2ee58bf5d35d015e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:06 np0005538513.localdomain podman[323681]: 2025-11-28 10:06:06.378908393 +0000 UTC m=+0.165224305 container init e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:06:06 np0005538513.localdomain podman[323681]: 2025-11-28 10:06:06.389390904 +0000 UTC m=+0.175706806 container start e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:06 np0005538513.localdomain dnsmasq[323699]: started, version 2.85 cachesize 150
Nov 28 10:06:06 np0005538513.localdomain dnsmasq[323699]: DNS service limited to local subnets
Nov 28 10:06:06 np0005538513.localdomain dnsmasq[323699]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:06 np0005538513.localdomain dnsmasq[323699]: warning: no upstream servers configured
Nov 28 10:06:06 np0005538513.localdomain dnsmasq-dhcp[323699]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:06 np0005538513.localdomain dnsmasq[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/addn_hosts - 0 addresses
Nov 28 10:06:06 np0005538513.localdomain dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/host
Nov 28 10:06:06 np0005538513.localdomain dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/opts
Nov 28 10:06:06 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:06.531 261084 INFO neutron.agent.dhcp.agent [None req-25f77893-cfe2-4169-b8c6-84d3c462c28e - - - - - -] DHCP configuration for ports {'b362d99b-cd7e-4279-a644-ce03e6e8ad3e'} is completed
Nov 28 10:06:07 np0005538513.localdomain ceph-mon[292954]: pgmap v311: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 5.0 MiB/s wr, 194 op/s
Nov 28 10:06:07 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:07Z|00379|binding|INFO|Removing iface tap6b9af304-88 ovn-installed in OVS
Nov 28 10:06:07 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:07Z|00380|binding|INFO|Removing lport 6b9af304-88e0-4384-8907-09d0654dd558 ovn-installed in OVS
Nov 28 10:06:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:07.591 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8be6fc43-4d37-4bea-ad0e-c8c9afe7f73d with type ""
Nov 28 10:06:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:07.592 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:07.593 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2c47060-f3f3-4904-844c-d551d6391359, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=6b9af304-88e0-4384-8907-09d0654dd558) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:07.596 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 6b9af304-88e0-4384-8907-09d0654dd558 in datapath 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3 unbound from our chassis
Nov 28 10:06:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:07.600 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:07.601 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:07.601 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4378f71d-7c4c-4010-891e-045354d9f477]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:07.605 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:07 np0005538513.localdomain kernel: device tap6b9af304-88 left promiscuous mode
Nov 28 10:06:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:07.625 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:08 np0005538513.localdomain dnsmasq[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/addn_hosts - 0 addresses
Nov 28 10:06:08 np0005538513.localdomain dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/host
Nov 28 10:06:08 np0005538513.localdomain dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/opts
Nov 28 10:06:08 np0005538513.localdomain podman[323719]: 2025-11-28 10:06:08.314114615 +0000 UTC m=+0.060852805 container kill e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent [None req-d45746ab-e98a-471b-b415-935bfb9a74da - - - - - -] Unable to reload_allocations dhcp for 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap6b9af304-88 not found in namespace qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3.
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap6b9af304-88 not found in namespace qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3.
Nov 28 10:06:08 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:06:08 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:08Z|00381|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:08 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:08.643 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:06:08 np0005538513.localdomain podman[323732]: 2025-11-28 10:06:08.853283764 +0000 UTC m=+0.084858582 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:06:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:06:08 np0005538513.localdomain podman[323732]: 2025-11-28 10:06:08.871651521 +0000 UTC m=+0.103226319 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:06:08 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:06:08 np0005538513.localdomain podman[323755]: 2025-11-28 10:06:08.976421923 +0000 UTC m=+0.099046449 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:06:08 np0005538513.localdomain podman[323755]: 2025-11-28 10:06:08.991498145 +0000 UTC m=+0.114122711 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:06:09 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:06:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e160 do_prune osdmap full prune enabled
Nov 28 10:06:09 np0005538513.localdomain ceph-mon[292954]: pgmap v312: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.5 KiB/s wr, 50 op/s
Nov 28 10:06:09 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3328946498' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e161 e161: 6 total, 6 up, 6 in
Nov 28 10:06:09 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e161: 6 total, 6 up, 6 in
Nov 28 10:06:09 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:09.390 2 INFO neutron.agent.securitygroups_rpc [None req-88638c5c-0267-4c55-b76f-e8bedf4b6799 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:09 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:09.411 261084 INFO neutron.agent.linux.ip_lib [None req-df36a7d6-c202-4395-8eec-27213b0f90c7 - - - - - -] Device tap9f28414a-bc cannot be used as it has no MAC address
Nov 28 10:06:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:09.446 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:09 np0005538513.localdomain kernel: device tap9f28414a-bc entered promiscuous mode
Nov 28 10:06:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:09.456 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:09 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324369.4565] manager: (tap9f28414a-bc): new Generic device (/org/freedesktop/NetworkManager/Devices/63)
Nov 28 10:06:09 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:09Z|00382|binding|INFO|Claiming lport 9f28414a-bc83-4ccc-ac84-3585c39e468a for this chassis.
Nov 28 10:06:09 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:09Z|00383|binding|INFO|9f28414a-bc83-4ccc-ac84-3585c39e468a: Claiming unknown
Nov 28 10:06:09 np0005538513.localdomain systemd-udevd[323784]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:09 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:09.468 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05816226-956c-45ae-8b67-7c74d141697e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=9f28414a-bc83-4ccc-ac84-3585c39e468a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:09 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:09.471 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 9f28414a-bc83-4ccc-ac84-3585c39e468a in datapath a31c6261-6aec-4e5b-8552-7f0b3ff5946f bound to our chassis
Nov 28 10:06:09 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:09.475 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port f3273cf3-8b00-4e72-8ef6-318774bfd7b2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:06:09 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:09.475 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a31c6261-6aec-4e5b-8552-7f0b3ff5946f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:09 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:09.477 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a62d72fe-26bf-4ffc-8943-c6a1b24cc38e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:09Z|00384|binding|INFO|Setting lport 9f28414a-bc83-4ccc-ac84-3585c39e468a ovn-installed in OVS
Nov 28 10:06:09 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:09Z|00385|binding|INFO|Setting lport 9f28414a-bc83-4ccc-ac84-3585c39e468a up in Southbound
Nov 28 10:06:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:09.510 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap9f28414a-bc: No such device
Nov 28 10:06:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:09.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:09.601 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:10.073 2 INFO neutron.agent.securitygroups_rpc [None req-58c4368b-992c-4171-8380-96e11b260575 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:06:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:06:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:06:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 161163 "" "Go-http-client/1.1"
Nov 28 10:06:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:06:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20695 "" "Go-http-client/1.1"
Nov 28 10:06:10 np0005538513.localdomain ceph-mon[292954]: osdmap e161: 6 total, 6 up, 6 in
Nov 28 10:06:10 np0005538513.localdomain podman[323855]: 
Nov 28 10:06:10 np0005538513.localdomain podman[323855]: 2025-11-28 10:06:10.560642181 +0000 UTC m=+0.095779582 container create 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 10:06:10 np0005538513.localdomain podman[323855]: 2025-11-28 10:06:10.514083526 +0000 UTC m=+0.049220947 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:10 np0005538513.localdomain systemd[1]: Started libpod-conmon-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354.scope.
Nov 28 10:06:10 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:10 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05b7276827f072b306555cf6cbc24a30c5bb95b18f1dc24627a8bd098febac87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:10 np0005538513.localdomain podman[323855]: 2025-11-28 10:06:10.676462404 +0000 UTC m=+0.211599795 container init 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:06:10 np0005538513.localdomain podman[323855]: 2025-11-28 10:06:10.686255838 +0000 UTC m=+0.221393229 container start 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:10 np0005538513.localdomain dnsmasq[323873]: started, version 2.85 cachesize 150
Nov 28 10:06:10 np0005538513.localdomain dnsmasq[323873]: DNS service limited to local subnets
Nov 28 10:06:10 np0005538513.localdomain dnsmasq[323873]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:10 np0005538513.localdomain dnsmasq[323873]: warning: no upstream servers configured
Nov 28 10:06:10 np0005538513.localdomain dnsmasq-dhcp[323873]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:10 np0005538513.localdomain dnsmasq[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/addn_hosts - 0 addresses
Nov 28 10:06:10 np0005538513.localdomain dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/host
Nov 28 10:06:10 np0005538513.localdomain dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/opts
Nov 28 10:06:10 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:10.753 261084 INFO neutron.agent.dhcp.agent [None req-3892baa5-d454-46db-a6f6-03a5e0466c91 - - - - - -] Synchronizing state
Nov 28 10:06:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:10.778 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:10 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:10.811 261084 INFO neutron.agent.dhcp.agent [None req-8e0eeb55-b1dd-4683-a456-f2bbe435ed76 - - - - - -] DHCP configuration for ports {'806f1b66-c281-4ff2-a156-2f6b3a5062cd'} is completed
Nov 28 10:06:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:10.908 2 INFO neutron.agent.securitygroups_rpc [None req-dbbfa92a-6f48-4219-96e5-450182918d3d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:10 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:10.932 261084 INFO neutron.agent.dhcp.agent [None req-f0ac88e7-c8b1-41b1-80b8-08b0837b50f0 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:06:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:10.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:10.997 2 INFO neutron.agent.securitygroups_rpc [None req-dbbfa92a-6f48-4219-96e5-450182918d3d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e161 do_prune osdmap full prune enabled
Nov 28 10:06:11 np0005538513.localdomain dnsmasq[323699]: exiting on receipt of SIGTERM
Nov 28 10:06:11 np0005538513.localdomain podman[323889]: 2025-11-28 10:06:11.126642211 +0000 UTC m=+0.067399372 container kill e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:06:11 np0005538513.localdomain systemd[1]: libpod-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6.scope: Deactivated successfully.
Nov 28 10:06:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e162 e162: 6 total, 6 up, 6 in
Nov 28 10:06:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e162: 6 total, 6 up, 6 in
Nov 28 10:06:11 np0005538513.localdomain podman[323901]: 2025-11-28 10:06:11.21495442 +0000 UTC m=+0.072165909 container died e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:06:11 np0005538513.localdomain ceph-mon[292954]: pgmap v314: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 2.4 KiB/s wr, 49 op/s
Nov 28 10:06:11 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/144526477' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:11 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/144526477' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:11 np0005538513.localdomain ceph-mon[292954]: osdmap e162: 6 total, 6 up, 6 in
Nov 28 10:06:11 np0005538513.localdomain podman[323901]: 2025-11-28 10:06:11.25458968 +0000 UTC m=+0.111801129 container cleanup e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:06:11 np0005538513.localdomain systemd[1]: libpod-conmon-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6.scope: Deactivated successfully.
Nov 28 10:06:11 np0005538513.localdomain podman[323903]: 2025-11-28 10:06:11.30488493 +0000 UTC m=+0.152446780 container remove e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:06:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:11.316 2 INFO neutron.agent.securitygroups_rpc [None req-a2df05ee-0a35-442d-aa18-eb9de4dda01c e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:11 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:11.338 261084 INFO neutron.agent.dhcp.agent [-] Starting network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 dhcp configuration
Nov 28 10:06:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7af7ba26fc5497d5193fe60b439f0ab812c8b19ba53610d2ee58bf5d35d015e0-merged.mount: Deactivated successfully.
Nov 28 10:06:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:11 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d7aa47953\x2d45b5\x2d4e9e\x2da0ad\x2d6ce1121b65d3.mount: Deactivated successfully.
Nov 28 10:06:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:11.779 2 INFO neutron.agent.securitygroups_rpc [None req-cbfd0cad-e26e-4d58-9458-36c5ead2083c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:11.840 2 INFO neutron.agent.securitygroups_rpc [None req-841a5df4-1f75-4611-8939-235283ca6a97 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:11.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:12 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:12.122 261084 INFO neutron.agent.linux.ip_lib [None req-146628cc-f53a-48e0-97bd-a4c2ecadc8fb - - - - - -] Device tap1bff36cc-f5 cannot be used as it has no MAC address
Nov 28 10:06:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:12.191 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:12 np0005538513.localdomain kernel: device tap1bff36cc-f5 entered promiscuous mode
Nov 28 10:06:12 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324372.2012] manager: (tap1bff36cc-f5): new Generic device (/org/freedesktop/NetworkManager/Devices/64)
Nov 28 10:06:12 np0005538513.localdomain systemd-udevd[323786]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:12Z|00386|binding|INFO|Claiming lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 for this chassis.
Nov 28 10:06:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:12Z|00387|binding|INFO|1bff36cc-f508-4066-a5d7-c55bc5baf4a9: Claiming unknown
Nov 28 10:06:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:12.203 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:12.215 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29d0d5b3ba0745d58aee3845ea704b73', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0295c06-e7c1-42d0-9d25-c6c6ebd15e16, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=1bff36cc-f508-4066-a5d7-c55bc5baf4a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:12.217 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 in datapath 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 bound to our chassis
Nov 28 10:06:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:12.219 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:12 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:12.220 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5a97589f-1ab5-4f9c-b8b0-207ced6eb030]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e162 do_prune osdmap full prune enabled
Nov 28 10:06:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:12Z|00388|binding|INFO|Setting lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 ovn-installed in OVS
Nov 28 10:06:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:12Z|00389|binding|INFO|Setting lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 up in Southbound
Nov 28 10:06:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:12.240 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e163 e163: 6 total, 6 up, 6 in
Nov 28 10:06:12 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/170238483' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:12 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/170238483' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:12 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e163: 6 total, 6 up, 6 in
Nov 28 10:06:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:12.299 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:12.346 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:13 np0005538513.localdomain ceph-mon[292954]: pgmap v316: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 5.9 KiB/s wr, 157 op/s
Nov 28 10:06:13 np0005538513.localdomain ceph-mon[292954]: osdmap e163: 6 total, 6 up, 6 in
Nov 28 10:06:13 np0005538513.localdomain podman[323995]: 
Nov 28 10:06:13 np0005538513.localdomain podman[323995]: 2025-11-28 10:06:13.315766668 +0000 UTC m=+0.087240427 container create a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:06:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:13 np0005538513.localdomain systemd[1]: Started libpod-conmon-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba.scope.
Nov 28 10:06:13 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:13 np0005538513.localdomain podman[323995]: 2025-11-28 10:06:13.272913139 +0000 UTC m=+0.044386908 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:13 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbbc0b5dc4455006807612af40ea366df63f7068109c1199eb3107e33ae30da1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:13 np0005538513.localdomain podman[323995]: 2025-11-28 10:06:13.383008714 +0000 UTC m=+0.154482483 container init a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:13 np0005538513.localdomain podman[323995]: 2025-11-28 10:06:13.394034986 +0000 UTC m=+0.165508705 container start a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:13 np0005538513.localdomain dnsmasq[324012]: started, version 2.85 cachesize 150
Nov 28 10:06:13 np0005538513.localdomain dnsmasq[324012]: DNS service limited to local subnets
Nov 28 10:06:13 np0005538513.localdomain dnsmasq[324012]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:13 np0005538513.localdomain dnsmasq[324012]: warning: no upstream servers configured
Nov 28 10:06:13 np0005538513.localdomain dnsmasq-dhcp[324012]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:13 np0005538513.localdomain dnsmasq[324012]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses
Nov 28 10:06:13 np0005538513.localdomain dnsmasq-dhcp[324012]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 10:06:13 np0005538513.localdomain dnsmasq-dhcp[324012]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 10:06:13 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.468 261084 INFO neutron.agent.dhcp.agent [None req-259ccea0-5b7c-48f2-bda3-af2d63ecdc14 - - - - - -] Finished network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 dhcp configuration
Nov 28 10:06:13 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.470 261084 INFO neutron.agent.dhcp.agent [None req-a54afd89-63c5-4b53-bc59-976538151a8c - - - - - -] Synchronizing state complete
Nov 28 10:06:13 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.473 261084 INFO neutron.agent.dhcp.agent [None req-df8bf941-a81b-43f3-a5d2-ac851bb9d287 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:09Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63db220>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63dbeb0>], id=8971d2b3-e3e8-4058-a520-863fde2aaa63, ip_allocation=immediate, mac_address=fa:16:3e:ea:c0:b6, name=tempest-PortsTestJSON-538865126, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:05Z, description=, dns_domain=, id=a31c6261-6aec-4e5b-8552-7f0b3ff5946f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-2085773191, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9902, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2204, status=ACTIVE, subnets=['3b5c7335-5ab3-4264-870c-37328703e1d1'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:07Z, vlan_transparent=None, network_id=a31c6261-6aec-4e5b-8552-7f0b3ff5946f, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2247, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:09Z on network a31c6261-6aec-4e5b-8552-7f0b3ff5946f
Nov 28 10:06:13 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.701 261084 INFO neutron.agent.dhcp.agent [None req-61afb93c-70a2-4d65-8044-2ca1b5bda457 - - - - - -] DHCP configuration for ports {'2f0cd637-5351-4b69-86b0-00f4d51eccc9'} is completed
Nov 28 10:06:13 np0005538513.localdomain dnsmasq[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/addn_hosts - 1 addresses
Nov 28 10:06:13 np0005538513.localdomain dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/host
Nov 28 10:06:13 np0005538513.localdomain dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/opts
Nov 28 10:06:13 np0005538513.localdomain podman[324030]: 2025-11-28 10:06:13.733186408 +0000 UTC m=+0.070660133 container kill 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:06:13 np0005538513.localdomain dnsmasq[324012]: exiting on receipt of SIGTERM
Nov 28 10:06:13 np0005538513.localdomain systemd[1]: libpod-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba.scope: Deactivated successfully.
Nov 28 10:06:13 np0005538513.localdomain podman[324064]: 2025-11-28 10:06:13.911130129 +0000 UTC m=+0.068351142 container kill a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:13 np0005538513.localdomain podman[324081]: 2025-11-28 10:06:13.98366493 +0000 UTC m=+0.058324871 container died a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:06:14 np0005538513.localdomain podman[324081]: 2025-11-28 10:06:14.021345418 +0000 UTC m=+0.096005320 container cleanup a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: libpod-conmon-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba.scope: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.033 261084 INFO neutron.agent.dhcp.agent [None req-b8fb8199-99b4-4b6d-bf42-b356b5d98cf2 - - - - - -] DHCP configuration for ports {'8971d2b3-e3e8-4058-a520-863fde2aaa63'} is completed
Nov 28 10:06:14 np0005538513.localdomain podman[324083]: 2025-11-28 10:06:14.074425885 +0000 UTC m=+0.137980271 container remove a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:14Z|00390|binding|INFO|Removing iface tap9f28414a-bc ovn-installed in OVS
Nov 28 10:06:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:14Z|00391|binding|INFO|Removing lport 9f28414a-bc83-4ccc-ac84-3585c39e468a ovn-installed in OVS
Nov 28 10:06:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:14.187 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f3273cf3-8b00-4e72-8ef6-318774bfd7b2 with type ""
Nov 28 10:06:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:14.189 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05816226-956c-45ae-8b67-7c74d141697e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=9f28414a-bc83-4ccc-ac84-3585c39e468a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:14.190 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 9f28414a-bc83-4ccc-ac84-3585c39e468a in datapath a31c6261-6aec-4e5b-8552-7f0b3ff5946f unbound from our chassis
Nov 28 10:06:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:14.194 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a31c6261-6aec-4e5b-8552-7f0b3ff5946f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:14.195 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1037481d-251e-4d9c-bcd6-4b4c2bfcff7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:14.243 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e163 do_prune osdmap full prune enabled
Nov 28 10:06:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:14 np0005538513.localdomain dnsmasq[323873]: exiting on receipt of SIGTERM
Nov 28 10:06:14 np0005538513.localdomain podman[324131]: 2025-11-28 10:06:14.280125237 +0000 UTC m=+0.113843943 container kill 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: libpod-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354.scope: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e164 e164: 6 total, 6 up, 6 in
Nov 28 10:06:14 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e164: 6 total, 6 up, 6 in
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-fbbc0b5dc4455006807612af40ea366df63f7068109c1199eb3107e33ae30da1-merged.mount: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain podman[324153]: 2025-11-28 10:06:14.373484154 +0000 UTC m=+0.064872594 container died 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-05b7276827f072b306555cf6cbc24a30c5bb95b18f1dc24627a8bd098febac87-merged.mount: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain podman[324153]: 2025-11-28 10:06:14.413768233 +0000 UTC m=+0.105156643 container remove 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:06:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:14.432 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:14 np0005538513.localdomain kernel: device tap9f28414a-bc left promiscuous mode
Nov 28 10:06:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:14.447 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: libpod-conmon-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354.scope: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2da31c6261\x2d6aec\x2d4e5b\x2d8552\x2d7f0b3ff5946f.mount: Deactivated successfully.
Nov 28 10:06:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.493 261084 INFO neutron.agent.dhcp.agent [None req-8fb76a28-58bf-4f2e-8c38-5de0ca321bfa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.494 261084 INFO neutron.agent.dhcp.agent [None req-8fb76a28-58bf-4f2e-8c38-5de0ca321bfa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:14 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.835 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:15 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:15.011 2 INFO neutron.agent.securitygroups_rpc [None req-e90d3b0d-d250-425f-9007-eccb585011b0 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e164 do_prune osdmap full prune enabled
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e165 e165: 6 total, 6 up, 6 in
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e165: 6 total, 6 up, 6 in
Nov 28 10:06:15 np0005538513.localdomain podman[324216]: 2025-11-28 10:06:15.161263424 +0000 UTC m=+0.107355151 container create c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:06:15 np0005538513.localdomain systemd[1]: Started libpod-conmon-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952.scope.
Nov 28 10:06:15 np0005538513.localdomain podman[324216]: 2025-11-28 10:06:15.106138474 +0000 UTC m=+0.052230241 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:15 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:15 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/518cfa22c4e318530a6e5aefab2c4e20bcb90abb837fcbd18b8f75b2b31294f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:15 np0005538513.localdomain podman[324216]: 2025-11-28 10:06:15.230509553 +0000 UTC m=+0.176601290 container init c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:06:15 np0005538513.localdomain podman[324216]: 2025-11-28 10:06:15.240614916 +0000 UTC m=+0.186706633 container start c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:06:15 np0005538513.localdomain dnsmasq[324234]: started, version 2.85 cachesize 150
Nov 28 10:06:15 np0005538513.localdomain dnsmasq[324234]: DNS service limited to local subnets
Nov 28 10:06:15 np0005538513.localdomain dnsmasq[324234]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:15 np0005538513.localdomain dnsmasq[324234]: warning: no upstream servers configured
Nov 28 10:06:15 np0005538513.localdomain dnsmasq-dhcp[324234]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:15 np0005538513.localdomain dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses
Nov 28 10:06:15 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 10:06:15 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: pgmap v318: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 93 KiB/s rd, 4.0 KiB/s wr, 124 op/s
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: osdmap e164: 6 total, 6 up, 6 in
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3736797209' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3736797209' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:15 np0005538513.localdomain ceph-mon[292954]: osdmap e165: 6 total, 6 up, 6 in
Nov 28 10:06:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:15.558 261084 INFO neutron.agent.dhcp.agent [None req-d655e655-3f16-4703-a66f-e36878eb1591 - - - - - -] DHCP configuration for ports {'1bff36cc-f508-4066-a5d7-c55bc5baf4a9', '2f0cd637-5351-4b69-86b0-00f4d51eccc9'} is completed
Nov 28 10:06:15 np0005538513.localdomain dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses
Nov 28 10:06:15 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 10:06:15 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 10:06:15 np0005538513.localdomain podman[324252]: 2025-11-28 10:06:15.741853556 +0000 UTC m=+0.064315496 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:06:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:15.802 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:15 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:15.899 261084 INFO neutron.agent.dhcp.agent [None req-a54afd89-63c5-4b53-bc59-976538151a8c - - - - - -] Synchronizing state
Nov 28 10:06:15 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:15Z|00392|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:15.946 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:15.985 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:16 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:16.022 2 INFO neutron.agent.securitygroups_rpc [None req-3771b2dc-83c7-4322-8cb3-68ef5f3840bb e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.158 261084 INFO neutron.agent.dhcp.agent [None req-6a72d357-0d95-4adc-8d0a-cc258cb0edd9 - - - - - -] DHCP configuration for ports {'1bff36cc-f508-4066-a5d7-c55bc5baf4a9', '2f0cd637-5351-4b69-86b0-00f4d51eccc9'} is completed
Nov 28 10:06:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.188 261084 INFO neutron.agent.dhcp.agent [None req-48712969-2fd3-4809-8736-106860b7dd0e - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:06:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.189 261084 INFO neutron.agent.dhcp.agent [-] Starting network 66c5dde3-dd95-4799-b7fb-daebf3806263 dhcp configuration
Nov 28 10:06:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.189 261084 INFO neutron.agent.dhcp.agent [-] Finished network 66c5dde3-dd95-4799-b7fb-daebf3806263 dhcp configuration
Nov 28 10:06:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.190 261084 INFO neutron.agent.dhcp.agent [None req-48712969-2fd3-4809-8736-106860b7dd0e - - - - - -] Synchronizing state complete
Nov 28 10:06:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.191 261084 INFO neutron.agent.dhcp.agent [None req-d2bc7583-b8ae-435a-8225-f6a351893a7d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.191 261084 INFO neutron.agent.dhcp.agent [None req-d2bc7583-b8ae-435a-8225-f6a351893a7d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:17 np0005538513.localdomain ceph-mon[292954]: pgmap v321: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 5.2 KiB/s wr, 160 op/s
Nov 28 10:06:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:06:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3947693664' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:06:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:06:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:06:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e165 do_prune osdmap full prune enabled
Nov 28 10:06:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e166 e166: 6 total, 6 up, 6 in
Nov 28 10:06:18 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e166: 6 total, 6 up, 6 in
Nov 28 10:06:18 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3947693664' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:06:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:18.857 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538513.localdomain podman[324274]: 2025-11-28 10:06:18.901893956 +0000 UTC m=+0.125643700 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64)
Nov 28 10:06:18 np0005538513.localdomain podman[324274]: 2025-11-28 10:06:18.91881249 +0000 UTC m=+0.142562224 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc.)
Nov 28 10:06:18 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:06:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e166 do_prune osdmap full prune enabled
Nov 28 10:06:19 np0005538513.localdomain ceph-mon[292954]: pgmap v322: 177 pgs: 177 active+clean; 145 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 2.5 KiB/s wr, 65 op/s
Nov 28 10:06:19 np0005538513.localdomain ceph-mon[292954]: osdmap e166: 6 total, 6 up, 6 in
Nov 28 10:06:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e167 e167: 6 total, 6 up, 6 in
Nov 28 10:06:19 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e167: 6 total, 6 up, 6 in
Nov 28 10:06:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e167 do_prune osdmap full prune enabled
Nov 28 10:06:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e168 e168: 6 total, 6 up, 6 in
Nov 28 10:06:20 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e168: 6 total, 6 up, 6 in
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.181 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain ceph-mon[292954]: osdmap e167: 6 total, 6 up, 6 in
Nov 28 10:06:20 np0005538513.localdomain ceph-mon[292954]: osdmap e168: 6 total, 6 up, 6 in
Nov 28 10:06:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:20.471 261084 INFO neutron.agent.linux.ip_lib [None req-7734d0e4-46d2-4c3f-a26c-3a43df1cc1ff - - - - - -] Device tapdc62470e-a4 cannot be used as it has no MAC address
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain kernel: device tapdc62470e-a4 entered promiscuous mode
Nov 28 10:06:20 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324380.5055] manager: (tapdc62470e-a4): new Generic device (/org/freedesktop/NetworkManager/Devices/65)
Nov 28 10:06:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:20Z|00393|binding|INFO|Claiming lport dc62470e-a40f-4105-848f-1f2b879c4aae for this chassis.
Nov 28 10:06:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:20Z|00394|binding|INFO|dc62470e-a40f-4105-848f-1f2b879c4aae: Claiming unknown
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.508 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain systemd-udevd[324302]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.535 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:20Z|00395|binding|INFO|Setting lport dc62470e-a40f-4105-848f-1f2b879c4aae ovn-installed in OVS
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.541 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.543 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:20Z|00396|binding|INFO|Setting lport dc62470e-a40f-4105-848f-1f2b879c4aae up in Southbound
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:20.553 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64753474-6cc2-4012-bab6-4f0449c46fab, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=1, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=dc62470e-a40f-4105-848f-1f2b879c4aae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:20.555 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dc62470e-a40f-4105-848f-1f2b879c4aae in datapath 10c0858a-69b4-4de1-aea8-8d780005bf13 bound to our chassis
Nov 28 10:06:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:20.557 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 10c0858a-69b4-4de1-aea8-8d780005bf13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:20.558 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0972ce58-f1f3-4460-8c99-6513129d7229]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.591 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:20.620 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:20Z, description=, device_id=5b45f823-0eca-4648-936e-96781a85013b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64525e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64529a0>], id=d39baeac-faf7-4050-b1c2-2f8c9573c064, ip_allocation=immediate, mac_address=fa:16:3e:45:9b:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:08Z, description=, dns_domain=, id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-346806519-network, port_security_enabled=True, project_id=29d0d5b3ba0745d58aee3845ea704b73, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16781, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2238, status=ACTIVE, subnets=['1d51cccc-0a3c-4da7-88f2-d129e18efd59'], tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:09Z, vlan_transparent=None, network_id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, port_security_enabled=False, project_id=29d0d5b3ba0745d58aee3845ea704b73, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2306, status=DOWN, tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:20Z on network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.806 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:20 np0005538513.localdomain podman[324350]: 2025-11-28 10:06:20.852464102 +0000 UTC m=+0.059763675 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:20 np0005538513.localdomain dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 1 addresses
Nov 28 10:06:20 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 10:06:20 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 10:06:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:20.988 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e168 do_prune osdmap full prune enabled
Nov 28 10:06:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:21.153 261084 INFO neutron.agent.dhcp.agent [None req-2764cf61-d9b3-474b-8e9f-6056c6b8e402 - - - - - -] DHCP configuration for ports {'d39baeac-faf7-4050-b1c2-2f8c9573c064'} is completed
Nov 28 10:06:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e169 e169: 6 total, 6 up, 6 in
Nov 28 10:06:21 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e169: 6 total, 6 up, 6 in
Nov 28 10:06:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:21.264 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port ff390443-5640-48d9-91fb-f1efebde638f with type ""
Nov 28 10:06:21 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:21Z|00397|binding|INFO|Removing iface tapdc62470e-a4 ovn-installed in OVS
Nov 28 10:06:21 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:21Z|00398|binding|INFO|Removing lport dc62470e-a40f-4105-848f-1f2b879c4aae ovn-installed in OVS
Nov 28 10:06:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:21.266 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64753474-6cc2-4012-bab6-4f0449c46fab, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=1, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=dc62470e-a40f-4105-848f-1f2b879c4aae) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:21.267 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:21.270 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dc62470e-a40f-4105-848f-1f2b879c4aae in datapath 10c0858a-69b4-4de1-aea8-8d780005bf13 unbound from our chassis
Nov 28 10:06:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:21.270 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:21.271 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 10c0858a-69b4-4de1-aea8-8d780005bf13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:21.273 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7176f691-d30f-48e6-b509-1b7fccc23695]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:21 np0005538513.localdomain ceph-mon[292954]: pgmap v325: 177 pgs: 177 active+clean; 145 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 2.7 KiB/s wr, 72 op/s
Nov 28 10:06:21 np0005538513.localdomain ceph-mon[292954]: osdmap e169: 6 total, 6 up, 6 in
Nov 28 10:06:21 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:21Z|00399|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:21.543 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538513.localdomain podman[324409]: 
Nov 28 10:06:21 np0005538513.localdomain podman[324409]: 2025-11-28 10:06:21.633856954 +0000 UTC m=+0.098100775 container create 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:06:21 np0005538513.localdomain systemd[1]: Started libpod-conmon-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839.scope.
Nov 28 10:06:21 np0005538513.localdomain podman[324409]: 2025-11-28 10:06:21.585794283 +0000 UTC m=+0.050038124 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:21 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:21 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcb6162e51a81069b70e3cfbeb695cb5cd524dd6654c7ed4ff41275202a34476/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:21 np0005538513.localdomain podman[324409]: 2025-11-28 10:06:21.725152627 +0000 UTC m=+0.189396408 container init 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:21 np0005538513.localdomain podman[324409]: 2025-11-28 10:06:21.741154454 +0000 UTC m=+0.205398255 container start 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:06:21 np0005538513.localdomain dnsmasq[324427]: started, version 2.85 cachesize 150
Nov 28 10:06:21 np0005538513.localdomain dnsmasq[324427]: DNS service limited to local subnets
Nov 28 10:06:21 np0005538513.localdomain dnsmasq[324427]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:21 np0005538513.localdomain dnsmasq[324427]: warning: no upstream servers configured
Nov 28 10:06:21 np0005538513.localdomain dnsmasq-dhcp[324427]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:21 np0005538513.localdomain dnsmasq[324427]: read /var/lib/neutron/dhcp/10c0858a-69b4-4de1-aea8-8d780005bf13/addn_hosts - 0 addresses
Nov 28 10:06:21 np0005538513.localdomain dnsmasq-dhcp[324427]: read /var/lib/neutron/dhcp/10c0858a-69b4-4de1-aea8-8d780005bf13/host
Nov 28 10:06:21 np0005538513.localdomain dnsmasq-dhcp[324427]: read /var/lib/neutron/dhcp/10c0858a-69b4-4de1-aea8-8d780005bf13/opts
Nov 28 10:06:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:21.795 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:20Z, description=, device_id=5b45f823-0eca-4648-936e-96781a85013b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63e52b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63e59a0>], id=d39baeac-faf7-4050-b1c2-2f8c9573c064, ip_allocation=immediate, mac_address=fa:16:3e:45:9b:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:08Z, description=, dns_domain=, id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-346806519-network, port_security_enabled=True, project_id=29d0d5b3ba0745d58aee3845ea704b73, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16781, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2238, status=ACTIVE, subnets=['1d51cccc-0a3c-4da7-88f2-d129e18efd59'], tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:09Z, vlan_transparent=None, network_id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, port_security_enabled=False, project_id=29d0d5b3ba0745d58aee3845ea704b73, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2306, status=DOWN, tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:20Z on network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551
Nov 28 10:06:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:21.874 261084 INFO neutron.agent.dhcp.agent [None req-a1e6e3d9-c842-444a-be15-1415125307fa - - - - - -] DHCP configuration for ports {'d8b529de-dee0-4172-a457-4cf0aa46cb55'} is completed
Nov 28 10:06:21 np0005538513.localdomain dnsmasq[324427]: exiting on receipt of SIGTERM
Nov 28 10:06:21 np0005538513.localdomain podman[324448]: 2025-11-28 10:06:21.998123905 +0000 UTC m=+0.070396245 container kill 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:22 np0005538513.localdomain systemd[1]: libpod-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839.scope: Deactivated successfully.
Nov 28 10:06:22 np0005538513.localdomain podman[324473]: 2025-11-28 10:06:22.070134 +0000 UTC m=+0.056886466 container died 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:06:22 np0005538513.localdomain podman[324473]: 2025-11-28 10:06:22.108333565 +0000 UTC m=+0.095085991 container cleanup 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:06:22 np0005538513.localdomain systemd[1]: libpod-conmon-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839.scope: Deactivated successfully.
Nov 28 10:06:22 np0005538513.localdomain podman[324475]: 2025-11-28 10:06:22.164206929 +0000 UTC m=+0.141010947 container remove 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:06:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:22.217 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:22 np0005538513.localdomain dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 1 addresses
Nov 28 10:06:22 np0005538513.localdomain kernel: device tapdc62470e-a4 left promiscuous mode
Nov 28 10:06:22 np0005538513.localdomain podman[324498]: 2025-11-28 10:06:22.218868964 +0000 UTC m=+0.148790317 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:22 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 10:06:22 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 10:06:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:22.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:22.267 261084 INFO neutron.agent.dhcp.agent [None req-815c770b-1db7-4e8b-922f-aafbd2b45173 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:22.269 261084 INFO neutron.agent.dhcp.agent [None req-815c770b-1db7-4e8b-922f-aafbd2b45173 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:22.548 261084 INFO neutron.agent.dhcp.agent [None req-7cb2b08c-bbd3-4595-a4a3-bf79857633c2 - - - - - -] DHCP configuration for ports {'d39baeac-faf7-4050-b1c2-2f8c9573c064'} is completed
Nov 28 10:06:22 np0005538513.localdomain sudo[324526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:06:22 np0005538513.localdomain systemd[1]: tmp-crun.4dOuKN.mount: Deactivated successfully.
Nov 28 10:06:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-bcb6162e51a81069b70e3cfbeb695cb5cd524dd6654c7ed4ff41275202a34476-merged.mount: Deactivated successfully.
Nov 28 10:06:22 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:22 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d10c0858a\x2d69b4\x2d4de1\x2daea8\x2d8d780005bf13.mount: Deactivated successfully.
Nov 28 10:06:22 np0005538513.localdomain sudo[324526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:06:22 np0005538513.localdomain sudo[324526]: pam_unix(sudo:session): session closed for user root
Nov 28 10:06:22 np0005538513.localdomain sudo[324544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:06:22 np0005538513.localdomain sudo[324544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:06:23 np0005538513.localdomain sudo[324544]: pam_unix(sudo:session): session closed for user root
Nov 28 10:06:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e169 do_prune osdmap full prune enabled
Nov 28 10:06:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e170 e170: 6 total, 6 up, 6 in
Nov 28 10:06:23 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e170: 6 total, 6 up, 6 in
Nov 28 10:06:23 np0005538513.localdomain ceph-mon[292954]: pgmap v328: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 4.7 KiB/s wr, 100 op/s
Nov 28 10:06:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:06:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:06:23 np0005538513.localdomain sudo[324594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:06:23 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:23.633 2 INFO neutron.agent.securitygroups_rpc [None req-5b881bab-382f-42c7-b1b6-bde09ef38c32 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:23 np0005538513.localdomain sudo[324594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:06:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:06:23 np0005538513.localdomain sudo[324594]: pam_unix(sudo:session): session closed for user root
Nov 28 10:06:23 np0005538513.localdomain podman[324612]: 2025-11-28 10:06:23.76329246 +0000 UTC m=+0.099612142 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:06:23 np0005538513.localdomain podman[324612]: 2025-11-28 10:06:23.806548362 +0000 UTC m=+0.142868074 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:06:23 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:24.468 261084 INFO neutron.agent.linux.ip_lib [None req-52d0945c-7c9f-4996-ad7b-72d0dbba49ac - - - - - -] Device tapac54af53-f9 cannot be used as it has no MAC address
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: osdmap e170: 6 total, 6 up, 6 in
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:24.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:24 np0005538513.localdomain kernel: device tapac54af53-f9 entered promiscuous mode
Nov 28 10:06:24 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324384.5090] manager: (tapac54af53-f9): new Generic device (/org/freedesktop/NetworkManager/Devices/66)
Nov 28 10:06:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:24.509 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:24 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:24.512 2 INFO neutron.agent.securitygroups_rpc [None req-043214bd-8f49-4013-b76d-a6a4f382dbad e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:24 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:24Z|00400|binding|INFO|Claiming lport ac54af53-f927-47a3-a012-007eb09610ba for this chassis.
Nov 28 10:06:24 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:24Z|00401|binding|INFO|ac54af53-f927-47a3-a012-007eb09610ba: Claiming unknown
Nov 28 10:06:24 np0005538513.localdomain systemd-udevd[324647]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e170 do_prune osdmap full prune enabled
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e171 e171: 6 total, 6 up, 6 in
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:24.537 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d67334e-0468-412d-8122-94888f52f93e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=ac54af53-f927-47a3-a012-007eb09610ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:24.539 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ac54af53-f927-47a3-a012-007eb09610ba in datapath c2ece010-6b32-45eb-a0e7-54a94c6d37c8 bound to our chassis
Nov 28 10:06:24 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e171: 6 total, 6 up, 6 in
Nov 28 10:06:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:24.541 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c2ece010-6b32-45eb-a0e7-54a94c6d37c8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:24 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:24.542 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1f272300-3832-4aa8-b8c9-daa2457dd068]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:24Z|00402|binding|INFO|Setting lport ac54af53-f927-47a3-a012-007eb09610ba ovn-installed in OVS
Nov 28 10:06:24 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:24Z|00403|binding|INFO|Setting lport ac54af53-f927-47a3-a012-007eb09610ba up in Southbound
Nov 28 10:06:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:24.550 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapac54af53-f9: No such device
Nov 28 10:06:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:24.593 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:24.623 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e171 do_prune osdmap full prune enabled
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e172 e172: 6 total, 6 up, 6 in
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e172: 6 total, 6 up, 6 in
Nov 28 10:06:25 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:25.416 2 INFO neutron.agent.securitygroups_rpc [None req-dd385026-e816-420f-a351-7652f9735a8b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: pgmap v330: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 67 KiB/s rd, 4.3 KiB/s wr, 91 op/s
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: osdmap e171: 6 total, 6 up, 6 in
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: osdmap e172: 6 total, 6 up, 6 in
Nov 28 10:06:25 np0005538513.localdomain podman[324718]: 
Nov 28 10:06:25 np0005538513.localdomain podman[324718]: 2025-11-28 10:06:25.630264802 +0000 UTC m=+0.103133171 container create c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:06:25 np0005538513.localdomain podman[324718]: 2025-11-28 10:06:25.583816021 +0000 UTC m=+0.056684450 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:25 np0005538513.localdomain systemd[1]: Started libpod-conmon-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190.scope.
Nov 28 10:06:25 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:25 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb8c60485550cb70649c48654a78f2ae0852e1cfaa2dfda9defc695ca31eee49/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:25 np0005538513.localdomain podman[324718]: 2025-11-28 10:06:25.723294158 +0000 UTC m=+0.196162537 container init c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:06:25 np0005538513.localdomain podman[324718]: 2025-11-28 10:06:25.732910927 +0000 UTC m=+0.205779296 container start c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:06:25 np0005538513.localdomain dnsmasq[324736]: started, version 2.85 cachesize 150
Nov 28 10:06:25 np0005538513.localdomain dnsmasq[324736]: DNS service limited to local subnets
Nov 28 10:06:25 np0005538513.localdomain dnsmasq[324736]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:25 np0005538513.localdomain dnsmasq[324736]: warning: no upstream servers configured
Nov 28 10:06:25 np0005538513.localdomain dnsmasq-dhcp[324736]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:25 np0005538513.localdomain dnsmasq[324736]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/addn_hosts - 0 addresses
Nov 28 10:06:25 np0005538513.localdomain dnsmasq-dhcp[324736]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/host
Nov 28 10:06:25 np0005538513.localdomain dnsmasq-dhcp[324736]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/opts
Nov 28 10:06:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:25.808 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:06:25 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:06:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:25.958 261084 INFO neutron.agent.dhcp.agent [None req-3b9216c1-d5b5-46ec-8219-0fbc48816222 - - - - - -] DHCP configuration for ports {'9de81172-b84e-48ed-a6ac-538e374abb7b'} is completed
Nov 28 10:06:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:25.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:26 np0005538513.localdomain dnsmasq[324736]: exiting on receipt of SIGTERM
Nov 28 10:06:26 np0005538513.localdomain podman[324754]: 2025-11-28 10:06:26.135337282 +0000 UTC m=+0.066528175 container kill c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:06:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:06:26 np0005538513.localdomain systemd[1]: libpod-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190.scope: Deactivated successfully.
Nov 28 10:06:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e172 do_prune osdmap full prune enabled
Nov 28 10:06:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e173 e173: 6 total, 6 up, 6 in
Nov 28 10:06:26 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e173: 6 total, 6 up, 6 in
Nov 28 10:06:26 np0005538513.localdomain podman[324766]: 2025-11-28 10:06:26.21618232 +0000 UTC m=+0.060884500 container died c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:06:26 np0005538513.localdomain podman[324766]: 2025-11-28 10:06:26.24548491 +0000 UTC m=+0.090187050 container cleanup c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:06:26 np0005538513.localdomain systemd[1]: libpod-conmon-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190.scope: Deactivated successfully.
Nov 28 10:06:26 np0005538513.localdomain podman[324769]: 2025-11-28 10:06:26.292112326 +0000 UTC m=+0.126022671 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:26 np0005538513.localdomain podman[324768]: 2025-11-28 10:06:26.342057005 +0000 UTC m=+0.181905664 container remove c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:26 np0005538513.localdomain podman[324769]: 2025-11-28 10:06:26.370868339 +0000 UTC m=+0.204778664 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:06:26 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:06:26 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:26.470 2 INFO neutron.agent.securitygroups_rpc [None req-1c752cf9-f4dc-4e37-b8ac-f785450dc01f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-eb8c60485550cb70649c48654a78f2ae0852e1cfaa2dfda9defc695ca31eee49-merged.mount: Deactivated successfully.
Nov 28 10:06:26 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:26 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:26.651 2 INFO neutron.agent.securitygroups_rpc [None req-0d9bcbd8-d3ad-4d76-88b4-dcdc400ccf8b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:26 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:26.734 2 INFO neutron.agent.securitygroups_rpc [None req-2fa3b339-652f-411d-b715-041869496ad1 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:26 np0005538513.localdomain ceph-mon[292954]: pgmap v333: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 4.1 KiB/s wr, 86 op/s
Nov 28 10:06:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:06:26 np0005538513.localdomain ceph-mon[292954]: osdmap e173: 6 total, 6 up, 6 in
Nov 28 10:06:27 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:27.055 2 INFO neutron.agent.securitygroups_rpc [None req-c1a7ab1c-728b-412f-8d01-654c720e39af 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:27 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:27.252 2 INFO neutron.agent.securitygroups_rpc [None req-35d7c929-00df-4969-aae8-f80246e7ea1b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:27 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:27.567 2 INFO neutron.agent.securitygroups_rpc [None req-c6801d78-6b56-496a-b37f-67c7bc7e4554 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:06:27 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:27.830 2 INFO neutron.agent.securitygroups_rpc [None req-70e4eaf3-c859-433d-b609-e53f73e65383 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:27 np0005538513.localdomain podman[324835]: 2025-11-28 10:06:27.862411937 +0000 UTC m=+0.091023358 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 10:06:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e173 do_prune osdmap full prune enabled
Nov 28 10:06:27 np0005538513.localdomain podman[324835]: 2025-11-28 10:06:27.920665103 +0000 UTC m=+0.149276584 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e174 e174: 6 total, 6 up, 6 in
Nov 28 10:06:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:06:27 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e174: 6 total, 6 up, 6 in
Nov 28 10:06:27 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:06:28 np0005538513.localdomain podman[324863]: 2025-11-28 10:06:28.043051433 +0000 UTC m=+0.093403209 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 28 10:06:28 np0005538513.localdomain podman[324863]: 2025-11-28 10:06:28.059439008 +0000 UTC m=+0.109790784 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:28 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:06:28 np0005538513.localdomain podman[324903]: 
Nov 28 10:06:28 np0005538513.localdomain podman[324903]: 2025-11-28 10:06:28.294448049 +0000 UTC m=+0.101965252 container create cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:06:28 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:28.318 2 INFO neutron.agent.securitygroups_rpc [None req-8150f43d-22f5-4d4c-88f9-cca24e00dc84 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:28 np0005538513.localdomain systemd[1]: Started libpod-conmon-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42.scope.
Nov 28 10:06:28 np0005538513.localdomain podman[324903]: 2025-11-28 10:06:28.251205397 +0000 UTC m=+0.058722590 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:28 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:28 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee9e293b8742e167b01ef0934def034c313f2eb915f944093d63d89ce9a34521/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:28 np0005538513.localdomain podman[324903]: 2025-11-28 10:06:28.36876627 +0000 UTC m=+0.176283483 container init cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:06:28 np0005538513.localdomain podman[324903]: 2025-11-28 10:06:28.377981813 +0000 UTC m=+0.185499026 container start cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:06:28 np0005538513.localdomain dnsmasq[324922]: started, version 2.85 cachesize 150
Nov 28 10:06:28 np0005538513.localdomain dnsmasq[324922]: DNS service limited to local subnets
Nov 28 10:06:28 np0005538513.localdomain dnsmasq[324922]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:28 np0005538513.localdomain dnsmasq[324922]: warning: no upstream servers configured
Nov 28 10:06:28 np0005538513.localdomain dnsmasq-dhcp[324922]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:28 np0005538513.localdomain dnsmasq-dhcp[324922]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:28 np0005538513.localdomain dnsmasq[324922]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/addn_hosts - 0 addresses
Nov 28 10:06:28 np0005538513.localdomain dnsmasq-dhcp[324922]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/host
Nov 28 10:06:28 np0005538513.localdomain dnsmasq-dhcp[324922]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/opts
Nov 28 10:06:28 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:28.614 261084 INFO neutron.agent.dhcp.agent [None req-ec7a9a4e-be2e-4d97-a715-30cb9146efd8 - - - - - -] DHCP configuration for ports {'ac54af53-f927-47a3-a012-007eb09610ba', '9de81172-b84e-48ed-a6ac-538e374abb7b'} is completed
Nov 28 10:06:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e174 do_prune osdmap full prune enabled
Nov 28 10:06:28 np0005538513.localdomain ceph-mon[292954]: pgmap v335: 177 pgs: 177 active+clean; 175 MiB data, 819 MiB used, 41 GiB / 42 GiB avail; 4.9 MiB/s rd, 2.0 MiB/s wr, 203 op/s
Nov 28 10:06:28 np0005538513.localdomain ceph-mon[292954]: osdmap e174: 6 total, 6 up, 6 in
Nov 28 10:06:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e175 e175: 6 total, 6 up, 6 in
Nov 28 10:06:29 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e175: 6 total, 6 up, 6 in
Nov 28 10:06:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e175 do_prune osdmap full prune enabled
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: osdmap e175: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2154361656' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e176 e176: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e176: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e176 do_prune osdmap full prune enabled
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e177 e177: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e177: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:30.313 2 INFO neutron.agent.securitygroups_rpc [None req-61635fc4-7cf2-4d88-91e1-3ec9d744288e 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:30.821 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:30.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:31 np0005538513.localdomain ceph-mon[292954]: pgmap v338: 177 pgs: 177 active+clean; 175 MiB data, 819 MiB used, 41 GiB / 42 GiB avail; 4.5 MiB/s rd, 1.9 MiB/s wr, 189 op/s
Nov 28 10:06:31 np0005538513.localdomain ceph-mon[292954]: osdmap e176: 6 total, 6 up, 6 in
Nov 28 10:06:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/885376607' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/885376607' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:31 np0005538513.localdomain ceph-mon[292954]: osdmap e177: 6 total, 6 up, 6 in
Nov 28 10:06:31 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:31.237 261084 INFO neutron.agent.linux.ip_lib [None req-39937e8a-9887-42fc-9240-e17ad0de1399 - - - - - -] Device tapa7bebe57-e6 cannot be used as it has no MAC address
Nov 28 10:06:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:31.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:31 np0005538513.localdomain kernel: device tapa7bebe57-e6 entered promiscuous mode
Nov 28 10:06:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:31.271 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:31 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324391.2723] manager: (tapa7bebe57-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/67)
Nov 28 10:06:31 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:31Z|00404|binding|INFO|Claiming lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 for this chassis.
Nov 28 10:06:31 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:31Z|00405|binding|INFO|a7bebe57-e6af-47fd-a208-2b421ce68fb2: Claiming unknown
Nov 28 10:06:31 np0005538513.localdomain systemd-udevd[324933]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:31.287 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8462a4a9a313405e8fd212f9ec4a0c92', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=337e4655-c093-49ec-8f8f-37f4f2ac3a09, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=a7bebe57-e6af-47fd-a208-2b421ce68fb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:31.289 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a7bebe57-e6af-47fd-a208-2b421ce68fb2 in datapath ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec bound to our chassis
Nov 28 10:06:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:31.291 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:31.295 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2e029e96-50cd-4888-8ee5-d69628bfe77c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:31.306 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:31Z|00406|binding|INFO|Setting lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 ovn-installed in OVS
Nov 28 10:06:31 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:31Z|00407|binding|INFO|Setting lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 up in Southbound
Nov 28 10:06:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:31.313 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:31.315 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device
Nov 28 10:06:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:31.356 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:31.388 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e177 do_prune osdmap full prune enabled
Nov 28 10:06:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e178 e178: 6 total, 6 up, 6 in
Nov 28 10:06:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e178: 6 total, 6 up, 6 in
Nov 28 10:06:32 np0005538513.localdomain podman[325004]: 
Nov 28 10:06:32 np0005538513.localdomain podman[325004]: 2025-11-28 10:06:32.234163512 +0000 UTC m=+0.093468221 container create 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:32 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:32.265 2 INFO neutron.agent.securitygroups_rpc [None req-b832b7b5-4ef2-4bcf-bb1b-eacd2f3a21fc e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:32 np0005538513.localdomain systemd[1]: Started libpod-conmon-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6.scope.
Nov 28 10:06:32 np0005538513.localdomain podman[325004]: 2025-11-28 10:06:32.189699693 +0000 UTC m=+0.049004432 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:32 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:32 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6442dac4c49b82e1b453723658ecb4ea232eb6ab4ad6772df820c1942a4ca81f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:32 np0005538513.localdomain podman[325004]: 2025-11-28 10:06:32.324089933 +0000 UTC m=+0.183394652 container init 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:06:32 np0005538513.localdomain podman[325004]: 2025-11-28 10:06:32.335806424 +0000 UTC m=+0.195111133 container start 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:06:32 np0005538513.localdomain dnsmasq[325023]: started, version 2.85 cachesize 150
Nov 28 10:06:32 np0005538513.localdomain dnsmasq[325023]: DNS service limited to local subnets
Nov 28 10:06:32 np0005538513.localdomain dnsmasq[325023]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:32 np0005538513.localdomain dnsmasq[325023]: warning: no upstream servers configured
Nov 28 10:06:32 np0005538513.localdomain dnsmasq-dhcp[325023]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:32 np0005538513.localdomain dnsmasq[325023]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 0 addresses
Nov 28 10:06:32 np0005538513.localdomain dnsmasq-dhcp[325023]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host
Nov 28 10:06:32 np0005538513.localdomain dnsmasq-dhcp[325023]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts
Nov 28 10:06:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:32.563 261084 INFO neutron.agent.dhcp.agent [None req-51490f9d-9eb5-4300-b409-e06914b0e646 - - - - - -] DHCP configuration for ports {'bd93ee1b-7d5f-4009-bc66-cf9872b0e906'} is completed
Nov 28 10:06:32 np0005538513.localdomain dnsmasq[325023]: exiting on receipt of SIGTERM
Nov 28 10:06:32 np0005538513.localdomain podman[325039]: 2025-11-28 10:06:32.752890376 +0000 UTC m=+0.049470956 container kill 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:32 np0005538513.localdomain systemd[1]: libpod-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6.scope: Deactivated successfully.
Nov 28 10:06:32 np0005538513.localdomain podman[325051]: 2025-11-28 10:06:32.831932081 +0000 UTC m=+0.063386644 container died 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:32 np0005538513.localdomain podman[325051]: 2025-11-28 10:06:32.866888638 +0000 UTC m=+0.098343171 container cleanup 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:06:32 np0005538513.localdomain systemd[1]: libpod-conmon-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6.scope: Deactivated successfully.
Nov 28 10:06:32 np0005538513.localdomain podman[325053]: 2025-11-28 10:06:32.938274278 +0000 UTC m=+0.161749235 container remove 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:33 np0005538513.localdomain ceph-mon[292954]: pgmap v341: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 32 MiB/s wr, 256 op/s
Nov 28 10:06:33 np0005538513.localdomain ceph-mon[292954]: osdmap e178: 6 total, 6 up, 6 in
Nov 28 10:06:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e178 do_prune osdmap full prune enabled
Nov 28 10:06:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e179 e179: 6 total, 6 up, 6 in
Nov 28 10:06:33 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e179: 6 total, 6 up, 6 in
Nov 28 10:06:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-6442dac4c49b82e1b453723658ecb4ea232eb6ab4ad6772df820c1942a4ca81f-merged.mount: Deactivated successfully.
Nov 28 10:06:33 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:33 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:33.310 2 INFO neutron.agent.securitygroups_rpc [None req-995edc3e-e8fd-43bb-892b-18b0775677c3 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:33 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:33.825 2 INFO neutron.agent.securitygroups_rpc [None req-ca5aaf48-8c6f-4efd-a0a9-4566338fc9f9 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:34 np0005538513.localdomain ceph-mon[292954]: osdmap e179: 6 total, 6 up, 6 in
Nov 28 10:06:34 np0005538513.localdomain podman[325131]: 
Nov 28 10:06:34 np0005538513.localdomain podman[325131]: 2025-11-28 10:06:34.418439805 +0000 UTC m=+0.066827751 container create f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:34 np0005538513.localdomain systemd[1]: Started libpod-conmon-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924.scope.
Nov 28 10:06:34 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:34 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49b9eae002394005d1393e4090115e2e8233d8d4759be7627d4b8be05351a27e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:34 np0005538513.localdomain podman[325131]: 2025-11-28 10:06:34.384390556 +0000 UTC m=+0.032778522 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:34 np0005538513.localdomain podman[325131]: 2025-11-28 10:06:34.530069244 +0000 UTC m=+0.178457200 container init f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:06:34 np0005538513.localdomain podman[325131]: 2025-11-28 10:06:34.544789018 +0000 UTC m=+0.193176964 container start f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:06:34 np0005538513.localdomain dnsmasq[325150]: started, version 2.85 cachesize 150
Nov 28 10:06:34 np0005538513.localdomain dnsmasq[325150]: DNS service limited to local subnets
Nov 28 10:06:34 np0005538513.localdomain dnsmasq[325150]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:34 np0005538513.localdomain dnsmasq[325150]: warning: no upstream servers configured
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:34 np0005538513.localdomain dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 2 addresses
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts
Nov 28 10:06:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:34.581 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:db:09 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b879ef3c-9a06-48a8-9e87-0eac0ec86fcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6db6a620-dcc3-4cb5-ab27-f70881c20730) old=Port_Binding(mac=['fa:16:3e:a3:db:09 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:34.583 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6db6a620-dcc3-4cb5-ab27-f70881c20730 in datapath d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3 updated
Nov 28 10:06:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:34.587 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:34.588 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e1dd2b9b-aff7-46e7-882b-9491670e6f0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:34 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:34.611 2 INFO neutron.agent.securitygroups_rpc [None req-c85e903a-7c43-4db3-84d5-12d5f3b5c956 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.611 261084 INFO neutron.agent.dhcp.agent [None req-98b7c1d3-4579-411b-808c-fa917aa96b17 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:31Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd66592b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6659700>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6659250>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6659af0>], id=7f8efc94-f33c-45bd-8ba3-5aae61ce07bb, ip_allocation=immediate, mac_address=fa:16:3e:c3:7f:0b, name=tempest-PortsIpV6TestJSON-733345616, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:28Z, description=, dns_domain=, id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-584530130, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56666, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=2340, status=ACTIVE, subnets=['364769b3-b014-43ab-8260-f65a82aa0e26', 'aea3f6fd-3ed0-459b-8972-d3dc4b7dc981'], tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:31Z, vlan_transparent=None, network_id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['78343c03-098f-4faf-a880-2814fe3611d6'], standard_attr_id=2352, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:32Z on network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec
Nov 28 10:06:34 np0005538513.localdomain dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 2 addresses
Nov 28 10:06:34 np0005538513.localdomain podman[325168]: 2025-11-28 10:06:34.759981878 +0000 UTC m=+0.046695420 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts
Nov 28 10:06:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.797 261084 INFO neutron.agent.dhcp.agent [None req-21956997-4457-454d-a56e-5040ea8f3871 - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb', 'a7bebe57-e6af-47fd-a208-2b421ce68fb2', 'bd93ee1b-7d5f-4009-bc66-cf9872b0e906'} is completed
Nov 28 10:06:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.853 261084 INFO neutron.agent.dhcp.agent [None req-98b7c1d3-4579-411b-808c-fa917aa96b17 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:31Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63e5fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6629070>], id=7f8efc94-f33c-45bd-8ba3-5aae61ce07bb, ip_allocation=immediate, mac_address=fa:16:3e:c3:7f:0b, name=tempest-PortsIpV6TestJSON-733345616, network_id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['78343c03-098f-4faf-a880-2814fe3611d6'], standard_attr_id=2352, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:32Z on network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec
Nov 28 10:06:34 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.942 261084 INFO neutron.agent.dhcp.agent [None req-95653764-e084-4088-8d4b-3286ea811f41 - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb'} is completed
Nov 28 10:06:34 np0005538513.localdomain dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 1 addresses
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host
Nov 28 10:06:34 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts
Nov 28 10:06:34 np0005538513.localdomain podman[325208]: 2025-11-28 10:06:34.984176436 +0000 UTC m=+0.046898416 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: pgmap v344: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 32 MiB/s wr, 256 op/s
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3040858317' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:35.129 261084 INFO neutron.agent.dhcp.agent [None req-98b7c1d3-4579-411b-808c-fa917aa96b17 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:31Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63b2cd0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63b2610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63b29a0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63b2e20>], id=7f8efc94-f33c-45bd-8ba3-5aae61ce07bb, ip_allocation=immediate, mac_address=fa:16:3e:c3:7f:0b, name=tempest-PortsIpV6TestJSON-733345616, network_id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['78343c03-098f-4faf-a880-2814fe3611d6'], standard_attr_id=2352, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:33Z on network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e179 do_prune osdmap full prune enabled
Nov 28 10:06:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:35.378 261084 INFO neutron.agent.dhcp.agent [None req-c8ba8810-26a5-423d-9bad-10653cc8e7af - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb'} is completed
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e180 e180: 6 total, 6 up, 6 in
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e180: 6 total, 6 up, 6 in
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.401312) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395401357, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1793, "num_deletes": 266, "total_data_size": 1787086, "memory_usage": 1827776, "flush_reason": "Manual Compaction"}
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395416611, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1738394, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28012, "largest_seqno": 29804, "table_properties": {"data_size": 1730746, "index_size": 4541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16990, "raw_average_key_size": 20, "raw_value_size": 1714900, "raw_average_value_size": 2099, "num_data_blocks": 197, "num_entries": 817, "num_filter_entries": 817, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324288, "oldest_key_time": 1764324288, "file_creation_time": 1764324395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 15358 microseconds, and 6260 cpu microseconds.
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.416663) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1738394 bytes OK
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.416692) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.419135) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.419161) EVENT_LOG_v1 {"time_micros": 1764324395419154, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.419183) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1779189, prev total WAL file size 1779189, number of live WAL files 2.
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.420340) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303231' seq:72057594037927935, type:22 .. '6C6F676D0034323735' seq:0, type:0; will stop at (end)
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1697KB)], [48(15MB)]
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395420413, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 18373606, "oldest_snapshot_seqno": -1}
Nov 28 10:06:35 np0005538513.localdomain dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 2 addresses
Nov 28 10:06:35 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host
Nov 28 10:06:35 np0005538513.localdomain dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts
Nov 28 10:06:35 np0005538513.localdomain systemd[1]: tmp-crun.pFJxPR.mount: Deactivated successfully.
Nov 28 10:06:35 np0005538513.localdomain podman[325246]: 2025-11-28 10:06:35.425097352 +0000 UTC m=+0.051997873 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 12778 keys, 17853894 bytes, temperature: kUnknown
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395517883, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 17853894, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17779155, "index_size": 41678, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32005, "raw_key_size": 341741, "raw_average_key_size": 26, "raw_value_size": 17559952, "raw_average_value_size": 1374, "num_data_blocks": 1585, "num_entries": 12778, "num_filter_entries": 12778, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.518328) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 17853894 bytes
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.528712) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.2 rd, 182.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 15.9 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(20.8) write-amplify(10.3) OK, records in: 13325, records dropped: 547 output_compression: NoCompression
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.528751) EVENT_LOG_v1 {"time_micros": 1764324395528734, "job": 28, "event": "compaction_finished", "compaction_time_micros": 97623, "compaction_time_cpu_micros": 29339, "output_level": 6, "num_output_files": 1, "total_output_size": 17853894, "num_input_records": 13325, "num_output_records": 12778, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395529206, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395531739, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.420219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531849) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:35.615 261084 INFO neutron.agent.dhcp.agent [None req-9b598890-afd1-4569-81c1-6cab4cabdcc3 - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb'} is completed
Nov 28 10:06:35 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:35.793 2 INFO neutron.agent.securitygroups_rpc [None req-8c0c18c9-b8b0-46ef-89c5-259d43510268 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:35.823 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:35 np0005538513.localdomain dnsmasq[325150]: exiting on receipt of SIGTERM
Nov 28 10:06:35 np0005538513.localdomain podman[325285]: 2025-11-28 10:06:35.940726561 +0000 UTC m=+0.077981194 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:06:35 np0005538513.localdomain systemd[1]: libpod-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924.scope: Deactivated successfully.
Nov 28 10:06:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:35.999 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:36 np0005538513.localdomain podman[325297]: 2025-11-28 10:06:36.026231905 +0000 UTC m=+0.073662671 container died f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:06:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:36 np0005538513.localdomain podman[325297]: 2025-11-28 10:06:36.091800575 +0000 UTC m=+0.139231291 container cleanup f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:06:36 np0005538513.localdomain systemd[1]: libpod-conmon-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924.scope: Deactivated successfully.
Nov 28 10:06:36 np0005538513.localdomain podman[325304]: 2025-11-28 10:06:36.117763795 +0000 UTC m=+0.149504148 container remove f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:06:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:36.180 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port dec1ab9d-b8d3-4c8e-9f82-8a5f826a7152 with type ""
Nov 28 10:06:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:36Z|00408|binding|INFO|Removing iface tapa7bebe57-e6 ovn-installed in OVS
Nov 28 10:06:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:36Z|00409|binding|INFO|Removing lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 ovn-installed in OVS
Nov 28 10:06:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:36.182 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8462a4a9a313405e8fd212f9ec4a0c92', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=337e4655-c093-49ec-8f8f-37f4f2ac3a09, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=a7bebe57-e6af-47fd-a208-2b421ce68fb2) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:36.184 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a7bebe57-e6af-47fd-a208-2b421ce68fb2 in datapath ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec unbound from our chassis
Nov 28 10:06:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:36.186 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:36.188 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[293fd241-bdf4-403f-a37c-0ce9847e93ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:36.214 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:36 np0005538513.localdomain ceph-mon[292954]: osdmap e180: 6 total, 6 up, 6 in
Nov 28 10:06:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/662129540' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/662129540' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:36 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-49b9eae002394005d1393e4090115e2e8233d8d4759be7627d4b8be05351a27e-merged.mount: Deactivated successfully.
Nov 28 10:06:36 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:36.603 2 INFO neutron.agent.securitygroups_rpc [None req-9cc8449a-c364-4646-9e4e-de66c7fab687 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:36Z|00410|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:36.856 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538513.localdomain podman[325376]: 
Nov 28 10:06:37 np0005538513.localdomain podman[325376]: 2025-11-28 10:06:37.0837441 +0000 UTC m=+0.093941186 container create 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:06:37 np0005538513.localdomain systemd[1]: Started libpod-conmon-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d.scope.
Nov 28 10:06:37 np0005538513.localdomain podman[325376]: 2025-11-28 10:06:37.03895962 +0000 UTC m=+0.049156726 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:37 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:37 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d03bc8d97f8286d422fc4b16ab382ed0828346ca9e29d28b267fbe4ebf7f5e44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:37 np0005538513.localdomain podman[325376]: 2025-11-28 10:06:37.16133443 +0000 UTC m=+0.171531516 container init 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:06:37 np0005538513.localdomain podman[325376]: 2025-11-28 10:06:37.171453562 +0000 UTC m=+0.181650638 container start 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:06:37 np0005538513.localdomain dnsmasq[325394]: started, version 2.85 cachesize 150
Nov 28 10:06:37 np0005538513.localdomain dnsmasq[325394]: DNS service limited to local subnets
Nov 28 10:06:37 np0005538513.localdomain dnsmasq[325394]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:37 np0005538513.localdomain dnsmasq[325394]: warning: no upstream servers configured
Nov 28 10:06:37 np0005538513.localdomain dnsmasq-dhcp[325394]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:37 np0005538513.localdomain dnsmasq[325394]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 0 addresses
Nov 28 10:06:37 np0005538513.localdomain dnsmasq-dhcp[325394]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host
Nov 28 10:06:37 np0005538513.localdomain dnsmasq-dhcp[325394]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts
Nov 28 10:06:37 np0005538513.localdomain dnsmasq[325394]: exiting on receipt of SIGTERM
Nov 28 10:06:37 np0005538513.localdomain podman[325411]: 2025-11-28 10:06:37.408986101 +0000 UTC m=+0.060472054 container kill 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:37 np0005538513.localdomain systemd[1]: libpod-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d.scope: Deactivated successfully.
Nov 28 10:06:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.410 261084 INFO neutron.agent.dhcp.agent [None req-488084a0-f5ea-43bb-b091-6a7241dd8663 - - - - - -] DHCP configuration for ports {'a7bebe57-e6af-47fd-a208-2b421ce68fb2', 'bd93ee1b-7d5f-4009-bc66-cf9872b0e906'} is completed
Nov 28 10:06:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e180 do_prune osdmap full prune enabled
Nov 28 10:06:37 np0005538513.localdomain podman[325423]: 2025-11-28 10:06:37.472871779 +0000 UTC m=+0.053008313 container died 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:37 np0005538513.localdomain ceph-mon[292954]: pgmap v346: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 23 MiB/s wr, 181 op/s
Nov 28 10:06:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e181 e181: 6 total, 6 up, 6 in
Nov 28 10:06:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e181: 6 total, 6 up, 6 in
Nov 28 10:06:37 np0005538513.localdomain systemd[1]: tmp-crun.DxVM1n.mount: Deactivated successfully.
Nov 28 10:06:37 np0005538513.localdomain podman[325423]: 2025-11-28 10:06:37.521541219 +0000 UTC m=+0.101677703 container cleanup 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:37 np0005538513.localdomain systemd[1]: libpod-conmon-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d.scope: Deactivated successfully.
Nov 28 10:06:37 np0005538513.localdomain podman[325430]: 2025-11-28 10:06:37.566874796 +0000 UTC m=+0.128783719 container remove 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:37.580 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538513.localdomain kernel: device tapa7bebe57-e6 left promiscuous mode
Nov 28 10:06:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:37.599 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.645 261084 INFO neutron.agent.dhcp.agent [None req-c9c02c35-0d8e-4eb8-bc38-641d5c13d0fa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.646 261084 INFO neutron.agent.dhcp.agent [None req-c9c02c35-0d8e-4eb8-bc38-641d5c13d0fa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.646 261084 INFO neutron.agent.dhcp.agent [None req-c9c02c35-0d8e-4eb8-bc38-641d5c13d0fa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:38 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:38.265 2 INFO neutron.agent.securitygroups_rpc [None req-f5ad8b43-3120-4741-9a0b-dea18e860a97 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-d03bc8d97f8286d422fc4b16ab382ed0828346ca9e29d28b267fbe4ebf7f5e44-merged.mount: Deactivated successfully.
Nov 28 10:06:38 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:38 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2dae6c35a6\x2dc3b5\x2d4f0d\x2da342\x2d15a4ee5d55ec.mount: Deactivated successfully.
Nov 28 10:06:38 np0005538513.localdomain ceph-mon[292954]: osdmap e181: 6 total, 6 up, 6 in
Nov 28 10:06:39 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:39.196 2 INFO neutron.agent.securitygroups_rpc [None req-cef696eb-a0d2-4ba5-86dc-1a9cdfd33b8c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:06:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:06:39 np0005538513.localdomain podman[325455]: 2025-11-28 10:06:39.373191773 +0000 UTC m=+0.102182449 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:06:39 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:39.386 261084 INFO neutron.agent.linux.ip_lib [None req-232134e7-f096-47d4-b144-e1d26af50f89 - - - - - -] Device tapa9eb3ece-27 cannot be used as it has no MAC address
Nov 28 10:06:39 np0005538513.localdomain podman[325455]: 2025-11-28 10:06:39.388449373 +0000 UTC m=+0.117440029 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:06:39 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:06:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:39.410 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:39 np0005538513.localdomain kernel: device tapa9eb3ece-27 entered promiscuous mode
Nov 28 10:06:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:39Z|00411|binding|INFO|Claiming lport a9eb3ece-27d0-4844-bcef-e2f142103dde for this chassis.
Nov 28 10:06:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:39Z|00412|binding|INFO|a9eb3ece-27d0-4844-bcef-e2f142103dde: Claiming unknown
Nov 28 10:06:39 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324399.4242] manager: (tapa9eb3ece-27): new Generic device (/org/freedesktop/NetworkManager/Devices/68)
Nov 28 10:06:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:39.421 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:39 np0005538513.localdomain systemd-udevd[325498]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:39.434 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c093f57-d9af-4f3d-94fe-03cca09b4bb2, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=a9eb3ece-27d0-4844-bcef-e2f142103dde) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:39.436 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a9eb3ece-27d0-4844-bcef-e2f142103dde in datapath 31e5a6ac-615e-4a89-968d-3e51c941359f bound to our chassis
Nov 28 10:06:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:39.438 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 31e5a6ac-615e-4a89-968d-3e51c941359f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:39 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:39.439 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b6dc554b-8ee2-495e-bd1b-34cc6e0572c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:39.461 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:39Z|00413|binding|INFO|Setting lport a9eb3ece-27d0-4844-bcef-e2f142103dde ovn-installed in OVS
Nov 28 10:06:39 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:39Z|00414|binding|INFO|Setting lport a9eb3ece-27d0-4844-bcef-e2f142103dde up in Southbound
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:39.465 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain podman[325456]: 2025-11-28 10:06:39.495380778 +0000 UTC m=+0.221034672 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:06:39 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device
Nov 28 10:06:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:39.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e181 do_prune osdmap full prune enabled
Nov 28 10:06:39 np0005538513.localdomain ceph-mon[292954]: pgmap v348: 177 pgs: 177 active+clean; 508 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 3.8 MiB/s rd, 26 MiB/s wr, 195 op/s
Nov 28 10:06:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2383859164' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2383859164' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:39 np0005538513.localdomain podman[325456]: 2025-11-28 10:06:39.537107714 +0000 UTC m=+0.262761688 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:06:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e182 e182: 6 total, 6 up, 6 in
Nov 28 10:06:39 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e182: 6 total, 6 up, 6 in
Nov 28 10:06:39 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:06:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:39.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:40 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:40.086 2 INFO neutron.agent.securitygroups_rpc [None req-11baa0de-2c9c-4582-9ccd-cefaf494809d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:06:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:06:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:06:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 163084 "" "Go-http-client/1.1"
Nov 28 10:06:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:06:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 21185 "" "Go-http-client/1.1"
Nov 28 10:06:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:40Z|00415|binding|INFO|Removing iface tapa9eb3ece-27 ovn-installed in OVS
Nov 28 10:06:40 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:40Z|00416|binding|INFO|Removing lport a9eb3ece-27d0-4844-bcef-e2f142103dde ovn-installed in OVS
Nov 28 10:06:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:40.596 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port a13bc4c2-7b6a-4d42-b85f-b16822a81f92 with type ""
Nov 28 10:06:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:40.598 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c093f57-d9af-4f3d-94fe-03cca09b4bb2, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=a9eb3ece-27d0-4844-bcef-e2f142103dde) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.598 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:40.601 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a9eb3ece-27d0-4844-bcef-e2f142103dde in datapath 31e5a6ac-615e-4a89-968d-3e51c941359f unbound from our chassis
Nov 28 10:06:40 np0005538513.localdomain ceph-mon[292954]: osdmap e182: 6 total, 6 up, 6 in
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:40.603 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31e5a6ac-615e-4a89-968d-3e51c941359f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:40.605 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[24cc693e-f538-4b3d-8801-e65f6241bb1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:40 np0005538513.localdomain podman[325578]: 
Nov 28 10:06:40 np0005538513.localdomain podman[325578]: 2025-11-28 10:06:40.517197773 +0000 UTC m=+0.049481336 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:40 np0005538513.localdomain podman[325578]: 2025-11-28 10:06:40.623204179 +0000 UTC m=+0.155487702 container create 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.645 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.666 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.667 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.668 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:06:40 np0005538513.localdomain systemd[1]: Started libpod-conmon-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877.scope.
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.693 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:06:40 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:40 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95f6be0500b09649aba6df959f6cf789054a0f5abb35ea720a3084df1677d81d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:40 np0005538513.localdomain podman[325578]: 2025-11-28 10:06:40.731442465 +0000 UTC m=+0.263725998 container init 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:06:40 np0005538513.localdomain podman[325578]: 2025-11-28 10:06:40.740667799 +0000 UTC m=+0.272951332 container start 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:06:40 np0005538513.localdomain dnsmasq[325597]: started, version 2.85 cachesize 150
Nov 28 10:06:40 np0005538513.localdomain dnsmasq[325597]: DNS service limited to local subnets
Nov 28 10:06:40 np0005538513.localdomain dnsmasq[325597]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:40 np0005538513.localdomain dnsmasq[325597]: warning: no upstream servers configured
Nov 28 10:06:40 np0005538513.localdomain dnsmasq-dhcp[325597]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:40 np0005538513.localdomain dnsmasq[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/addn_hosts - 0 addresses
Nov 28 10:06:40 np0005538513.localdomain dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/host
Nov 28 10:06:40 np0005538513.localdomain dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/opts
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.793 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:40.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:40 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:40.925 2 INFO neutron.agent.securitygroups_rpc [None req-13572f85-aaed-465a-b457-59f9816ff0f0 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:40 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:40.926 261084 INFO neutron.agent.dhcp.agent [None req-a002d351-2131-493b-b881-45282a5c2164 - - - - - -] DHCP configuration for ports {'9cc2649b-29dd-4c3a-a346-a2df81021394'} is completed
Nov 28 10:06:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:41.005 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:41 np0005538513.localdomain podman[325613]: 2025-11-28 10:06:41.138531108 +0000 UTC m=+0.065310353 container kill 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:41 np0005538513.localdomain dnsmasq[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/addn_hosts - 0 addresses
Nov 28 10:06:41 np0005538513.localdomain dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/host
Nov 28 10:06:41 np0005538513.localdomain dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/opts
Nov 28 10:06:41 np0005538513.localdomain kernel: device tapa9eb3ece-27 left promiscuous mode
Nov 28 10:06:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:41.373 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:41.384 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.454 261084 INFO neutron.agent.dhcp.agent [None req-cd180424-dc1d-43a8-aaaf-f233bc118d28 - - - - - -] DHCP configuration for ports {'9cc2649b-29dd-4c3a-a346-a2df81021394'} is completed
Nov 28 10:06:41 np0005538513.localdomain dnsmasq[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/addn_hosts - 0 addresses
Nov 28 10:06:41 np0005538513.localdomain dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/host
Nov 28 10:06:41 np0005538513.localdomain dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/opts
Nov 28 10:06:41 np0005538513.localdomain podman[325654]: 2025-11-28 10:06:41.599797371 +0000 UTC m=+0.065518370 container kill 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:06:41 np0005538513.localdomain ceph-mon[292954]: pgmap v350: 177 pgs: 177 active+clean; 508 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 3.7 MiB/s rd, 25 MiB/s wr, 187 op/s
Nov 28 10:06:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/859948979' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/859948979' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e182 do_prune osdmap full prune enabled
Nov 28 10:06:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e183 e183: 6 total, 6 up, 6 in
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent [None req-25a6e0c4-d89b-40f6-871a-bfbc558909aa - - - - - -] Unable to reload_allocations dhcp for 31e5a6ac-615e-4a89-968d-3e51c941359f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa9eb3ece-27 not found in namespace qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f.
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa9eb3ece-27 not found in namespace qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f.
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent 
Nov 28 10:06:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e183: 6 total, 6 up, 6 in
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.642 261084 INFO neutron.agent.dhcp.agent [None req-48712969-2fd3-4809-8736-106860b7dd0e - - - - - -] Synchronizing state
Nov 28 10:06:41 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:41Z|00417|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:41.791 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:41 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.895 261084 INFO neutron.agent.dhcp.agent [None req-3d2868a0-ec37-4279-b7d1-5c39919092c4 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:06:42 np0005538513.localdomain dnsmasq[325597]: exiting on receipt of SIGTERM
Nov 28 10:06:42 np0005538513.localdomain podman[325684]: 2025-11-28 10:06:42.083135784 +0000 UTC m=+0.051244150 container kill 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:06:42 np0005538513.localdomain systemd[1]: libpod-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877.scope: Deactivated successfully.
Nov 28 10:06:42 np0005538513.localdomain podman[325699]: 2025-11-28 10:06:42.166195333 +0000 UTC m=+0.059700091 container died 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:06:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:42 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-95f6be0500b09649aba6df959f6cf789054a0f5abb35ea720a3084df1677d81d-merged.mount: Deactivated successfully.
Nov 28 10:06:42 np0005538513.localdomain podman[325699]: 2025-11-28 10:06:42.271804757 +0000 UTC m=+0.165309485 container remove 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:42 np0005538513.localdomain systemd[1]: libpod-conmon-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877.scope: Deactivated successfully.
Nov 28 10:06:42 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:42.304 261084 INFO neutron.agent.dhcp.agent [-] Starting network 220389a9-aaf3-4df4-9c10-df31c76e1a58 dhcp configuration
Nov 28 10:06:42 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:42.305 261084 INFO neutron.agent.dhcp.agent [-] Finished network 220389a9-aaf3-4df4-9c10-df31c76e1a58 dhcp configuration
Nov 28 10:06:42 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:42.306 261084 INFO neutron.agent.dhcp.agent [None req-1ab4110a-ac13-4ddb-a592-496f6a9cf3c6 - - - - - -] Synchronizing state complete
Nov 28 10:06:42 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:42.346 2 INFO neutron.agent.securitygroups_rpc [None req-38d8b4a3-75dd-41d2-a0db-a9c73ae0e2bb e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:42 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d31e5a6ac\x2d615e\x2d4a89\x2d968d\x2d3e51c941359f.mount: Deactivated successfully.
Nov 28 10:06:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e183 do_prune osdmap full prune enabled
Nov 28 10:06:42 np0005538513.localdomain ceph-mon[292954]: osdmap e183: 6 total, 6 up, 6 in
Nov 28 10:06:42 np0005538513.localdomain ceph-mon[292954]: pgmap v352: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 3.7 MiB/s rd, 48 MiB/s wr, 315 op/s
Nov 28 10:06:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e184 e184: 6 total, 6 up, 6 in
Nov 28 10:06:42 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e184: 6 total, 6 up, 6 in
Nov 28 10:06:43 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:43.024 2 INFO neutron.agent.securitygroups_rpc [None req-c8fe2171-fc33-407b-b80c-443549ec2e39 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:43 np0005538513.localdomain ceph-mon[292954]: osdmap e184: 6 total, 6 up, 6 in
Nov 28 10:06:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:43.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:43.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:43.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:43.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:06:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:44 np0005538513.localdomain podman[325743]: 2025-11-28 10:06:44.052897547 +0000 UTC m=+0.061122984 container kill cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:06:44 np0005538513.localdomain dnsmasq[324922]: exiting on receipt of SIGTERM
Nov 28 10:06:44 np0005538513.localdomain systemd[1]: libpod-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42.scope: Deactivated successfully.
Nov 28 10:06:44 np0005538513.localdomain podman[325757]: 2025-11-28 10:06:44.145965625 +0000 UTC m=+0.074467405 container died cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:44 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:44 np0005538513.localdomain podman[325757]: 2025-11-28 10:06:44.182278384 +0000 UTC m=+0.110780114 container cleanup cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:44 np0005538513.localdomain systemd[1]: libpod-conmon-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42.scope: Deactivated successfully.
Nov 28 10:06:44 np0005538513.localdomain podman[325758]: 2025-11-28 10:06:44.270227404 +0000 UTC m=+0.192445341 container remove cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:06:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e184 do_prune osdmap full prune enabled
Nov 28 10:06:44 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:44.707 2 INFO neutron.agent.securitygroups_rpc [None req-f4e28522-faee-4418-9069-18a470f0a6f6 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e185 e185: 6 total, 6 up, 6 in
Nov 28 10:06:44 np0005538513.localdomain ceph-mon[292954]: pgmap v354: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 89 KiB/s rd, 23 MiB/s wr, 128 op/s
Nov 28 10:06:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:44 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e185: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:45.005 261084 INFO neutron.agent.linux.ip_lib [None req-3b7aaba5-4be3-4472-85bf-062426fad64a - - - - - -] Device tap2dcd02a6-6e cannot be used as it has no MAC address
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.043 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 832a5508-3562-4501-a0e7-3a860e35260b with type ""
Nov 28 10:06:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:45Z|00418|binding|INFO|Removing iface tapac54af53-f9 ovn-installed in OVS
Nov 28 10:06:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:45Z|00419|binding|INFO|Removing lport ac54af53-f927-47a3-a012-007eb09610ba ovn-installed in OVS
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.046 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d67334e-0468-412d-8122-94888f52f93e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=ac54af53-f927-47a3-a012-007eb09610ba) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.048 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ac54af53-f927-47a3-a012-007eb09610ba in datapath c2ece010-6b32-45eb-a0e7-54a94c6d37c8 unbound from our chassis
Nov 28 10:06:45 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-ee9e293b8742e167b01ef0934def034c313f2eb915f944093d63d89ce9a34521-merged.mount: Deactivated successfully.
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.059 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.060 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7881d730-dc52-4712-9450-4691a6db08a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.074 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.077 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538513.localdomain kernel: device tap2dcd02a6-6e entered promiscuous mode
Nov 28 10:06:45 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324405.0854] manager: (tap2dcd02a6-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/69)
Nov 28 10:06:45 np0005538513.localdomain systemd-udevd[325824]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:45Z|00420|binding|INFO|Claiming lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f for this chassis.
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.089 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:45Z|00421|binding|INFO|2dcd02a6-6ee7-4a78-a00f-13dc1294595f: Claiming unknown
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.104 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac48318a-2b44-44d6-ac83-b660fc51fa7f, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=2dcd02a6-6ee7-4a78-a00f-13dc1294595f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.107 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2dcd02a6-6ee7-4a78-a00f-13dc1294595f in datapath 49f178c5-0cae-4b0e-9bb3-8615842f2e56 bound to our chassis
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2a341328-31a8-4e24-99e8-5139e38e23a6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 49f178c5-0cae-4b0e-9bb3-8615842f2e56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:45 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:45.113 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3b81918c-4715-4f16-a964-10e3c5d3a852]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:45Z|00422|binding|INFO|Setting lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f ovn-installed in OVS
Nov 28 10:06:45 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:45Z|00423|binding|INFO|Setting lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f up in Southbound
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.133 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.187 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.234 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538513.localdomain podman[325853]: 
Nov 28 10:06:45 np0005538513.localdomain podman[325853]: 2025-11-28 10:06:45.320763044 +0000 UTC m=+0.084626218 container create 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:45 np0005538513.localdomain podman[325853]: 2025-11-28 10:06:45.272800256 +0000 UTC m=+0.036663460 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e185 do_prune osdmap full prune enabled
Nov 28 10:06:45 np0005538513.localdomain systemd[1]: Started libpod-conmon-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232.scope.
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e186 e186: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e186: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:45 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d6ced94728650500b10b276b0c43bf2e09a46765f20bb4a704944ec73bf2bc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:45 np0005538513.localdomain podman[325853]: 2025-11-28 10:06:45.438635366 +0000 UTC m=+0.202498550 container init 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:45 np0005538513.localdomain podman[325853]: 2025-11-28 10:06:45.449235783 +0000 UTC m=+0.213098957 container start 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:45 np0005538513.localdomain dnsmasq[325876]: started, version 2.85 cachesize 150
Nov 28 10:06:45 np0005538513.localdomain dnsmasq[325876]: DNS service limited to local subnets
Nov 28 10:06:45 np0005538513.localdomain dnsmasq[325876]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:45 np0005538513.localdomain dnsmasq[325876]: warning: no upstream servers configured
Nov 28 10:06:45 np0005538513.localdomain dnsmasq-dhcp[325876]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:06:45 np0005538513.localdomain dnsmasq[325876]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/addn_hosts - 0 addresses
Nov 28 10:06:45 np0005538513.localdomain dnsmasq-dhcp[325876]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/host
Nov 28 10:06:45 np0005538513.localdomain dnsmasq-dhcp[325876]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/opts
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: osdmap e185: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: osdmap e186: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2385808969' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2385808969' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:45 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:45.773 261084 INFO neutron.agent.dhcp.agent [None req-1b4e5cd6-f2b9-4940-acfa-8801257eb07a - - - - - -] DHCP configuration for ports {'ac54af53-f927-47a3-a012-007eb09610ba', '9de81172-b84e-48ed-a6ac-538e374abb7b'} is completed
Nov 28 10:06:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:45.828 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538513.localdomain dnsmasq[325876]: exiting on receipt of SIGTERM
Nov 28 10:06:45 np0005538513.localdomain podman[325908]: 2025-11-28 10:06:45.920346469 +0000 UTC m=+0.055930545 container kill 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:45 np0005538513.localdomain systemd[1]: libpod-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232.scope: Deactivated successfully.
Nov 28 10:06:46 np0005538513.localdomain podman[325922]: 2025-11-28 10:06:46.001268372 +0000 UTC m=+0.065952573 container died 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.006 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538513.localdomain podman[325922]: 2025-11-28 10:06:46.031394511 +0000 UTC m=+0.096078682 container cleanup 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:06:46 np0005538513.localdomain systemd[1]: libpod-conmon-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232.scope: Deactivated successfully.
Nov 28 10:06:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1d6ced94728650500b10b276b0c43bf2e09a46765f20bb4a704944ec73bf2bc3-merged.mount: Deactivated successfully.
Nov 28 10:06:46 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:46 np0005538513.localdomain podman[325924]: 2025-11-28 10:06:46.088270453 +0000 UTC m=+0.143105480 container remove 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:06:46 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:46Z|00424|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:46.120 2 INFO neutron.agent.securitygroups_rpc [None req-33288b8b-e565-4871-a7ed-f7e6a715772e 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.133 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538513.localdomain kernel: device tapac54af53-f9 left promiscuous mode
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.159 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.204 261084 INFO neutron.agent.dhcp.agent [None req-5af3994c-f40c-432a-8f54-67748103df35 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:46 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2dc2ece010\x2d6b32\x2d45eb\x2da0e7\x2d54a94c6d37c8.mount: Deactivated successfully.
Nov 28 10:06:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.205 261084 INFO neutron.agent.dhcp.agent [None req-5af3994c-f40c-432a-8f54-67748103df35 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:46 np0005538513.localdomain podman[325970]: 
Nov 28 10:06:46 np0005538513.localdomain podman[325970]: 2025-11-28 10:06:46.302477393 +0000 UTC m=+0.113779016 container create 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:46 np0005538513.localdomain systemd[1]: Started libpod-conmon-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901.scope.
Nov 28 10:06:46 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:46 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bc781df874fccff849953263ff4783d37195584bd3533cf543244d5f9159b50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:46 np0005538513.localdomain podman[325970]: 2025-11-28 10:06:46.25985244 +0000 UTC m=+0.071154093 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:46 np0005538513.localdomain podman[325970]: 2025-11-28 10:06:46.36825899 +0000 UTC m=+0.179560613 container init 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:46 np0005538513.localdomain podman[325970]: 2025-11-28 10:06:46.377869226 +0000 UTC m=+0.189170849 container start 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:06:46 np0005538513.localdomain dnsmasq[325988]: started, version 2.85 cachesize 150
Nov 28 10:06:46 np0005538513.localdomain dnsmasq[325988]: DNS service limited to local subnets
Nov 28 10:06:46 np0005538513.localdomain dnsmasq[325988]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:46 np0005538513.localdomain dnsmasq[325988]: warning: no upstream servers configured
Nov 28 10:06:46 np0005538513.localdomain dnsmasq-dhcp[325988]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:46 np0005538513.localdomain dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 0 addresses
Nov 28 10:06:46 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host
Nov 28 10:06:46 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts
Nov 28 10:06:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.453 261084 INFO neutron.agent.dhcp.agent [None req-f9f16e8e-da79-4140-89ae-8b2b2e290d57 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:44Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63dd310>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63dd610>], id=62795658-8866-4a6e-9294-b579a034da47, ip_allocation=immediate, mac_address=fa:16:3e:0b:8c:a9, name=tempest-PortsTestJSON-1966847573, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:42Z, description=, dns_domain=, id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1662837750, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8809, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2398, status=ACTIVE, subnets=['f761224f-305a-4c1b-8715-9d97740c76a2'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:43Z, vlan_transparent=None, network_id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2410, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:44Z on network 49f178c5-0cae-4b0e-9bb3-8615842f2e56
Nov 28 10:06:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.573 261084 INFO neutron.agent.dhcp.agent [None req-de1b2128-68a6-4865-b222-eacdfefbcf63 - - - - - -] DHCP configuration for ports {'f5d400af-a230-4bee-9fd3-04e139732c3d'} is completed
Nov 28 10:06:46 np0005538513.localdomain dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 1 addresses
Nov 28 10:06:46 np0005538513.localdomain podman[326006]: 2025-11-28 10:06:46.717435669 +0000 UTC m=+0.066075437 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:46 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host
Nov 28 10:06:46 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts
Nov 28 10:06:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:46.764 2 INFO neutron.agent.securitygroups_rpc [None req-92a83df5-6e17-4f34-81bc-c1c1907c0872 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:46 np0005538513.localdomain ceph-mon[292954]: pgmap v357: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 128 KiB/s rd, 33 MiB/s wr, 185 op/s
Nov 28 10:06:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3198145338' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3198145338' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:46 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.908 261084 INFO neutron.agent.dhcp.agent [None req-c0c9a2dd-6aba-4de0-8de3-956b4c569218 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:45Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63dbb50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd63db550>], id=8e769ce7-7e42-447c-9aaf-873d0ff4a173, ip_allocation=immediate, mac_address=fa:16:3e:7a:92:f9, name=tempest-PortsTestJSON-895012161, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:42Z, description=, dns_domain=, id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1662837750, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8809, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2398, status=ACTIVE, subnets=['f761224f-305a-4c1b-8715-9d97740c76a2'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:43Z, vlan_transparent=None, network_id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2416, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:45Z on network 49f178c5-0cae-4b0e-9bb3-8615842f2e56
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.923 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.925 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.925 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.926 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.927 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:06:46 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:46.991 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:46.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:46.994 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:06:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:47.097 261084 INFO neutron.agent.dhcp.agent [None req-38cb43d6-ffed-431f-a006-834980804f11 - - - - - -] DHCP configuration for ports {'62795658-8866-4a6e-9294-b579a034da47'} is completed
Nov 28 10:06:47 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:47.162 2 INFO neutron.agent.securitygroups_rpc [None req-09174bbe-f95e-4ff7-9cd4-901486ff4f01 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:47 np0005538513.localdomain dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 2 addresses
Nov 28 10:06:47 np0005538513.localdomain podman[326062]: 2025-11-28 10:06:47.232299333 +0000 UTC m=+0.070970057 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:47 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host
Nov 28 10:06:47 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts
Nov 28 10:06:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:06:47 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2354762036' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:47.374 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.378 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.474 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.475 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:06:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:47.550 261084 INFO neutron.agent.dhcp.agent [None req-a56efb65-c118-4c96-af6b-d3e69a4c11a4 - - - - - -] DHCP configuration for ports {'8e769ce7-7e42-447c-9aaf-873d0ff4a173'} is completed
Nov 28 10:06:47 np0005538513.localdomain dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 1 addresses
Nov 28 10:06:47 np0005538513.localdomain podman[326103]: 2025-11-28 10:06:47.715159582 +0000 UTC m=+0.075763006 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:47 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host
Nov 28 10:06:47 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.721 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.723 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11091MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.723 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.723 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:06:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2354762036' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3613923806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.814 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:06:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:47.867 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:06:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:06:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:06:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:06:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:06:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:06:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:06:48 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:48.341 2 INFO neutron.agent.securitygroups_rpc [None req-5e549f88-7906-4d9c-9524-bb7ac558174f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:06:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1057030524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:48.386 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:06:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:48.393 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:06:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:48.420 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:06:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:48.421 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:06:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:48.421 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:06:48 np0005538513.localdomain dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 0 addresses
Nov 28 10:06:48 np0005538513.localdomain podman[326164]: 2025-11-28 10:06:48.573171599 +0000 UTC m=+0.050814817 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:48 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host
Nov 28 10:06:48 np0005538513.localdomain dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts
Nov 28 10:06:49 np0005538513.localdomain ceph-mon[292954]: pgmap v358: 177 pgs: 177 active+clean; 633 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 119 KiB/s rd, 21 MiB/s wr, 168 op/s
Nov 28 10:06:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1057030524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:49.422 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:49.423 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:06:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:49.423 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:06:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:06:49 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:49.861 2 INFO neutron.agent.securitygroups_rpc [None req-be3aec7b-abf4-4211-97ca-2df83bf1c365 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group rule updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:06:49 np0005538513.localdomain podman[326197]: 2025-11-28 10:06:49.874806225 +0000 UTC m=+0.101848809 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64)
Nov 28 10:06:49 np0005538513.localdomain podman[326197]: 2025-11-28 10:06:49.929112879 +0000 UTC m=+0.156155493 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Nov 28 10:06:49 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:06:49 np0005538513.localdomain dnsmasq[322901]: exiting on receipt of SIGTERM
Nov 28 10:06:49 np0005538513.localdomain podman[326214]: 2025-11-28 10:06:49.96323431 +0000 UTC m=+0.119742381 container kill d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:06:49 np0005538513.localdomain systemd[1]: libpod-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683.scope: Deactivated successfully.
Nov 28 10:06:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:49Z|00425|binding|INFO|Removing iface tapbbebc9e7-db ovn-installed in OVS
Nov 28 10:06:49 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:49Z|00426|binding|INFO|Removing lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 ovn-installed in OVS
Nov 28 10:06:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:49.981 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c9b05cd7-5e0a-4448-ad54-e279b37c3a36 with type ""
Nov 28 10:06:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:49.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:49.984 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcca6890-1675-46ad-9260-7f267479c535, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:49.990 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 in datapath fa28040d-639a-454c-9515-60af86f8624b unbound from our chassis
Nov 28 10:06:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:49.991 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:49.993 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa28040d-639a-454c-9515-60af86f8624b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:49.994 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[10d47530-ba3e-4435-bf5a-f26f56a95320]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "format": "json"}]: dispatch
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/419078793' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1129667614' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:50 np0005538513.localdomain podman[326248]: 2025-11-28 10:06:50.033635609 +0000 UTC m=+0.056131480 container died d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:50 np0005538513.localdomain podman[326248]: 2025-11-28 10:06:50.073809688 +0000 UTC m=+0.096305519 container cleanup d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: libpod-conmon-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683.scope: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.084 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.085 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.085 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.086 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:06:50 np0005538513.localdomain podman[326250]: 2025-11-28 10:06:50.104557524 +0000 UTC m=+0.106753070 container remove d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:50 np0005538513.localdomain kernel: device tapbbebc9e7-db left promiscuous mode
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.134 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:50 np0005538513.localdomain dnsmasq[325988]: exiting on receipt of SIGTERM
Nov 28 10:06:50 np0005538513.localdomain podman[326272]: 2025-11-28 10:06:50.142609027 +0000 UTC m=+0.115660035 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: libpod-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901.scope: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.150 261084 INFO neutron.agent.dhcp.agent [None req-8bd5054b-d03e-4c05-8799-43d2ea5c917f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.193 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:50 np0005538513.localdomain podman[326298]: 2025-11-28 10:06:50.206101594 +0000 UTC m=+0.042165321 container died 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:50 np0005538513.localdomain podman[326298]: 2025-11-28 10:06:50.24232091 +0000 UTC m=+0.078384617 container remove 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:06:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:50Z|00427|binding|INFO|Releasing lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f from this chassis (sb_readonly=0)
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.258 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:50 np0005538513.localdomain kernel: device tap2dcd02a6-6e left promiscuous mode
Nov 28 10:06:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:50Z|00428|binding|INFO|Setting lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f down in Southbound
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: libpod-conmon-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901.scope: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.272 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac48318a-2b44-44d6-ac83-b660fc51fa7f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=2dcd02a6-6ee7-4a78-a00f-13dc1294595f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.274 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.275 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2dcd02a6-6ee7-4a78-a00f-13dc1294595f in datapath 49f178c5-0cae-4b0e-9bb3-8615842f2e56 unbound from our chassis
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.278 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.279 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 49f178c5-0cae-4b0e-9bb3-8615842f2e56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.280 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[85e3c570-9f3c-4a60-b91d-44921c14effb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.300 261084 INFO neutron.agent.dhcp.agent [None req-13f1892a-bb63-4f6b-97c5-f89954cfd0c0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:50 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:50.313 2 INFO neutron.agent.securitygroups_rpc [None req-500702cc-90b6-4aaa-a265-7d7caae35722 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group rule updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:06:50 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:50Z|00429|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e186 do_prune osdmap full prune enabled
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 e187: 6 total, 6 up, 6 in
Nov 28 10:06:50 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e187: 6 total, 6 up, 6 in
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.495 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:50 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.692 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.831 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.845 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-9bc781df874fccff849953263ff4783d37195584bd3533cf543244d5f9159b50-merged.mount: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d49f178c5\x2d0cae\x2d4b0e\x2d9bb3\x2d8615842f2e56.mount: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-14c3eefe9cc85a0964db7e342e45ff322ab308fd13a7f19aa1847f356aa5bafb-merged.mount: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.869 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2dfa28040d\x2d639a\x2d454c\x2d9515\x2d60af86f8624b.mount: Deactivated successfully.
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.892 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.892 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:06:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:50.893 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:50.997 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:06:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:51.008 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:51 np0005538513.localdomain dnsmasq[323161]: exiting on receipt of SIGTERM
Nov 28 10:06:51 np0005538513.localdomain podman[326341]: 2025-11-28 10:06:51.171775738 +0000 UTC m=+0.067274414 container kill c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:06:51 np0005538513.localdomain systemd[1]: libpod-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb.scope: Deactivated successfully.
Nov 28 10:06:51 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e46: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:06:51 np0005538513.localdomain podman[326355]: 2025-11-28 10:06:51.265438024 +0000 UTC m=+0.077949913 container died c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:06:51 np0005538513.localdomain ceph-mon[292954]: pgmap v359: 177 pgs: 177 active+clean; 633 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 100 KiB/s rd, 18 MiB/s wr, 142 op/s
Nov 28 10:06:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/691215413' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:51 np0005538513.localdomain ceph-mon[292954]: osdmap e187: 6 total, 6 up, 6 in
Nov 28 10:06:51 np0005538513.localdomain podman[326355]: 2025-11-28 10:06:51.309352008 +0000 UTC m=+0.121863857 container cleanup c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:06:51 np0005538513.localdomain systemd[1]: libpod-conmon-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb.scope: Deactivated successfully.
Nov 28 10:06:51 np0005538513.localdomain podman[326357]: 2025-11-28 10:06:51.355970674 +0000 UTC m=+0.161743795 container remove c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:51.369 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:51 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:51Z|00430|binding|INFO|Releasing lport 87ef7272-14f7-4162-a8a9-b13090f8924f from this chassis (sb_readonly=0)
Nov 28 10:06:51 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:51Z|00431|binding|INFO|Setting lport 87ef7272-14f7-4162-a8a9-b13090f8924f down in Southbound
Nov 28 10:06:51 np0005538513.localdomain kernel: device tap87ef7272-14 left promiscuous mode
Nov 28 10:06:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:51.378 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a58c096-9217-4d0b-a64c-715683dae905, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=87ef7272-14f7-4162-a8a9-b13090f8924f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:51.380 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 87ef7272-14f7-4162-a8a9-b13090f8924f in datapath 3f532ea4-a0de-4113-8993-33f982144ec8 unbound from our chassis
Nov 28 10:06:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:51.381 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f532ea4-a0de-4113-8993-33f982144ec8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:51 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:06:51.383 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[817fdb59-38d3-4d65-9037-2bb02ecf8aa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:51.392 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:51.425 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:51 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:51.726 2 INFO neutron.agent.securitygroups_rpc [None req-7220764d-588c-486e-8a18-42779bef93d4 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:06:51.759 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:51 np0005538513.localdomain systemd[1]: tmp-crun.FUdifc.mount: Deactivated successfully.
Nov 28 10:06:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-0a09506b8e900e354d47418961d4d79668634f1edb0c1cb9bb0d739ec90a1c16-merged.mount: Deactivated successfully.
Nov 28 10:06:51 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:51 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d3f532ea4\x2da0de\x2d4113\x2d8993\x2d33f982144ec8.mount: Deactivated successfully.
Nov 28 10:06:52 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:06:52Z|00432|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:06:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:52.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:52 np0005538513.localdomain ceph-mon[292954]: mgrmap e46: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:06:53 np0005538513.localdomain ceph-mon[292954]: pgmap v361: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 115 KiB/s rd, 34 MiB/s wr, 169 op/s
Nov 28 10:06:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:06:54 np0005538513.localdomain podman[326383]: 2025-11-28 10:06:54.855359469 +0000 UTC m=+0.087312972 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:06:54 np0005538513.localdomain podman[326383]: 2025-11-28 10:06:54.867570286 +0000 UTC m=+0.099523829 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:06:54 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:06:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:55 np0005538513.localdomain ceph-mon[292954]: pgmap v362: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 96 KiB/s rd, 29 MiB/s wr, 142 op/s
Nov 28 10:06:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:06:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "format": "json"}]: dispatch
Nov 28 10:06:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:55.833 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:55 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:06:55.873 2 INFO neutron.agent.securitygroups_rpc [None req-3852d4d4-6237-4177-8a94-f6db1e58a4b7 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:06:56.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:06:56 np0005538513.localdomain podman[326406]: 2025-11-28 10:06:56.842576201 +0000 UTC m=+0.082372720 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:56 np0005538513.localdomain podman[326406]: 2025-11-28 10:06:56.876511356 +0000 UTC m=+0.116307846 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 10:06:56 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:06:57 np0005538513.localdomain ceph-mon[292954]: pgmap v363: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 81 KiB/s rd, 24 MiB/s wr, 119 op/s
Nov 28 10:06:57 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1390478790' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:06:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:06:58 np0005538513.localdomain podman[326426]: 2025-11-28 10:06:58.907005181 +0000 UTC m=+0.142729339 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 28 10:06:58 np0005538513.localdomain podman[326426]: 2025-11-28 10:06:58.919839426 +0000 UTC m=+0.155563594 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Nov 28 10:06:58 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:06:58 np0005538513.localdomain systemd[1]: tmp-crun.870Jpy.mount: Deactivated successfully.
Nov 28 10:06:58 np0005538513.localdomain podman[326427]: 2025-11-28 10:06:58.974993926 +0000 UTC m=+0.196567088 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:59 np0005538513.localdomain podman[326427]: 2025-11-28 10:06:59.015635017 +0000 UTC m=+0.237208169 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 28 10:06:59 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:06:59 np0005538513.localdomain ceph-mon[292954]: pgmap v364: 177 pgs: 177 active+clean; 865 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 36 op/s
Nov 28 10:06:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:07:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:00.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:01.016 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:01 np0005538513.localdomain ceph-mon[292954]: pgmap v365: 177 pgs: 177 active+clean; 865 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 36 op/s
Nov 28 10:07:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2257434953' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2257434953' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:02 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e47: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:07:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:03.080 2 INFO neutron.agent.securitygroups_rpc [req-e18f2dde-6bdd-4008-97a4-be84187f4807 req-3c6747af-afe3-40df-86cf-89416982a794 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group member updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:07:03 np0005538513.localdomain ceph-mon[292954]: pgmap v366: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 52 KiB/s rd, 33 MiB/s wr, 87 op/s
Nov 28 10:07:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1555194e-595d-4b14-be99-858d500899d3", "format": "json"}]: dispatch
Nov 28 10:07:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:03 np0005538513.localdomain ceph-mon[292954]: mgrmap e47: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:07:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4053547237' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4053547237' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:04 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:04.539 2 INFO neutron.agent.securitygroups_rpc [None req-ae7d6dfb-c119-45e0-bfec-235340ad22c9 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['19d31bf3-ea7b-49ec-820d-ba3fe5752e88']
Nov 28 10:07:04 np0005538513.localdomain ceph-mon[292954]: pgmap v367: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 41 KiB/s rd, 22 MiB/s wr, 67 op/s
Nov 28 10:07:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:05.839 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:06.018 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:06 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4069209835' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:06 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4069209835' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:07 np0005538513.localdomain ceph-mon[292954]: pgmap v368: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 41 KiB/s rd, 22 MiB/s wr, 67 op/s
Nov 28 10:07:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "format": "json"}]: dispatch
Nov 28 10:07:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 do_prune osdmap full prune enabled
Nov 28 10:07:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e188 e188: 6 total, 6 up, 6 in
Nov 28 10:07:09 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e188: 6 total, 6 up, 6 in
Nov 28 10:07:09 np0005538513.localdomain ceph-mon[292954]: pgmap v369: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 67 KiB/s rd, 32 MiB/s wr, 106 op/s
Nov 28 10:07:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:07:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:07:09 np0005538513.localdomain podman[326469]: 2025-11-28 10:07:09.855937956 +0000 UTC m=+0.090966634 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:07:09 np0005538513.localdomain podman[326469]: 2025-11-28 10:07:09.863920402 +0000 UTC m=+0.098949060 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:07:09 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:07:09 np0005538513.localdomain podman[326470]: 2025-11-28 10:07:09.916379338 +0000 UTC m=+0.147838516 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:09 np0005538513.localdomain podman[326470]: 2025-11-28 10:07:09.931594127 +0000 UTC m=+0.163053285 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 10:07:09 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:07:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:07:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:07:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:07:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:07:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:07:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19749 "" "Go-http-client/1.1"
Nov 28 10:07:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:10 np0005538513.localdomain ceph-mon[292954]: osdmap e188: 6 total, 6 up, 6 in
Nov 28 10:07:10 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3389479421' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:10 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1134720533' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:10.635 2 INFO neutron.agent.securitygroups_rpc [None req-570a0175-8080-4b40-9e80-d9942b63779e e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['c81397c2-33ee-481d-8257-b39c2b0c331e', '19d31bf3-ea7b-49ec-820d-ba3fe5752e88']
Nov 28 10:07:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:10.639 2 INFO neutron.agent.securitygroups_rpc [None req-22769ed5-ed71-4ef8-ab49-99801270d0d3 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:07:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:10.855 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:11.021 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e188 do_prune osdmap full prune enabled
Nov 28 10:07:11 np0005538513.localdomain ceph-mon[292954]: pgmap v371: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 72 KiB/s rd, 26 MiB/s wr, 109 op/s
Nov 28 10:07:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:07:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e189 e189: 6 total, 6 up, 6 in
Nov 28 10:07:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e189: 6 total, 6 up, 6 in
Nov 28 10:07:12 np0005538513.localdomain ceph-mon[292954]: osdmap e189: 6 total, 6 up, 6 in
Nov 28 10:07:12 np0005538513.localdomain ceph-mon[292954]: pgmap v373: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 65 KiB/s rd, 22 MiB/s wr, 96 op/s
Nov 28 10:07:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1541328209' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1541328209' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:14 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:14.560 2 INFO neutron.agent.securitygroups_rpc [None req-94433583-aa1a-4467-b851-fbd6872bea34 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['c81397c2-33ee-481d-8257-b39c2b0c331e']
Nov 28 10:07:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "format": "json"}]: dispatch
Nov 28 10:07:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:14 np0005538513.localdomain ceph-mon[292954]: pgmap v374: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 65 KiB/s rd, 22 MiB/s wr, 96 op/s
Nov 28 10:07:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2355630358' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2355630358' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:15.857 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:16.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:16 np0005538513.localdomain dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses
Nov 28 10:07:16 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 10:07:16 np0005538513.localdomain podman[326528]: 2025-11-28 10:07:16.419207218 +0000 UTC m=+0.061868377 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:07:16 np0005538513.localdomain dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.435897) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436435984, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 873, "num_deletes": 255, "total_data_size": 1217512, "memory_usage": 1241120, "flush_reason": "Manual Compaction"}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436445197, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1203374, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29805, "largest_seqno": 30677, "table_properties": {"data_size": 1199158, "index_size": 1879, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10649, "raw_average_key_size": 21, "raw_value_size": 1190251, "raw_average_value_size": 2352, "num_data_blocks": 82, "num_entries": 506, "num_filter_entries": 506, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324396, "oldest_key_time": 1764324396, "file_creation_time": 1764324436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 9341 microseconds, and 4115 cpu microseconds.
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.445249) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1203374 bytes OK
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.445275) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.447466) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.447488) EVENT_LOG_v1 {"time_micros": 1764324436447481, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.447511) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 1213078, prev total WAL file size 1213078, number of live WAL files 2.
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.448417) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(1175KB)], [51(17MB)]
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436448476, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 19057268, "oldest_snapshot_seqno": -1}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12755 keys, 17845211 bytes, temperature: kUnknown
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436563959, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 17845211, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17771070, "index_size": 41144, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31941, "raw_key_size": 342033, "raw_average_key_size": 26, "raw_value_size": 17552586, "raw_average_value_size": 1376, "num_data_blocks": 1556, "num_entries": 12755, "num_filter_entries": 12755, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.564331) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 17845211 bytes
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.566296) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.9 rd, 154.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 17.0 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(30.7) write-amplify(14.8) OK, records in: 13284, records dropped: 529 output_compression: NoCompression
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.566325) EVENT_LOG_v1 {"time_micros": 1764324436566313, "job": 30, "event": "compaction_finished", "compaction_time_micros": 115597, "compaction_time_cpu_micros": 51154, "output_level": 6, "num_output_files": 1, "total_output_size": 17845211, "num_input_records": 13284, "num_output_records": 12755, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436566678, "job": 30, "event": "table_file_deletion", "file_number": 53}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436569248, "job": 30, "event": "table_file_deletion", "file_number": 51}
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.448319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569567) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569571) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e189 do_prune osdmap full prune enabled
Nov 28 10:07:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e190 e190: 6 total, 6 up, 6 in
Nov 28 10:07:17 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e190: 6 total, 6 up, 6 in
Nov 28 10:07:17 np0005538513.localdomain ceph-mon[292954]: pgmap v375: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 26 KiB/s rd, 8.0 MiB/s wr, 38 op/s
Nov 28 10:07:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:07:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:07:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:07:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:07:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:07:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e190 do_prune osdmap full prune enabled
Nov 28 10:07:18 np0005538513.localdomain ceph-mon[292954]: osdmap e190: 6 total, 6 up, 6 in
Nov 28 10:07:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e191 e191: 6 total, 6 up, 6 in
Nov 28 10:07:18 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e191: 6 total, 6 up, 6 in
Nov 28 10:07:18 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:18.908 261084 INFO neutron.agent.linux.ip_lib [None req-bd01f321-4ae9-463f-bc5e-ff371036a5cc - - - - - -] Device tape7ad6507-8c cannot be used as it has no MAC address
Nov 28 10:07:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:18.986 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:18 np0005538513.localdomain kernel: device tape7ad6507-8c entered promiscuous mode
Nov 28 10:07:18 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324438.9966] manager: (tape7ad6507-8c): new Generic device (/org/freedesktop/NetworkManager/Devices/70)
Nov 28 10:07:18 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:18Z|00433|binding|INFO|Claiming lport e7ad6507-8cb9-4c54-9fde-23c7028d341d for this chassis.
Nov 28 10:07:18 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:18Z|00434|binding|INFO|e7ad6507-8cb9-4c54-9fde-23c7028d341d: Claiming unknown
Nov 28 10:07:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:18.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538513.localdomain systemd-udevd[326560]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:07:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:19.013 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe83:d5c4/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e17820c-609d-4556-ae57-25b66af88ca4, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=e7ad6507-8cb9-4c54-9fde-23c7028d341d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:19.015 158130 INFO neutron.agent.ovn.metadata.agent [-] Port e7ad6507-8cb9-4c54-9fde-23c7028d341d in datapath e9912372-cec2-427d-b398-8b7ba1d00441 bound to our chassis
Nov 28 10:07:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:19.018 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9cf0f181-4a32-47e3-bbc4-a22358682295 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:07:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:19.019 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9912372-cec2-427d-b398-8b7ba1d00441, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:19 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:19.019 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d01d889d-4829-4555-bd10-f24fc8d9409c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:19.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:19Z|00435|binding|INFO|Setting lport e7ad6507-8cb9-4c54-9fde-23c7028d341d ovn-installed in OVS
Nov 28 10:07:19 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:19Z|00436|binding|INFO|Setting lport e7ad6507-8cb9-4c54-9fde-23c7028d341d up in Southbound
Nov 28 10:07:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:19.040 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:19.044 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tape7ad6507-8c: No such device
Nov 28 10:07:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:19.074 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:19.104 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:19.267 2 INFO neutron.agent.securitygroups_rpc [None req-5151d855-124b-4682-b39f-fc47e0550bce 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e191 do_prune osdmap full prune enabled
Nov 28 10:07:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "format": "json"}]: dispatch
Nov 28 10:07:19 np0005538513.localdomain ceph-mon[292954]: pgmap v377: 177 pgs: 177 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 8.0 MiB/s wr, 202 op/s
Nov 28 10:07:19 np0005538513.localdomain ceph-mon[292954]: osdmap e191: 6 total, 6 up, 6 in
Nov 28 10:07:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e192 e192: 6 total, 6 up, 6 in
Nov 28 10:07:19 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e192: 6 total, 6 up, 6 in
Nov 28 10:07:19 np0005538513.localdomain podman[326631]: 
Nov 28 10:07:19 np0005538513.localdomain podman[326631]: 2025-11-28 10:07:19.890459956 +0000 UTC m=+0.079987536 container create cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 10:07:19 np0005538513.localdomain systemd[1]: Started libpod-conmon-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0.scope.
Nov 28 10:07:19 np0005538513.localdomain podman[326631]: 2025-11-28 10:07:19.843494729 +0000 UTC m=+0.033022369 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:19 np0005538513.localdomain systemd[1]: tmp-crun.QsoE6J.mount: Deactivated successfully.
Nov 28 10:07:19 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:07:19 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c04ff6f7b4fcac360df3aa4ab5727cea04640414569d0565dad260384a9ac12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:19 np0005538513.localdomain podman[326631]: 2025-11-28 10:07:19.974826905 +0000 UTC m=+0.164354465 container init cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:07:19 np0005538513.localdomain podman[326631]: 2025-11-28 10:07:19.982280315 +0000 UTC m=+0.171807895 container start cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:07:19 np0005538513.localdomain dnsmasq[326657]: started, version 2.85 cachesize 150
Nov 28 10:07:19 np0005538513.localdomain dnsmasq[326657]: DNS service limited to local subnets
Nov 28 10:07:19 np0005538513.localdomain dnsmasq[326657]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:19 np0005538513.localdomain dnsmasq[326657]: warning: no upstream servers configured
Nov 28 10:07:19 np0005538513.localdomain dnsmasq[326657]: read /var/lib/neutron/dhcp/e9912372-cec2-427d-b398-8b7ba1d00441/addn_hosts - 0 addresses
Nov 28 10:07:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:20.051 261084 INFO neutron.agent.dhcp.agent [None req-bd01f321-4ae9-463f-bc5e-ff371036a5cc - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:18Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6694a60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6694220>], id=d2f8dc65-4856-423e-a9bf-07311833bbd3, ip_allocation=immediate, mac_address=fa:16:3e:fd:dd:41, name=tempest-NetworksIpV6TestAttrs-1405885968, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:08Z, description=, dns_domain=, id=e9912372-cec2-427d-b398-8b7ba1d00441, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksIpV6TestAttrs-test-network-1194919161, port_security_enabled=True, project_id=ae10569a38284f298c961498da620c5f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8532, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2471, status=ACTIVE, subnets=['961cfa80-c1ec-4a76-97a0-a027c3bf1d91'], tags=[], tenant_id=ae10569a38284f298c961498da620c5f, updated_at=2025-11-28T10:07:14Z, vlan_transparent=None, network_id=e9912372-cec2-427d-b398-8b7ba1d00441, port_security_enabled=True, project_id=ae10569a38284f298c961498da620c5f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c5eee24b-0bed-4035-a2ab-e6c531c94e43'], standard_attr_id=2489, status=DOWN, tags=[], tenant_id=ae10569a38284f298c961498da620c5f, updated_at=2025-11-28T10:07:18Z on network e9912372-cec2-427d-b398-8b7ba1d00441
Nov 28 10:07:20 np0005538513.localdomain podman[326648]: 2025-11-28 10:07:20.073138275 +0000 UTC m=+0.103782969 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Nov 28 10:07:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:20.112 261084 INFO neutron.agent.dhcp.agent [None req-d31dc310-df46-4ca4-a98c-03f2a7a6da98 - - - - - -] DHCP configuration for ports {'82cd7cf2-21f9-4a3e-b3cf-743f79ff64b7'} is completed
Nov 28 10:07:20 np0005538513.localdomain podman[326648]: 2025-11-28 10:07:20.116628685 +0000 UTC m=+0.147273419 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 28 10:07:20 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:07:20 np0005538513.localdomain dnsmasq[326657]: read /var/lib/neutron/dhcp/e9912372-cec2-427d-b398-8b7ba1d00441/addn_hosts - 1 addresses
Nov 28 10:07:20 np0005538513.localdomain podman[326688]: 2025-11-28 10:07:20.258777674 +0000 UTC m=+0.062634550 container kill cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e192 do_prune osdmap full prune enabled
Nov 28 10:07:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e193 e193: 6 total, 6 up, 6 in
Nov 28 10:07:20 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e193: 6 total, 6 up, 6 in
Nov 28 10:07:20 np0005538513.localdomain ceph-mon[292954]: osdmap e192: 6 total, 6 up, 6 in
Nov 28 10:07:20 np0005538513.localdomain ceph-mon[292954]: osdmap e193: 6 total, 6 up, 6 in
Nov 28 10:07:20 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:20.575 261084 INFO neutron.agent.dhcp.agent [None req-65c014d6-57ba-49e0-9664-c40a5c4d2c1c - - - - - -] DHCP configuration for ports {'d2f8dc65-4856-423e-a9bf-07311833bbd3'} is completed
Nov 28 10:07:20 np0005538513.localdomain dnsmasq[326657]: exiting on receipt of SIGTERM
Nov 28 10:07:20 np0005538513.localdomain podman[326728]: 2025-11-28 10:07:20.717735326 +0000 UTC m=+0.060512315 container kill cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:20 np0005538513.localdomain systemd[1]: libpod-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0.scope: Deactivated successfully.
Nov 28 10:07:20 np0005538513.localdomain podman[326741]: 2025-11-28 10:07:20.785246506 +0000 UTC m=+0.055671715 container died cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:07:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:20.858 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:20 np0005538513.localdomain podman[326741]: 2025-11-28 10:07:20.870849554 +0000 UTC m=+0.141274783 container cleanup cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:07:20 np0005538513.localdomain systemd[1]: libpod-conmon-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0.scope: Deactivated successfully.
Nov 28 10:07:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-7c04ff6f7b4fcac360df3aa4ab5727cea04640414569d0565dad260384a9ac12-merged.mount: Deactivated successfully.
Nov 28 10:07:20 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:20 np0005538513.localdomain podman[326748]: 2025-11-28 10:07:20.900974143 +0000 UTC m=+0.156399170 container remove cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:20.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:20Z|00437|binding|INFO|Releasing lport e7ad6507-8cb9-4c54-9fde-23c7028d341d from this chassis (sb_readonly=0)
Nov 28 10:07:20 np0005538513.localdomain kernel: device tape7ad6507-8c left promiscuous mode
Nov 28 10:07:20 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:20Z|00438|binding|INFO|Setting lport e7ad6507-8cb9-4c54-9fde-23c7028d341d down in Southbound
Nov 28 10:07:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:20.928 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe83:d5c4/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e17820c-609d-4556-ae57-25b66af88ca4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=e7ad6507-8cb9-4c54-9fde-23c7028d341d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:20.930 158130 INFO neutron.agent.ovn.metadata.agent [-] Port e7ad6507-8cb9-4c54-9fde-23c7028d341d in datapath e9912372-cec2-427d-b398-8b7ba1d00441 unbound from our chassis
Nov 28 10:07:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:20.932 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9912372-cec2-427d-b398-8b7ba1d00441, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:20.933 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:20 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:20.933 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f85f68d9-2111-4e02-98f9-00ba59b7151b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:21.027 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:21 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:21.377 2 INFO neutron.agent.securitygroups_rpc [None req-fe8ed995-ee4b-4312-80c7-fb60647feb81 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['235f4ca9-4e7e-483e-ba22-a609f7751fe8']
Nov 28 10:07:21 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2de9912372\x2dcec2\x2d427d\x2db398\x2d8b7ba1d00441.mount: Deactivated successfully.
Nov 28 10:07:21 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:21.560 261084 INFO neutron.agent.dhcp.agent [None req-cc845ac2-af30-45d2-8b35-e5e6b189e3e7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e193 do_prune osdmap full prune enabled
Nov 28 10:07:21 np0005538513.localdomain ceph-mon[292954]: pgmap v380: 177 pgs: 177 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 36 KiB/s wr, 218 op/s
Nov 28 10:07:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "format": "json"}]: dispatch
Nov 28 10:07:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e194 e194: 6 total, 6 up, 6 in
Nov 28 10:07:21 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e194: 6 total, 6 up, 6 in
Nov 28 10:07:21 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:21Z|00439|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:07:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:21.729 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:22 np0005538513.localdomain podman[326788]: 2025-11-28 10:07:22.553187072 +0000 UTC m=+0.067388028 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:07:22 np0005538513.localdomain dnsmasq[324234]: exiting on receipt of SIGTERM
Nov 28 10:07:22 np0005538513.localdomain systemd[1]: libpod-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952.scope: Deactivated successfully.
Nov 28 10:07:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e194 do_prune osdmap full prune enabled
Nov 28 10:07:22 np0005538513.localdomain ceph-mon[292954]: osdmap e194: 6 total, 6 up, 6 in
Nov 28 10:07:22 np0005538513.localdomain ceph-mon[292954]: pgmap v383: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 18 KiB/s wr, 108 op/s
Nov 28 10:07:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e195 e195: 6 total, 6 up, 6 in
Nov 28 10:07:22 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e195: 6 total, 6 up, 6 in
Nov 28 10:07:22 np0005538513.localdomain podman[326803]: 2025-11-28 10:07:22.649294973 +0000 UTC m=+0.076450296 container died c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:07:22 np0005538513.localdomain systemd[1]: tmp-crun.tmSlzw.mount: Deactivated successfully.
Nov 28 10:07:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:22Z|00440|binding|INFO|Removing iface tap1bff36cc-f5 ovn-installed in OVS
Nov 28 10:07:22 np0005538513.localdomain podman[326803]: 2025-11-28 10:07:22.749982066 +0000 UTC m=+0.177137339 container cleanup c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:07:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:22.751 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d4b5658d-b7d6-4f11-9507-550379ce2d7c with type ""
Nov 28 10:07:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:22.752 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29d0d5b3ba0745d58aee3845ea704b73', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0295c06-e7c1-42d0-9d25-c6c6ebd15e16, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=1bff36cc-f508-4066-a5d7-c55bc5baf4a9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:22.754 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 in datapath 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 unbound from our chassis
Nov 28 10:07:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:22Z|00441|binding|INFO|Removing lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 ovn-installed in OVS
Nov 28 10:07:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:22.757 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:22.758 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0789e78a-bf16-42fd-acbc-847949501156]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:22 np0005538513.localdomain systemd[1]: libpod-conmon-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952.scope: Deactivated successfully.
Nov 28 10:07:22 np0005538513.localdomain podman[326804]: 2025-11-28 10:07:22.781730554 +0000 UTC m=+0.205308747 container remove c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:22.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:22.807 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:22 np0005538513.localdomain kernel: device tap1bff36cc-f5 left promiscuous mode
Nov 28 10:07:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:22.819 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:22.841 261084 INFO neutron.agent.dhcp.agent [None req-5eb5fd94-8165-46ea-b681-aa8e86340968 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:23 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:23.033 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-518cfa22c4e318530a6e5aefab2c4e20bcb90abb837fcbd18b8f75b2b31294f3-merged.mount: Deactivated successfully.
Nov 28 10:07:23 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:23 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d4dc5b71e\x2d287e\x2d4ec6\x2db6b7\x2d4d131e85d551.mount: Deactivated successfully.
Nov 28 10:07:23 np0005538513.localdomain ceph-mon[292954]: osdmap e195: 6 total, 6 up, 6 in
Nov 28 10:07:23 np0005538513.localdomain sudo[326831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:07:23 np0005538513.localdomain sudo[326831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:07:23 np0005538513.localdomain sudo[326831]: pam_unix(sudo:session): session closed for user root
Nov 28 10:07:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:23Z|00442|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:07:23 np0005538513.localdomain sudo[326849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:07:24 np0005538513.localdomain sudo[326849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:07:24 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:24.027 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:24 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:24.085 2 INFO neutron.agent.securitygroups_rpc [None req-b1e53ab7-c922-4511-9eea-36891b374394 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['ef6c27ab-7008-4940-88ab-f495a3348997', '235f4ca9-4e7e-483e-ba22-a609f7751fe8', 'ad28b9ca-0164-4a23-9923-7d61ac565e84']
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e195 do_prune osdmap full prune enabled
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: pgmap v385: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 17 KiB/s wr, 100 op/s
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "format": "json"}]: dispatch
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 e196: 6 total, 6 up, 6 in
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e196: 6 total, 6 up, 6 in
Nov 28 10:07:24 np0005538513.localdomain sudo[326849]: pam_unix(sudo:session): session closed for user root
Nov 28 10:07:24 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:24.749 2 INFO neutron.agent.securitygroups_rpc [None req-ded8a162-fe49-472e-9107-11e07cb8573a e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['ef6c27ab-7008-4940-88ab-f495a3348997', 'ad28b9ca-0164-4a23-9923-7d61ac565e84']
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:07:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:07:24 np0005538513.localdomain sudo[326898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:07:24 np0005538513.localdomain sudo[326898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:07:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:07:24 np0005538513.localdomain sudo[326898]: pam_unix(sudo:session): session closed for user root
Nov 28 10:07:25 np0005538513.localdomain systemd[1]: tmp-crun.WHYdv1.mount: Deactivated successfully.
Nov 28 10:07:25 np0005538513.localdomain podman[326916]: 2025-11-28 10:07:25.080261327 +0000 UTC m=+0.111039052 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:07:25 np0005538513.localdomain podman[326916]: 2025-11-28 10:07:25.087893512 +0000 UTC m=+0.118671227 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:07:25 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:25 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:25.418 2 INFO neutron.agent.securitygroups_rpc [None req-84883400-a0bd-45dd-a8ae-3bc8b417b162 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['acf02bd6-8fdb-4bdf-b655-c11d3c48057a']
Nov 28 10:07:25 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:25.467 2 INFO neutron.agent.securitygroups_rpc [None req-0fb934e7-9ad4-4c2e-8ef8-4b9c21b34e7a 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:25.550 261084 INFO neutron.agent.linux.ip_lib [None req-d39bfd49-8596-4874-b6a1-9865d4404117 - - - - - -] Device tapce85abaa-55 cannot be used as it has no MAC address
Nov 28 10:07:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:25.575 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:25 np0005538513.localdomain kernel: device tapce85abaa-55 entered promiscuous mode
Nov 28 10:07:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:25.584 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:25 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324445.5858] manager: (tapce85abaa-55): new Generic device (/org/freedesktop/NetworkManager/Devices/71)
Nov 28 10:07:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:25Z|00443|binding|INFO|Claiming lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a for this chassis.
Nov 28 10:07:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:25Z|00444|binding|INFO|ce85abaa-55ed-4384-873f-4fd7f3eb0d9a: Claiming unknown
Nov 28 10:07:25 np0005538513.localdomain systemd-udevd[326949]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:07:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:25.603 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea7bdd98a9904e47a2dfed5bcc54bc4a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e24bc530-a0d1-4a44-8a84-effdb241e447, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=ce85abaa-55ed-4384-873f-4fd7f3eb0d9a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:25.606 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ce85abaa-55ed-4384-873f-4fd7f3eb0d9a in datapath 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4 bound to our chassis
Nov 28 10:07:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:25.609 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port a82ab757-b45b-4425-bce5-f385f9345b1a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:07:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:25.609 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:25 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:25.610 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ec241232-189e-4039-81cd-1dab12410f05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:25Z|00445|binding|INFO|Setting lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a ovn-installed in OVS
Nov 28 10:07:25 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:25Z|00446|binding|INFO|Setting lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a up in Southbound
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:25.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapce85abaa-55: No such device
Nov 28 10:07:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:25.657 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: osdmap e196: 6 total, 6 up, 6 in
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:07:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:25.688 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:25.859 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:07:25 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:07:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:26.031 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:26 np0005538513.localdomain podman[327020]: 
Nov 28 10:07:26 np0005538513.localdomain podman[327020]: 2025-11-28 10:07:26.611176958 +0000 UTC m=+0.094625687 container create 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:26 np0005538513.localdomain podman[327020]: 2025-11-28 10:07:26.564653254 +0000 UTC m=+0.048102003 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:26 np0005538513.localdomain systemd[1]: Started libpod-conmon-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d.scope.
Nov 28 10:07:26 np0005538513.localdomain systemd[1]: tmp-crun.UnMZ5k.mount: Deactivated successfully.
Nov 28 10:07:26 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:26 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82a10448f7eafab39fdb6ae375b218bc52faf1c65a99161c7b78ef796fa5b5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:26 np0005538513.localdomain podman[327020]: 2025-11-28 10:07:26.742518065 +0000 UTC m=+0.225966784 container init 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 10:07:26 np0005538513.localdomain podman[327020]: 2025-11-28 10:07:26.752382429 +0000 UTC m=+0.235831158 container start 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:26 np0005538513.localdomain dnsmasq[327038]: started, version 2.85 cachesize 150
Nov 28 10:07:26 np0005538513.localdomain dnsmasq[327038]: DNS service limited to local subnets
Nov 28 10:07:26 np0005538513.localdomain dnsmasq[327038]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:26 np0005538513.localdomain dnsmasq[327038]: warning: no upstream servers configured
Nov 28 10:07:26 np0005538513.localdomain dnsmasq-dhcp[327038]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:07:26 np0005538513.localdomain dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 0 addresses
Nov 28 10:07:26 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host
Nov 28 10:07:26 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: pgmap v387: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 14 KiB/s wr, 80 op/s
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:26 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:27.011 261084 INFO neutron.agent.dhcp.agent [None req-bedd59c9-509d-419b-8106-7330bf2f0acf - - - - - -] DHCP configuration for ports {'1a03bcf8-8713-4678-8570-227cfd5a5392'} is completed
Nov 28 10:07:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:07:27 np0005538513.localdomain podman[327039]: 2025-11-28 10:07:27.138493006 +0000 UTC m=+0.094283825 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:27 np0005538513.localdomain podman[327039]: 2025-11-28 10:07:27.167621904 +0000 UTC m=+0.123412693 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:07:27 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:07:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:27.300 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:26Z, description=, device_id=b1aacfa1-fecc-4e03-9799-90b5b92b4c0a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64d0ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64d0d00>], id=042bddfe-2b7c-46de-b29c-e6d7c7393875, ip_allocation=immediate, mac_address=fa:16:3e:3d:6c:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:18Z, description=, dns_domain=, id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--616540833, port_security_enabled=True, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53568, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2487, status=ACTIVE, subnets=['bf8de819-4f5d-4834-9015-5caa46badb1e'], tags=[], tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:22Z, vlan_transparent=None, network_id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, port_security_enabled=False, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2527, status=DOWN, tags=[], tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:26Z on network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4
Nov 28 10:07:27 np0005538513.localdomain dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 1 addresses
Nov 28 10:07:27 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host
Nov 28 10:07:27 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts
Nov 28 10:07:27 np0005538513.localdomain podman[327077]: 2025-11-28 10:07:27.542592978 +0000 UTC m=+0.062464377 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:07:27 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:27.766 261084 INFO neutron.agent.dhcp.agent [None req-13302878-b347-41b2-b109-a81db60f8daa - - - - - -] DHCP configuration for ports {'042bddfe-2b7c-46de-b29c-e6d7c7393875'} is completed
Nov 28 10:07:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "format": "json"}]: dispatch
Nov 28 10:07:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:28 np0005538513.localdomain ceph-mon[292954]: pgmap v388: 177 pgs: 177 active+clean; 213 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 651 KiB/s rd, 4.0 MiB/s wr, 230 op/s
Nov 28 10:07:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:07:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:07:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:07:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:07:29 np0005538513.localdomain podman[327100]: 2025-11-28 10:07:29.861849599 +0000 UTC m=+0.092027396 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:07:29 np0005538513.localdomain podman[327101]: 2025-11-28 10:07:29.912533562 +0000 UTC m=+0.137343933 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 10:07:29 np0005538513.localdomain podman[327100]: 2025-11-28 10:07:29.932317431 +0000 UTC m=+0.162495188 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:07:29 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:07:29 np0005538513.localdomain podman[327101]: 2025-11-28 10:07:29.946352834 +0000 UTC m=+0.171163175 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:29 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:07:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:30 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:30.293 2 INFO neutron.agent.securitygroups_rpc [None req-bb1d0f4f-4080-47c2-b71c-a5aaec3a62e2 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:07:30 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:30.314 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:26Z, description=, device_id=b1aacfa1-fecc-4e03-9799-90b5b92b4c0a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6416fa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6416bb0>], id=042bddfe-2b7c-46de-b29c-e6d7c7393875, ip_allocation=immediate, mac_address=fa:16:3e:3d:6c:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:18Z, description=, dns_domain=, id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--616540833, port_security_enabled=True, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53568, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2487, status=ACTIVE, subnets=['bf8de819-4f5d-4834-9015-5caa46badb1e'], tags=[], tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:22Z, vlan_transparent=None, network_id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, port_security_enabled=False, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2527, status=DOWN, tags=[], tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:26Z on network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4
Nov 28 10:07:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 do_prune osdmap full prune enabled
Nov 28 10:07:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e197 e197: 6 total, 6 up, 6 in
Nov 28 10:07:30 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e197: 6 total, 6 up, 6 in
Nov 28 10:07:30 np0005538513.localdomain dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 1 addresses
Nov 28 10:07:30 np0005538513.localdomain podman[327158]: 2025-11-28 10:07:30.565095678 +0000 UTC m=+0.048387771 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:07:30 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host
Nov 28 10:07:30 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts
Nov 28 10:07:30 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:30.833 261084 INFO neutron.agent.dhcp.agent [None req-1b50611d-c2b0-4e83-b826-b82babeb1694 - - - - - -] DHCP configuration for ports {'042bddfe-2b7c-46de-b29c-e6d7c7393875'} is completed
Nov 28 10:07:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:30.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:31.034 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:31.389 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:31.391 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated
Nov 28 10:07:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:31.394 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:31 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:31.395 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bed56a50-2b2f-4f8f-bdf2-c06e8e8aea0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:31 np0005538513.localdomain ceph-mon[292954]: pgmap v389: 177 pgs: 177 active+clean; 213 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 466 KiB/s rd, 3.1 MiB/s wr, 124 op/s
Nov 28 10:07:31 np0005538513.localdomain ceph-mon[292954]: osdmap e197: 6 total, 6 up, 6 in
Nov 28 10:07:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98", "format": "json"}]: dispatch
Nov 28 10:07:32 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:32.625 2 INFO neutron.agent.securitygroups_rpc [None req-b1e499b5-5b30-44cb-89ef-2d90dabf973f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['b5d46958-1542-44c0-a82a-37e69acb7089', 'acf02bd6-8fdb-4bdf-b655-c11d3c48057a']
Nov 28 10:07:32 np0005538513.localdomain dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 0 addresses
Nov 28 10:07:32 np0005538513.localdomain podman[327197]: 2025-11-28 10:07:32.700105033 +0000 UTC m=+0.072608098 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:07:32 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host
Nov 28 10:07:32 np0005538513.localdomain dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts
Nov 28 10:07:32 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:32.769 261084 INFO neutron.agent.linux.ip_lib [None req-b63dd2b8-eda8-420d-866d-e5e5e70c08ce - - - - - -] Device tap5d322373-6d cannot be used as it has no MAC address
Nov 28 10:07:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:32.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:32 np0005538513.localdomain kernel: device tap5d322373-6d entered promiscuous mode
Nov 28 10:07:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:32.849 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:32Z|00447|binding|INFO|Claiming lport 5d322373-6d84-4420-9452-6cf70afb5d0c for this chassis.
Nov 28 10:07:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:32Z|00448|binding|INFO|5d322373-6d84-4420-9452-6cf70afb5d0c: Claiming unknown
Nov 28 10:07:32 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324452.8527] manager: (tap5d322373-6d): new Generic device (/org/freedesktop/NetworkManager/Devices/72)
Nov 28 10:07:32 np0005538513.localdomain systemd-udevd[327227]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.859 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe37:d8d9/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=949040a4-3a48-4918-a792-56947d4e0e1e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=5d322373-6d84-4420-9452-6cf70afb5d0c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.861 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 5d322373-6d84-4420-9452-6cf70afb5d0c in datapath daa663db-7797-4de5-a3b0-b0197a7ec3d6 bound to our chassis
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.863 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network daa663db-7797-4de5-a3b0-b0197a7ec3d6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.864 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac2a2d2-5f44-4441-a46a-1bbb76650433]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:32Z|00449|binding|INFO|Setting lport 5d322373-6d84-4420-9452-6cf70afb5d0c ovn-installed in OVS
Nov 28 10:07:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:32Z|00450|binding|INFO|Setting lport 5d322373-6d84-4420-9452-6cf70afb5d0c up in Southbound
Nov 28 10:07:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:32.887 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap5d322373-6d: No such device
Nov 28 10:07:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:32.940 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:32Z|00451|binding|INFO|Releasing lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a from this chassis (sb_readonly=0)
Nov 28 10:07:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:32.960 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:32 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:32Z|00452|binding|INFO|Setting lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a down in Southbound
Nov 28 10:07:32 np0005538513.localdomain kernel: device tapce85abaa-55 left promiscuous mode
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.968 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea7bdd98a9904e47a2dfed5bcc54bc4a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e24bc530-a0d1-4a44-8a84-effdb241e447, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=ce85abaa-55ed-4384-873f-4fd7f3eb0d9a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.970 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ce85abaa-55ed-4384-873f-4fd7f3eb0d9a in datapath 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4 unbound from our chassis
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.973 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:32 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:32.974 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ec5c04-8b84-4678-ae20-ad8c963935ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:32.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:32.985 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:33 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:33.592 2 INFO neutron.agent.securitygroups_rpc [None req-7e9cce24-3864-452e-838f-0b8e85be3343 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['b5d46958-1542-44c0-a82a-37e69acb7089']
Nov 28 10:07:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:34Z|00453|binding|INFO|Removing iface tap5d322373-6d ovn-installed in OVS
Nov 28 10:07:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:34Z|00454|binding|INFO|Removing lport 5d322373-6d84-4420-9452-6cf70afb5d0c ovn-installed in OVS
Nov 28 10:07:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:34.744 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 16d3b12c-1081-4d8a-95a4-ea996c678275 with type ""
Nov 28 10:07:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:34.746 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=949040a4-3a48-4918-a792-56947d4e0e1e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=5d322373-6d84-4420-9452-6cf70afb5d0c) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:34.747 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 5d322373-6d84-4420-9452-6cf70afb5d0c in datapath daa663db-7797-4de5-a3b0-b0197a7ec3d6 unbound from our chassis
Nov 28 10:07:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:34.749 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network daa663db-7797-4de5-a3b0-b0197a7ec3d6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:07:34 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:34.775 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8f28470d-dbe0-4fba-8aaf-a80cc68a64dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:34.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:34Z|00455|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:07:34 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:34.899 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:35.891 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:36.398 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:36 np0005538513.localdomain ceph-mon[292954]: pgmap v391: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 546 KiB/s rd, 3.2 MiB/s wr, 172 op/s
Nov 28 10:07:36 np0005538513.localdomain podman[327299]: 
Nov 28 10:07:36 np0005538513.localdomain podman[327299]: 2025-11-28 10:07:36.531008592 +0000 UTC m=+0.075518128 container create 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:36 np0005538513.localdomain systemd[1]: Started libpod-conmon-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf.scope.
Nov 28 10:07:36 np0005538513.localdomain systemd[1]: tmp-crun.mvCUfp.mount: Deactivated successfully.
Nov 28 10:07:36 np0005538513.localdomain podman[327299]: 2025-11-28 10:07:36.491257148 +0000 UTC m=+0.035766714 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:36 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:36 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/127f1433ad0e975c7d90441f253c42229b39618ab4ce9e24e298473f69915576/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:36 np0005538513.localdomain podman[327299]: 2025-11-28 10:07:36.609553762 +0000 UTC m=+0.154063308 container init 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:07:36 np0005538513.localdomain podman[327299]: 2025-11-28 10:07:36.622001896 +0000 UTC m=+0.166511432 container start 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:36 np0005538513.localdomain dnsmasq[327317]: started, version 2.85 cachesize 150
Nov 28 10:07:36 np0005538513.localdomain dnsmasq[327317]: DNS service limited to local subnets
Nov 28 10:07:36 np0005538513.localdomain dnsmasq[327317]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:36 np0005538513.localdomain dnsmasq[327317]: warning: no upstream servers configured
Nov 28 10:07:36 np0005538513.localdomain dnsmasq[327317]: read /var/lib/neutron/dhcp/daa663db-7797-4de5-a3b0-b0197a7ec3d6/addn_hosts - 0 addresses
Nov 28 10:07:36 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:36.737 261084 INFO neutron.agent.dhcp.agent [None req-d95b2dfb-3a6b-473b-bb81-b381767e15ed - - - - - -] DHCP configuration for ports {'2795b1ae-64ed-4052-8e9d-7d6b8bc4f852'} is completed
Nov 28 10:07:36 np0005538513.localdomain dnsmasq[327317]: exiting on receipt of SIGTERM
Nov 28 10:07:36 np0005538513.localdomain systemd[1]: libpod-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf.scope: Deactivated successfully.
Nov 28 10:07:36 np0005538513.localdomain podman[327335]: 2025-11-28 10:07:36.849424353 +0000 UTC m=+0.071096342 container kill 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:07:36 np0005538513.localdomain podman[327347]: 2025-11-28 10:07:36.911894267 +0000 UTC m=+0.054489859 container died 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:36 np0005538513.localdomain podman[327347]: 2025-11-28 10:07:36.943068778 +0000 UTC m=+0.085664370 container cleanup 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:36 np0005538513.localdomain systemd[1]: libpod-conmon-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf.scope: Deactivated successfully.
Nov 28 10:07:36 np0005538513.localdomain podman[327354]: 2025-11-28 10:07:36.962251799 +0000 UTC m=+0.088483248 container remove 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:07:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:37.006 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:37 np0005538513.localdomain kernel: device tap5d322373-6d left promiscuous mode
Nov 28 10:07:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:37.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:37.080 261084 INFO neutron.agent.dhcp.agent [None req-3ba41adf-3747-42f4-8d5a-9aecc82b2488 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:37.080 261084 INFO neutron.agent.dhcp.agent [None req-3ba41adf-3747-42f4-8d5a-9aecc82b2488 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.091 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.093 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.096 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.097 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc38dfe-92c9-4160-b4bd-4a4e7e246c07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:37 np0005538513.localdomain ceph-mon[292954]: pgmap v392: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 478 KiB/s rd, 2.8 MiB/s wr, 151 op/s
Nov 28 10:07:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98_18a37cfd-0766-4bf4-885c-08ab764ad956", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:37 np0005538513.localdomain ceph-mon[292954]: pgmap v393: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 437 KiB/s rd, 2.6 MiB/s wr, 138 op/s
Nov 28 10:07:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1980575955' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-127f1433ad0e975c7d90441f253c42229b39618ab4ce9e24e298473f69915576-merged.mount: Deactivated successfully.
Nov 28 10:07:37 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:37 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2ddaa663db\x2d7797\x2d4de5\x2da3b0\x2db0197a7ec3d6.mount: Deactivated successfully.
Nov 28 10:07:37 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:37.733 261084 INFO neutron.agent.linux.ip_lib [None req-c525de8f-f56d-43be-a185-942dd3110014 - - - - - -] Device tap00e4c315-3e cannot be used as it has no MAC address
Nov 28 10:07:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:37.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:37 np0005538513.localdomain kernel: device tap00e4c315-3e entered promiscuous mode
Nov 28 10:07:37 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324457.7629] manager: (tap00e4c315-3e): new Generic device (/org/freedesktop/NetworkManager/Devices/73)
Nov 28 10:07:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:37.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:37 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:37Z|00456|binding|INFO|Claiming lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 for this chassis.
Nov 28 10:07:37 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:37Z|00457|binding|INFO|00e4c315-3e1e-4938-965c-c4f68912eeb6: Claiming unknown
Nov 28 10:07:37 np0005538513.localdomain systemd-udevd[327387]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.780 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33dc3788-7d28-4537-aa7d-ffee00e0827e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=00e4c315-3e1e-4938-965c-c4f68912eeb6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.782 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 00e4c315-3e1e-4938-965c-c4f68912eeb6 in datapath ad4aa8e2-9c92-45f9-bbe9-94669f61eefc bound to our chassis
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.783 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ad4aa8e2-9c92-45f9-bbe9-94669f61eefc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:07:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:37.784 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6673fee1-4b97-4476-a6e2-d053176068de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:37 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:37Z|00458|binding|INFO|Setting lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 ovn-installed in OVS
Nov 28 10:07:37 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:37Z|00459|binding|INFO|Setting lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 up in Southbound
Nov 28 10:07:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:37.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:37.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:37.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e197 do_prune osdmap full prune enabled
Nov 28 10:07:38 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:38 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "format": "json"}]: dispatch
Nov 28 10:07:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e198 e198: 6 total, 6 up, 6 in
Nov 28 10:07:38 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e198: 6 total, 6 up, 6 in
Nov 28 10:07:38 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:38Z|00460|binding|INFO|Removing iface tap00e4c315-3e ovn-installed in OVS
Nov 28 10:07:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:38.660 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 671b42da-83dc-42d9-a64b-2b08afdb0874 with type ""
Nov 28 10:07:38 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:38Z|00461|binding|INFO|Removing lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 ovn-installed in OVS
Nov 28 10:07:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:38.700 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:38.701 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33dc3788-7d28-4537-aa7d-ffee00e0827e, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=00e4c315-3e1e-4938-965c-c4f68912eeb6) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:38 np0005538513.localdomain podman[327442]: 
Nov 28 10:07:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:38.704 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 00e4c315-3e1e-4938-965c-c4f68912eeb6 in datapath ad4aa8e2-9c92-45f9-bbe9-94669f61eefc unbound from our chassis
Nov 28 10:07:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:38.705 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ad4aa8e2-9c92-45f9-bbe9-94669f61eefc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:07:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:38.706 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[74cc4119-f1d8-4d2c-98b9-fe44647cae50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:38 np0005538513.localdomain podman[327442]: 2025-11-28 10:07:38.716962396 +0000 UTC m=+0.131464202 container create 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:38 np0005538513.localdomain podman[327442]: 2025-11-28 10:07:38.636567469 +0000 UTC m=+0.051069295 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:38 np0005538513.localdomain systemd[1]: Started libpod-conmon-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066.scope.
Nov 28 10:07:38 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:38 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cea78f4777bc89c99ec85a41246e19eead3cfa72d20170a383cd0e27dbabb0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:38 np0005538513.localdomain podman[327442]: 2025-11-28 10:07:38.79174214 +0000 UTC m=+0.206243936 container init 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:07:38 np0005538513.localdomain podman[327442]: 2025-11-28 10:07:38.800738807 +0000 UTC m=+0.215240603 container start 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:07:38 np0005538513.localdomain dnsmasq[327460]: started, version 2.85 cachesize 150
Nov 28 10:07:38 np0005538513.localdomain dnsmasq[327460]: DNS service limited to local subnets
Nov 28 10:07:38 np0005538513.localdomain dnsmasq[327460]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:38 np0005538513.localdomain dnsmasq[327460]: warning: no upstream servers configured
Nov 28 10:07:38 np0005538513.localdomain dnsmasq-dhcp[327460]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:07:38 np0005538513.localdomain dnsmasq[327460]: read /var/lib/neutron/dhcp/ad4aa8e2-9c92-45f9-bbe9-94669f61eefc/addn_hosts - 0 addresses
Nov 28 10:07:38 np0005538513.localdomain dnsmasq-dhcp[327460]: read /var/lib/neutron/dhcp/ad4aa8e2-9c92-45f9-bbe9-94669f61eefc/host
Nov 28 10:07:38 np0005538513.localdomain dnsmasq-dhcp[327460]: read /var/lib/neutron/dhcp/ad4aa8e2-9c92-45f9-bbe9-94669f61eefc/opts
Nov 28 10:07:38 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:07:38Z|00462|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:07:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:38.886 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:38.963 261084 INFO neutron.agent.dhcp.agent [None req-9c0aa46a-b06b-4fae-a46e-f35377c92106 - - - - - -] DHCP configuration for ports {'41918217-ad17-401a-852c-a666f673df3f'} is completed
Nov 28 10:07:39 np0005538513.localdomain dnsmasq[327460]: exiting on receipt of SIGTERM
Nov 28 10:07:39 np0005538513.localdomain podman[327478]: 2025-11-28 10:07:39.035732308 +0000 UTC m=+0.048916888 container kill 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:07:39 np0005538513.localdomain systemd[1]: libpod-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066.scope: Deactivated successfully.
Nov 28 10:07:39 np0005538513.localdomain podman[327493]: 2025-11-28 10:07:39.105641142 +0000 UTC m=+0.052280921 container died 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:07:39 np0005538513.localdomain podman[327493]: 2025-11-28 10:07:39.150865895 +0000 UTC m=+0.097505634 container remove 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:07:39 np0005538513.localdomain systemd[1]: libpod-conmon-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066.scope: Deactivated successfully.
Nov 28 10:07:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:39.166 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:39 np0005538513.localdomain kernel: device tap00e4c315-3e left promiscuous mode
Nov 28 10:07:39 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:39.183 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:39 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:39.213 261084 INFO neutron.agent.dhcp.agent [None req-5126da88-cfac-4ecb-b04a-5153371e68f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:39 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:39.214 261084 INFO neutron.agent.dhcp.agent [None req-5126da88-cfac-4ecb-b04a-5153371e68f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e198 do_prune osdmap full prune enabled
Nov 28 10:07:39 np0005538513.localdomain ceph-mon[292954]: pgmap v394: 177 pgs: 177 active+clean; 225 MiB data, 1023 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 141 KiB/s wr, 42 op/s
Nov 28 10:07:39 np0005538513.localdomain ceph-mon[292954]: osdmap e198: 6 total, 6 up, 6 in
Nov 28 10:07:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1666464710' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e199 e199: 6 total, 6 up, 6 in
Nov 28 10:07:39 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e199: 6 total, 6 up, 6 in
Nov 28 10:07:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-1cea78f4777bc89c99ec85a41246e19eead3cfa72d20170a383cd0e27dbabb0f-merged.mount: Deactivated successfully.
Nov 28 10:07:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:39 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2dad4aa8e2\x2d9c92\x2d45f9\x2dbbe9\x2d94669f61eefc.mount: Deactivated successfully.
Nov 28 10:07:39 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:39.727 2 INFO neutron.agent.securitygroups_rpc [None req-c0e4f748-a4dd-449c-b793-430f30c9256f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['58a0f932-b9f0-4bad-a5cb-a3c8cba7c65b']
Nov 28 10:07:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:07:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:07:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:07:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:07:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:07:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19747 "" "Go-http-client/1.1"
Nov 28 10:07:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e199 do_prune osdmap full prune enabled
Nov 28 10:07:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d", "format": "json"}]: dispatch
Nov 28 10:07:40 np0005538513.localdomain ceph-mon[292954]: osdmap e199: 6 total, 6 up, 6 in
Nov 28 10:07:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e200 e200: 6 total, 6 up, 6 in
Nov 28 10:07:40 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e200: 6 total, 6 up, 6 in
Nov 28 10:07:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:07:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:07:40 np0005538513.localdomain podman[327520]: 2025-11-28 10:07:40.856672326 +0000 UTC m=+0.085364821 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:07:40 np0005538513.localdomain podman[327520]: 2025-11-28 10:07:40.869388618 +0000 UTC m=+0.098081113 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:07:40 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:07:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:40.894 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:40 np0005538513.localdomain podman[327521]: 2025-11-28 10:07:40.969383648 +0000 UTC m=+0.193567735 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 28 10:07:40 np0005538513.localdomain podman[327521]: 2025-11-28 10:07:40.980324926 +0000 UTC m=+0.204508973 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:07:40 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:07:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:41.446 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e200 do_prune osdmap full prune enabled
Nov 28 10:07:41 np0005538513.localdomain ceph-mon[292954]: pgmap v397: 177 pgs: 177 active+clean; 225 MiB data, 1023 MiB used, 41 GiB / 42 GiB avail; 1.6 KiB/s rd, 26 KiB/s wr, 4 op/s
Nov 28 10:07:41 np0005538513.localdomain ceph-mon[292954]: osdmap e200: 6 total, 6 up, 6 in
Nov 28 10:07:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e201 e201: 6 total, 6 up, 6 in
Nov 28 10:07:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e201: 6 total, 6 up, 6 in
Nov 28 10:07:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e201 do_prune osdmap full prune enabled
Nov 28 10:07:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6", "format": "json"}]: dispatch
Nov 28 10:07:42 np0005538513.localdomain ceph-mon[292954]: osdmap e201: 6 total, 6 up, 6 in
Nov 28 10:07:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e202 e202: 6 total, 6 up, 6 in
Nov 28 10:07:42 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e202: 6 total, 6 up, 6 in
Nov 28 10:07:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:42.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:43.483 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:43.485 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated
Nov 28 10:07:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:43.488 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:43 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:43.489 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2b736c-053b-484f-8a47-d373ec4aef9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:43 np0005538513.localdomain ceph-mon[292954]: pgmap v400: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 35 KiB/s wr, 184 op/s
Nov 28 10:07:43 np0005538513.localdomain ceph-mon[292954]: osdmap e202: 6 total, 6 up, 6 in
Nov 28 10:07:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:07:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1359194721' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:44 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:44.329 2 INFO neutron.agent.securitygroups_rpc [None req-cecabd37-7803-4c2d-a13a-d3905bbc0cfc 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['18fcd73e-4837-425f-bf44-9ed4ac2aa187', '58a0f932-b9f0-4bad-a5cb-a3c8cba7c65b', 'bec6547e-445f-4500-b371-6e2fc240d4db']
Nov 28 10:07:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e202 do_prune osdmap full prune enabled
Nov 28 10:07:44 np0005538513.localdomain ceph-mon[292954]: pgmap v402: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 32 KiB/s wr, 170 op/s
Nov 28 10:07:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d_c6ebf898-e639-41ae-92f7-1c129b5ee100", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1359194721' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e203 e203: 6 total, 6 up, 6 in
Nov 28 10:07:44 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e203: 6 total, 6 up, 6 in
Nov 28 10:07:44 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:44.742 2 INFO neutron.agent.securitygroups_rpc [None req-d99eee51-2169-45db-889c-fcddfc1e6db2 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:44.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:44.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e203 do_prune osdmap full prune enabled
Nov 28 10:07:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e204 e204: 6 total, 6 up, 6 in
Nov 28 10:07:45 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e204: 6 total, 6 up, 6 in
Nov 28 10:07:45 np0005538513.localdomain ceph-mon[292954]: osdmap e203: 6 total, 6 up, 6 in
Nov 28 10:07:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:45.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:45.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:07:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:45.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:46.121 2 INFO neutron.agent.securitygroups_rpc [None req-ff6bd390-74ec-4285-8021-b1b301c7b944 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['18fcd73e-4837-425f-bf44-9ed4ac2aa187', 'bec6547e-445f-4500-b371-6e2fc240d4db']
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e204 do_prune osdmap full prune enabled
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e205 e205: 6 total, 6 up, 6 in
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e205: 6 total, 6 up, 6 in
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:46.483 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: osdmap e204: 6 total, 6 up, 6 in
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: pgmap v405: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 32 KiB/s wr, 172 op/s
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d", "format": "json"}]: dispatch
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: osdmap e205: 6 total, 6 up, 6 in
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:46.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:46 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:46.872 2 INFO neutron.agent.securitygroups_rpc [None req-02f5cc80-e4fe-45b5-9f60-d631f95eb2c9 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:47.002 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:47 np0005538513.localdomain podman[327578]: 2025-11-28 10:07:47.157767749 +0000 UTC m=+0.064315382 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:47 np0005538513.localdomain dnsmasq[327038]: exiting on receipt of SIGTERM
Nov 28 10:07:47 np0005538513.localdomain systemd[1]: tmp-crun.mmSGk4.mount: Deactivated successfully.
Nov 28 10:07:47 np0005538513.localdomain systemd[1]: libpod-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d.scope: Deactivated successfully.
Nov 28 10:07:47 np0005538513.localdomain podman[327593]: 2025-11-28 10:07:47.213452935 +0000 UTC m=+0.040447397 container died 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:47 np0005538513.localdomain podman[327593]: 2025-11-28 10:07:47.238009562 +0000 UTC m=+0.065003994 container cleanup 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:47 np0005538513.localdomain systemd[1]: libpod-conmon-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d.scope: Deactivated successfully.
Nov 28 10:07:47 np0005538513.localdomain podman[327594]: 2025-11-28 10:07:47.298687192 +0000 UTC m=+0.121397333 container remove 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:07:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e205 do_prune osdmap full prune enabled
Nov 28 10:07:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e206 e206: 6 total, 6 up, 6 in
Nov 28 10:07:47 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e206: 6 total, 6 up, 6 in
Nov 28 10:07:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "format": "json"}]: dispatch
Nov 28 10:07:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:47.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:07:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:07:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:07:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:07:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:07:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:07:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f82a10448f7eafab39fdb6ae375b218bc52faf1c65a99161c7b78ef796fa5b5c-merged.mount: Deactivated successfully.
Nov 28 10:07:48 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.229 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.230 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.231 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:07:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:48.372 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.372 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:48.375 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:07:48 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:48.457 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:48 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d92e1c9c1\x2d574e\x2d40d0\x2d8b6a\x2dbd313cc5f7d4.mount: Deactivated successfully.
Nov 28 10:07:48 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:48.548 261084 INFO neutron.agent.dhcp.agent [None req-1464a885-bb69-4317-98d8-b05b7a31be10 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:07:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1635609151' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.680 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:07:48 np0005538513.localdomain ceph-mon[292954]: osdmap e206: 6 total, 6 up, 6 in
Nov 28 10:07:48 np0005538513.localdomain ceph-mon[292954]: pgmap v408: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 142 KiB/s rd, 30 KiB/s wr, 198 op/s
Nov 28 10:07:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1635609151' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.787 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.788 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.996 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.998 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11102MB free_disk=41.7000732421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:07:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.999 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:07:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:48.999 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:07:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:49.561 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:07:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:49.562 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:07:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:49.562 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:07:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:49.622 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:07:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e206 do_prune osdmap full prune enabled
Nov 28 10:07:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e207 e207: 6 total, 6 up, 6 in
Nov 28 10:07:49 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e207: 6 total, 6 up, 6 in
Nov 28 10:07:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1383712359' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1418648319' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:50.078 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:07:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:50.084 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:07:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:50.131 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:07:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:50.134 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:07:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:50.134 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: osdmap e207: 6 total, 6 up, 6 in
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: pgmap v410: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 135 KiB/s rd, 29 KiB/s wr, 188 op/s
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a", "format": "json"}]: dispatch
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1418648319' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/674541035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3345109558' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:07:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:07:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:07:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:07:50 np0005538513.localdomain podman[327665]: 2025-11-28 10:07:50.85241129 +0000 UTC m=+0.083982049 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 10:07:50 np0005538513.localdomain podman[327665]: 2025-11-28 10:07:50.890638558 +0000 UTC m=+0.122209377 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 10:07:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:50.898 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:50 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:07:51 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:07:51.392 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e207 do_prune osdmap full prune enabled
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e208 e208: 6 total, 6 up, 6 in
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e208: 6 total, 6 up, 6 in
Nov 28 10:07:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:51.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1126093872' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: osdmap e208: 6 total, 6 up, 6 in
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:07:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:52.135 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:52.136 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:07:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:52.136 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:07:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:52.266 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:07:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:52.266 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:07:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:52.267 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:07:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:52.267 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e208 do_prune osdmap full prune enabled
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e209 e209: 6 total, 6 up, 6 in
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e209: 6 total, 6 up, 6 in
Nov 28 10:07:52 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:52.454 2 INFO neutron.agent.securitygroups_rpc [None req-6326bf78-d29d-46c0-b3b7-72df824a50bd 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: pgmap v412: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 185 KiB/s rd, 41 KiB/s wr, 254 op/s
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2095041212' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:52 np0005538513.localdomain ceph-mon[292954]: osdmap e209: 6 total, 6 up, 6 in
Nov 28 10:07:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:53.317 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:07:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:53.506 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:07:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:53.507 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:07:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:53.507 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e209 do_prune osdmap full prune enabled
Nov 28 10:07:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e210 e210: 6 total, 6 up, 6 in
Nov 28 10:07:53 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e210: 6 total, 6 up, 6 in
Nov 28 10:07:54 np0005538513.localdomain ceph-mon[292954]: pgmap v414: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 17 KiB/s wr, 96 op/s
Nov 28 10:07:54 np0005538513.localdomain ceph-mon[292954]: osdmap e210: 6 total, 6 up, 6 in
Nov 28 10:07:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c", "format": "json"}]: dispatch
Nov 28 10:07:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e210 do_prune osdmap full prune enabled
Nov 28 10:07:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e211 e211: 6 total, 6 up, 6 in
Nov 28 10:07:54 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e211: 6 total, 6 up, 6 in
Nov 28 10:07:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:07:55 np0005538513.localdomain systemd[1]: tmp-crun.uFpSun.mount: Deactivated successfully.
Nov 28 10:07:55 np0005538513.localdomain podman[327686]: 2025-11-28 10:07:55.858634275 +0000 UTC m=+0.094828892 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:07:55 np0005538513.localdomain podman[327686]: 2025-11-28 10:07:55.86950569 +0000 UTC m=+0.105700277 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:07:55 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:07:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:55.900 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:55 np0005538513.localdomain ceph-mon[292954]: osdmap e211: 6 total, 6 up, 6 in
Nov 28 10:07:56 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:07:56.377 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:07:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e211 do_prune osdmap full prune enabled
Nov 28 10:07:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e212 e212: 6 total, 6 up, 6 in
Nov 28 10:07:56 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e212: 6 total, 6 up, 6 in
Nov 28 10:07:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:56.487 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:57 np0005538513.localdomain ceph-mon[292954]: pgmap v417: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 23 KiB/s wr, 131 op/s
Nov 28 10:07:57 np0005538513.localdomain ceph-mon[292954]: osdmap e212: 6 total, 6 up, 6 in
Nov 28 10:07:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:07:57 np0005538513.localdomain podman[327708]: 2025-11-28 10:07:57.838953884 +0000 UTC m=+0.077709275 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:07:57 np0005538513.localdomain podman[327708]: 2025-11-28 10:07:57.844801754 +0000 UTC m=+0.083557115 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 10:07:57 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:07:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:07:58.139 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2", "format": "json"}]: dispatch
Nov 28 10:07:59 np0005538513.localdomain ceph-mon[292954]: pgmap v419: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 11 KiB/s wr, 73 op/s
Nov 28 10:07:59 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:07:59.975 2 INFO neutron.agent.securitygroups_rpc [None req-db55947c-11ec-46cf-8b4d-c6d9fdfd5571 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['6089c18b-265b-455e-adb1-d3701c826867']
Nov 28 10:08:00 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:00.370 2 INFO neutron.agent.securitygroups_rpc [None req-c7648c00-3d9b-484d-a8fa-078b96af4727 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['6089c18b-265b-455e-adb1-d3701c826867']
Nov 28 10:08:00 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:00.423 2 INFO neutron.agent.securitygroups_rpc [req-b7f1a51c-049a-4a1b-9cd5-f5d4f2c3744c req-dd3110e0-51f5-4337-ac22-6a0998bcc00c 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group member updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.676 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.681 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be5790d3-fbfe-48cc-8616-ffeac053ccba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.677387', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee69e92-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '787a5a161a65cd00b8218feff9d59b51ac107d5a9df60b386c15c906906a5250'}]}, 'timestamp': '2025-11-28 10:08:00.682664', '_unique_id': '71bfa88aa0f142e29764fceb3f8f6d49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.685 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccb3c9c0-e991-485c-b2ff-2bbcb89520bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.685541', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee723da-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '771e76ad5d87a316ecb3f8b61a92dff0c8a7166e5c5525d03124c316424852de'}]}, 'timestamp': '2025-11-28 10:08:00.686008', '_unique_id': 'b49b5debc4c84e399eaab7ee4dd17810'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.688 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.688 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b4a8a4d-2cf1-4007-bc02-ddff0ae3a66a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.688187', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee78b36-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': 'c3868b7dd8f0e841828363811df492f5dc526a826494aa72bf96f56952e25d0e'}]}, 'timestamp': '2025-11-28 10:08:00.688652', '_unique_id': 'ab5a97db60cd44d39acc8234361cc98d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.690 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.690 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f08f13cc-08df-49ef-be0c-5f53a8efd67b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.690938', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee7f882-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '0ab2e221c63939ec07d33e759ba42c7f2f951e32c62d3f621e796aa8d78ff02f'}]}, 'timestamp': '2025-11-28 10:08:00.691450', '_unique_id': '582cdb3ce55e4123b80d0952d8f7deca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.693 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.709 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 17630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3a64570-a236-4ebc-81bc-64c7b3f33324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17630000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:08:00.693537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1eeae128-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.881761447, 'message_signature': '125ed3b80a3da8c73370b315701298d3209758b9de7572b8678afbb5e108b360'}]}, 'timestamp': '2025-11-28 10:08:00.710510', '_unique_id': 'e63312a722234c71a65f425ed9103f86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38ee98b8-6b68-42b6-bb8b-d840ec3e02c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.712825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eef800c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'a41986580a900725e88a143d9c97d14e7d62495c0b7358b3b4da8be4fbe87820'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.712825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eef9164-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'bd6f8dbcce7d43b411e6a17c07fdfd720ca743fc4cbe78c298d1ab3962a17c42'}]}, 'timestamp': '2025-11-28 10:08:00.741241', '_unique_id': '029d9993920f47a8aa0b0e0ac09e4ed9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.743 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.743 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0953e16-0969-4b67-9442-3e68d2f6795e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:08:00.743798', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1ef008f6-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.881761447, 'message_signature': '6c558c1acd8cf2f7b3446a12bc495b16c89f0ae9909d80bb4177efaf6c8e8430'}]}, 'timestamp': '2025-11-28 10:08:00.744287', '_unique_id': '4f294d41408c40609879741b010ea03b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ca549ff-a878-4b56-822f-966de80da965', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.746447', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef06f1c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '64240da5122c1c58245044c902c43f113b105cee5a4ae11ced3e614fa81e0a11'}]}, 'timestamp': '2025-11-28 10:08:00.746945', '_unique_id': '724d38e915e64d4f966e2592f8e32fd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a36f8b63-d922-4228-b49b-476e5cf4482f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.749224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef0db82-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'df838aa38e8f321a83d1fd209af32cd6c4f5bbaf17892da5be09357e0b7775c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.749224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef0eb86-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '15f7d9ab38fffc9147ac52d51348028671135000a72d0f3c40781d41dd9d2d96'}]}, 'timestamp': '2025-11-28 10:08:00.750093', '_unique_id': 'e393e062ee0d4d74bb3734c7f16ddfe6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.752 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.752 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.752 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8141fc43-9139-4010-bb34-7702ee0cc7a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.752330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef158c8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'f228323b9028861439b0528800f6491c68d6ca7e869932b67e3a937010b18d4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.752330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef16a34-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '3b19b85686d26e93874e902d78626a9307de860842ff8e0b811a8fe690a0e526'}]}, 'timestamp': '2025-11-28 10:08:00.753309', '_unique_id': 'a1c7a7e705da44bca88ee38b187cc83d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.755 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '401924f4-1ae6-466a-a99c-018474204d59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.755501', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef1d0dc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '2fdcf0a8413f734bc797cb378c5fc1ef98a35e6db56a6229c4163f28969ca670'}]}, 'timestamp': '2025-11-28 10:08:00.755967', '_unique_id': '55797a652864478cba8a2af1a8e6b2a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb274843-5e8f-49eb-a74a-0791caf49fa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.758095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef45924-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': 'c38e31573b2ffede32c2878e4230d9cc2d8d268f3f401f4ca558d1c475214542'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.758095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef46d60-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '304980e490d74051d8f5c4b73193f3f81ec85db623d3c9879543eb2d6816c41a'}]}, 'timestamp': '2025-11-28 10:08:00.773098', '_unique_id': '3e92882fe14748148827a550d94ccd8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53aa3748-353b-4ae3-af2b-b817259656c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.775914', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef4f0be-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '79988ea8a3cd6219c519d42624817219698af698c867530e1708048d80a33e98'}]}, 'timestamp': '2025-11-28 10:08:00.776449', '_unique_id': '687538df9687422a8c02838c18306e7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa3bb2aa-5c3f-48c6-babb-a32563c6d34b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.779598', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef58358-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '02b6cb7fc59985b52df167714c04368576bafb4dbab56a20db83533d588d0c9c'}]}, 'timestamp': '2025-11-28 10:08:00.780319', '_unique_id': 'b36df62bb2c1444da010952e6cfb8a13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd214fe69-39fd-4d48-88b0-376f85ea2644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.783346', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef61052-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': 'f4679ad9d36c3172ca4936811cee3bbe9b3b8c5c8891c576776cdcdd2c9a2ae1'}]}, 'timestamp': '2025-11-28 10:08:00.783807', '_unique_id': '04b0d05b2a584b21bb829fe204ac246b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27561f6f-04b8-46fe-801d-2a8e24e06bc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.785916', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef675c4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '10d1be8fb126e733cd7c75db48dfac2f2fb3a8946f6ec79009c7d2c108cda10e'}]}, 'timestamp': '2025-11-28 10:08:00.786402', '_unique_id': '2717a9fa624a4214a291c2f4b6676d39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.788 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95f27a3a-629a-44c2-a0f2-6e5dd8afc54a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.788663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef6e04a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '9290f4784dcb6d804a8c9f2c67eb55fa9fc216e47c8246f453bcab0a52c3eb20'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.788663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef6f17a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '0b3672baab00173fd5c62104fbb2d08681932d14e442b845842965ea9a6a63b2'}]}, 'timestamp': '2025-11-28 10:08:00.789539', '_unique_id': '2a499300f95943209f1c4a05635db35c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb162322-9b11-4c76-b7f9-f49573689d57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.791705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef75638-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '6729ba7d410ac70188a03499688cf866e3a6eae3b9645266fddaff5805d012e0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.791705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef76772-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '1033354e8dff52a8a3fe00b08fa1ac3f6b26d0b6f761ab33d22e025a52d1bd33'}]}, 'timestamp': '2025-11-28 10:08:00.792589', '_unique_id': '317724244dcd43c3bc8ca91a8f8aae43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4393f2c-e8ce-4752-b001-0dde6c4e707f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.795058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef7d9dc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'b5c86c3669ec3fa3aca6fb65b21a273ba1baf195a8410029dbcf79bd350ebc62'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.795058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef7e94a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '609fda4cdbe3ce932e2d91b1916018283d136ca8711f0a80cad3d73d5ee79a63'}]}, 'timestamp': '2025-11-28 10:08:00.795879', '_unique_id': 'af12df17a2a7425fbc0dd7213afb6816'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6e2c83c-2b13-410d-a748-bf5e41d70bef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.797943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef84b4c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '657003e21620bfd0acd1f1755d0e3be83947cbffa80276c9085ab3c15db6ca7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.797943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef85ad8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'aa9e2a670636c241fc3e30e56b6c39620d65265d036d4af6c21d909ef87028a5'}]}, 'timestamp': '2025-11-28 10:08:00.798786', '_unique_id': '9e458f9315204d0ebd943d005a26f86f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.800 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '964445b0-af43-4b97-95d3-3e1de9d334cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.800937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef8c40a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '597d133acb51312cabaf3d0e0a8d9f8ed43cd1ac3471d762105718cc5a35531e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.800937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef8d3b4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '618cc57e0f261d145e37b4c6c04b8817bb07e217bf59d2c320d0bc4d9c049691'}]}, 'timestamp': '2025-11-28 10:08:00.801878', '_unique_id': '185d7c7aad7340e3ba7cb6d56aa2089b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:08:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:08:00 np0005538513.localdomain podman[327727]: 2025-11-28 10:08:00.860099405 +0000 UTC m=+0.092218033 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:08:00 np0005538513.localdomain podman[327727]: 2025-11-28 10:08:00.874904521 +0000 UTC m=+0.107023139 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:08:00 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:08:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:00.902 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:00 np0005538513.localdomain systemd[1]: tmp-crun.TnU2rj.mount: Deactivated successfully.
Nov 28 10:08:00 np0005538513.localdomain podman[327728]: 2025-11-28 10:08:00.965136981 +0000 UTC m=+0.196958740 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:08:01 np0005538513.localdomain podman[327728]: 2025-11-28 10:08:01.005423213 +0000 UTC m=+0.237244972 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:08:01 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:08:01 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:08:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:01.204 2 INFO neutron.agent.securitygroups_rpc [None req-565e0ad3-78a4-4813-9446-c55fe79fb3b2 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d7b997f9-4b8e-48df-a7bc-cf1a88435b19']
Nov 28 10:08:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:01.334 2 INFO neutron.agent.securitygroups_rpc [None req-51f1f60e-f6ad-4b91-b7b5-2db567d0c6ed 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:01.395 2 INFO neutron.agent.securitygroups_rpc [None req-84e0009b-c65d-4325-9480-9689cb3fdfb2 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d7b997f9-4b8e-48df-a7bc-cf1a88435b19']
Nov 28 10:08:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e212 do_prune osdmap full prune enabled
Nov 28 10:08:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e213 e213: 6 total, 6 up, 6 in
Nov 28 10:08:01 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e213: 6 total, 6 up, 6 in
Nov 28 10:08:01 np0005538513.localdomain ceph-mon[292954]: pgmap v420: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 10 KiB/s wr, 66 op/s
Nov 28 10:08:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2546522305' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:01 np0005538513.localdomain ceph-mon[292954]: osdmap e213: 6 total, 6 up, 6 in
Nov 28 10:08:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:01.491 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:01.552 2 INFO neutron.agent.securitygroups_rpc [None req-f01fa5c9-de49-46e8-bc93-b89c3fed3f57 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:01 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:08:01.772 261084 INFO neutron.agent.linux.ip_lib [None req-ffde1999-4035-4756-98bf-8def416c45d7 - - - - - -] Device tapdfb15593-fe cannot be used as it has no MAC address
Nov 28 10:08:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:01.774 2 INFO neutron.agent.securitygroups_rpc [None req-05d81c06-d401-4205-8dff-14a86090a368 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:01.801 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:01 np0005538513.localdomain kernel: device tapdfb15593-fe entered promiscuous mode
Nov 28 10:08:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:01Z|00463|binding|INFO|Claiming lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 for this chassis.
Nov 28 10:08:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:01Z|00464|binding|INFO|dfb15593-fe1f-4aaf-af25-32ab66c1a780: Claiming unknown
Nov 28 10:08:01 np0005538513.localdomain systemd-udevd[327781]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:08:01 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324481.8139] manager: (tapdfb15593-fe): new Generic device (/org/freedesktop/NetworkManager/Devices/74)
Nov 28 10:08:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:01.830 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ce143270a4649669232b53b6a44e4ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ab6281c-5843-47df-b4b0-ac9f0fa87790, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=dfb15593-fe1f-4aaf-af25-32ab66c1a780) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:08:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:01.834 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dfb15593-fe1f-4aaf-af25-32ab66c1a780 in datapath 39526589-27d4-41ad-9aef-f63f534ecbf0 bound to our chassis
Nov 28 10:08:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:01.837 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 39526589-27d4-41ad-9aef-f63f534ecbf0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:08:01 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:01.838 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[23b7541e-bd13-4cc8-b102-47bcf4ee2e60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:01Z|00465|binding|INFO|Setting lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 ovn-installed in OVS
Nov 28 10:08:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:01Z|00466|binding|INFO|Setting lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 up in Southbound
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:01.851 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdfb15593-fe: No such device
Nov 28 10:08:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:01.892 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:01.926 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:01 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:01.939 2 INFO neutron.agent.securitygroups_rpc [None req-22968046-714b-4dc6-9b18-f55692eae2e8 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:02 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:02.109 2 INFO neutron.agent.securitygroups_rpc [None req-1c9f5803-20cb-4e69-804a-0617a068dfc7 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:02 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:02.253 2 INFO neutron.agent.securitygroups_rpc [None req-2bab2459-ad99-4514-94d3-affbaffcf884 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2_36a63ece-b219-4438-a9c8-be964316c461", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:02 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:02.782 2 INFO neutron.agent.securitygroups_rpc [None req-c50c53a2-fd90-43a9-8f31-8eb2d8fc3f23 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:02 np0005538513.localdomain podman[327852]: 
Nov 28 10:08:02 np0005538513.localdomain podman[327852]: 2025-11-28 10:08:02.838231476 +0000 UTC m=+0.094030958 container create 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:08:02 np0005538513.localdomain systemd[1]: Started libpod-conmon-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36.scope.
Nov 28 10:08:02 np0005538513.localdomain podman[327852]: 2025-11-28 10:08:02.794102976 +0000 UTC m=+0.049902478 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:08:02 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:08:02 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4341b7420827c25c7624eca3707ed25a6232f868ac4d13c1195965a03f452de4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:08:02 np0005538513.localdomain podman[327852]: 2025-11-28 10:08:02.918077436 +0000 UTC m=+0.173876928 container init 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:08:02 np0005538513.localdomain podman[327852]: 2025-11-28 10:08:02.92924013 +0000 UTC m=+0.185039612 container start 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:08:02 np0005538513.localdomain dnsmasq[327872]: started, version 2.85 cachesize 150
Nov 28 10:08:02 np0005538513.localdomain dnsmasq[327872]: DNS service limited to local subnets
Nov 28 10:08:02 np0005538513.localdomain dnsmasq[327872]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:08:02 np0005538513.localdomain dnsmasq[327872]: warning: no upstream servers configured
Nov 28 10:08:02 np0005538513.localdomain dnsmasq-dhcp[327872]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:08:02 np0005538513.localdomain dnsmasq[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/addn_hosts - 0 addresses
Nov 28 10:08:02 np0005538513.localdomain dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/host
Nov 28 10:08:02 np0005538513.localdomain dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/opts
Nov 28 10:08:02 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:02.982 2 INFO neutron.agent.securitygroups_rpc [None req-be6115b0-0729-4a7d-96a1-2edd3462b47c 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.025 2 INFO neutron.agent.securitygroups_rpc [None req-bd38caac-e12b-4351-8e6f-97a983a9711e 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:08:03.160 261084 INFO neutron.agent.dhcp.agent [None req-a7d3a5cf-ca51-413f-864d-c6bf6e800648 - - - - - -] DHCP configuration for ports {'5d2c2476-c1d0-4426-9803-949f4055c439'} is completed
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.179 2 INFO neutron.agent.securitygroups_rpc [None req-edb67cac-fd5d-4927-8f94-9c49e0c903d7 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.331 2 INFO neutron.agent.securitygroups_rpc [None req-20312780-c455-466a-b7ab-f675d7eabde1 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.422 2 INFO neutron.agent.securitygroups_rpc [None req-14a5f88b-dc6d-4539-ba50-a6898f3db0a1 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.476 2 INFO neutron.agent.securitygroups_rpc [None req-13772359-a1cd-4764-8f15-d95cc5fef900 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538513.localdomain ceph-mon[292954]: pgmap v422: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 29 KiB/s wr, 107 op/s
Nov 28 10:08:03 np0005538513.localdomain systemd[1]: tmp-crun.zoUCyR.mount: Deactivated successfully.
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.869 2 INFO neutron.agent.securitygroups_rpc [None req-6ebeaf50-19b9-453a-b8c6-1888da8279d8 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.889 2 INFO neutron.agent.securitygroups_rpc [None req-13ec4ccc-0950-4875-9a0e-1e0334716892 f56d2237e5b74576a33d9840c9346817 9ce143270a4649669232b53b6a44e4ba - - default default] Security group member updated ['f3e50b86-f5a6-4339-897f-e9e754c264f3']
Nov 28 10:08:03 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:03.892 2 INFO neutron.agent.securitygroups_rpc [None req-a9fad0b5-d55e-463a-8ffe-8066b04b35f2 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:03 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:08:03.948 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:08:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64071c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6407dc0>], id=0edceee0-9905-42aa-b928-f37bc201af3e, ip_allocation=immediate, mac_address=fa:16:3e:3c:35:7f, name=tempest-TagsExtTest-478727359, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:59Z, description=, dns_domain=, id=39526589-27d4-41ad-9aef-f63f534ecbf0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-1109491877, port_security_enabled=True, project_id=9ce143270a4649669232b53b6a44e4ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42885, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2638, status=ACTIVE, subnets=['9308db1a-3a7e-47d2-a1c6-556103f500ef'], tags=[], tenant_id=9ce143270a4649669232b53b6a44e4ba, updated_at=2025-11-28T10:08:00Z, vlan_transparent=None, network_id=39526589-27d4-41ad-9aef-f63f534ecbf0, port_security_enabled=True, project_id=9ce143270a4649669232b53b6a44e4ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f3e50b86-f5a6-4339-897f-e9e754c264f3'], standard_attr_id=2669, status=DOWN, tags=[], tenant_id=9ce143270a4649669232b53b6a44e4ba, updated_at=2025-11-28T10:08:03Z on network 39526589-27d4-41ad-9aef-f63f534ecbf0
Nov 28 10:08:04 np0005538513.localdomain systemd[1]: tmp-crun.Ffqw6h.mount: Deactivated successfully.
Nov 28 10:08:04 np0005538513.localdomain dnsmasq[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/addn_hosts - 1 addresses
Nov 28 10:08:04 np0005538513.localdomain dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/host
Nov 28 10:08:04 np0005538513.localdomain dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/opts
Nov 28 10:08:04 np0005538513.localdomain podman[327890]: 2025-11-28 10:08:04.191587497 +0000 UTC m=+0.077642934 container kill 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:08:04 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:04.234 2 INFO neutron.agent.securitygroups_rpc [None req-179f04c6-9ef2-42e7-8385-4b9a01b4f84f 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:04 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:04.272 2 INFO neutron.agent.securitygroups_rpc [None req-703ffdf6-c471-414a-b1d5-827cd1496408 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:04 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:08:04.409 261084 INFO neutron.agent.dhcp.agent [None req-d27c7e74-de34-4f03-9809-a4bbc65da21f - - - - - -] DHCP configuration for ports {'0edceee0-9905-42aa-b928-f37bc201af3e'} is completed
Nov 28 10:08:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e213 do_prune osdmap full prune enabled
Nov 28 10:08:04 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:04.651 2 INFO neutron.agent.securitygroups_rpc [None req-0ac9408a-42d9-4a9d-8ec9-d45f37e64efb 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:04 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:04.675 2 INFO neutron.agent.securitygroups_rpc [None req-1da31546-dc0a-4e96-b9fc-5fecbaaa8ced 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:04 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1986905182' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:04 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1986905182' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e214 e214: 6 total, 6 up, 6 in
Nov 28 10:08:05 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:08:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e214: 6 total, 6 up, 6 in
Nov 28 10:08:05 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:05.181 2 INFO neutron.agent.securitygroups_rpc [None req-7ed5809e-b041-4ea1-b265-703b22d08b78 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:05.905 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:05 np0005538513.localdomain ceph-mon[292954]: pgmap v423: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 25 KiB/s wr, 93 op/s
Nov 28 10:08:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c_b0090dae-0a99-4947-ac5e-d28013b7f627", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:05 np0005538513.localdomain ceph-mon[292954]: osdmap e214: 6 total, 6 up, 6 in
Nov 28 10:08:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e214 do_prune osdmap full prune enabled
Nov 28 10:08:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e215 e215: 6 total, 6 up, 6 in
Nov 28 10:08:06 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e215: 6 total, 6 up, 6 in
Nov 28 10:08:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:06.495 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:06 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:06.529 2 INFO neutron.agent.securitygroups_rpc [None req-c3c34998-5ba8-4c09-bfd4-af4362061351 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['77da4666-3c7e-4eb4-bd89-e0f6bc0cfb77']
Nov 28 10:08:06 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:06.832 2 INFO neutron.agent.securitygroups_rpc [None req-cf7500dc-e72d-4341-8bc2-ebaed58e7094 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['2521adb0-8644-4922-aaf5-9462c312df8d']
Nov 28 10:08:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e215 do_prune osdmap full prune enabled
Nov 28 10:08:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e216 e216: 6 total, 6 up, 6 in
Nov 28 10:08:07 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e216: 6 total, 6 up, 6 in
Nov 28 10:08:07 np0005538513.localdomain ceph-mon[292954]: pgmap v425: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 18 KiB/s wr, 43 op/s
Nov 28 10:08:07 np0005538513.localdomain ceph-mon[292954]: osdmap e215: 6 total, 6 up, 6 in
Nov 28 10:08:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:07 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:07.943 2 INFO neutron.agent.securitygroups_rpc [None req-410f4b75-8a53-43bd-aeb3-73827c1fb9d5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['cc6c7909-68f3-4243-ad85-ca295b324967']
Nov 28 10:08:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "format": "json"}]: dispatch
Nov 28 10:08:08 np0005538513.localdomain ceph-mon[292954]: osdmap e216: 6 total, 6 up, 6 in
Nov 28 10:08:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1480453872' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1480453872' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a_4c7473fe-0e78-460c-a4b8-b3878dfb39c2", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3398271039' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3398271039' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:08.423 2 INFO neutron.agent.securitygroups_rpc [None req-9e0fb103-8bda-4ed9-896b-20548b225439 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['cc6c7909-68f3-4243-ad85-ca295b324967']
Nov 28 10:08:08 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:08.637 2 INFO neutron.agent.securitygroups_rpc [None req-5e0e412c-d85c-446d-af13-157bbc4d1b94 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['723b43a6-7c6c-4eb3-a519-023b34d9a2b5']
Nov 28 10:08:08 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:08.893 2 INFO neutron.agent.securitygroups_rpc [None req-aa17c84c-96b9-4f58-abde-f1675a957a10 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['723b43a6-7c6c-4eb3-a519-023b34d9a2b5']
Nov 28 10:08:09 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:09 np0005538513.localdomain ceph-mon[292954]: pgmap v428: 177 pgs: 2 active+clean+snaptrim, 5 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 30 KiB/s wr, 61 op/s
Nov 28 10:08:09 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:09.707 2 INFO neutron.agent.securitygroups_rpc [None req-e693dfaa-acf9-4ef6-91f0-fa32111d34d5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['89d85f46-caf0-4632-8f88-6aa2b20ffab5']
Nov 28 10:08:09 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:09.967 2 INFO neutron.agent.securitygroups_rpc [None req-40ac5e44-c334-4848-ad22-bdb0d1d393a9 f56d2237e5b74576a33d9840c9346817 9ce143270a4649669232b53b6a44e4ba - - default default] Security group member updated ['f3e50b86-f5a6-4339-897f-e9e754c264f3']
Nov 28 10:08:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:10.028 2 INFO neutron.agent.securitygroups_rpc [None req-aa8fcbbc-e15f-465d-a79b-54ba8e5d4dfa 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['89d85f46-caf0-4632-8f88-6aa2b20ffab5']
Nov 28 10:08:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:08:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:08:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:08:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:08:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:08:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19739 "" "Go-http-client/1.1"
Nov 28 10:08:10 np0005538513.localdomain dnsmasq[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/addn_hosts - 0 addresses
Nov 28 10:08:10 np0005538513.localdomain dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/host
Nov 28 10:08:10 np0005538513.localdomain dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/opts
Nov 28 10:08:10 np0005538513.localdomain podman[327930]: 2025-11-28 10:08:10.257114721 +0000 UTC m=+0.061998281 container kill 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:08:10 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:10.758 2 INFO neutron.agent.securitygroups_rpc [None req-652590f9-d0df-4db1-90c9-4125d049cb03 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['527f33ea-6583-4033-be5c-a5d3ccd20912']
Nov 28 10:08:10 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:10.907 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538513.localdomain dnsmasq[327872]: exiting on receipt of SIGTERM
Nov 28 10:08:11 np0005538513.localdomain podman[327966]: 2025-11-28 10:08:11.032674168 +0000 UTC m=+0.065880481 container kill 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: tmp-crun.Rqf2pe.mount: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: libpod-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36.scope: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:08:11 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:11Z|00467|binding|INFO|Removing iface tapdfb15593-fe ovn-installed in OVS
Nov 28 10:08:11 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:11Z|00468|binding|INFO|Removing lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 ovn-installed in OVS
Nov 28 10:08:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:11.118 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:11.121 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port a7fb3ffc-8e40-4885-9fc0-3b779892c20d with type ""
Nov 28 10:08:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:11.122 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ce143270a4649669232b53b6a44e4ba', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ab6281c-5843-47df-b4b0-ac9f0fa87790, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=dfb15593-fe1f-4aaf-af25-32ab66c1a780) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:08:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:11.124 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dfb15593-fe1f-4aaf-af25-32ab66c1a780 in datapath 39526589-27d4-41ad-9aef-f63f534ecbf0 unbound from our chassis
Nov 28 10:08:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:11.126 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39526589-27d4-41ad-9aef-f63f534ecbf0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:08:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:11.127 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:11.127 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[59296e7b-81f6-43e3-864c-174c03bff93c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:08:11 np0005538513.localdomain podman[327980]: 2025-11-28 10:08:11.128768589 +0000 UTC m=+0.080029867 container died 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e216 do_prune osdmap full prune enabled
Nov 28 10:08:11 np0005538513.localdomain podman[327980]: 2025-11-28 10:08:11.160965121 +0000 UTC m=+0.112226369 container cleanup 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: libpod-conmon-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36.scope: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain podman[327982]: 2025-11-28 10:08:11.195744462 +0000 UTC m=+0.137975512 container remove 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:08:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:11.207 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538513.localdomain kernel: device tapdfb15593-fe left promiscuous mode
Nov 28 10:08:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:11.222 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:08:11.252 261084 INFO neutron.agent.dhcp.agent [None req-d46cfb24-85e5-4d25-bf2b-4f53a1e873cc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-4341b7420827c25c7624eca3707ed25a6232f868ac4d13c1195965a03f452de4-merged.mount: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36-userdata-shm.mount: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d39526589\x2d27d4\x2d41ad\x2d9aef\x2df63f534ecbf0.mount: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:11.260 2 INFO neutron.agent.securitygroups_rpc [None req-8e5272cc-6346-4fdc-ba9b-b2622fce7146 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['527f33ea-6583-4033-be5c-a5d3ccd20912']
Nov 28 10:08:11 np0005538513.localdomain podman[327988]: 2025-11-28 10:08:11.260686573 +0000 UTC m=+0.193193253 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:08:11 np0005538513.localdomain podman[327988]: 2025-11-28 10:08:11.268200395 +0000 UTC m=+0.200707105 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "format": "json"}]: dispatch
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: pgmap v429: 177 pgs: 2 active+clean+snaptrim, 5 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 30 KiB/s wr, 61 op/s
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e217 e217: 6 total, 6 up, 6 in
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e217: 6 total, 6 up, 6 in
Nov 28 10:08:11 np0005538513.localdomain podman[327994]: 2025-11-28 10:08:11.315984887 +0000 UTC m=+0.245821435 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:08:11 np0005538513.localdomain podman[327994]: 2025-11-28 10:08:11.353768882 +0000 UTC m=+0.283605450 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 28 10:08:11 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:08:11 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:08:11.375 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e217 do_prune osdmap full prune enabled
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e218 e218: 6 total, 6 up, 6 in
Nov 28 10:08:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e218: 6 total, 6 up, 6 in
Nov 28 10:08:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:11.497 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:11Z|00469|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:08:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:11.555 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:11.737 2 INFO neutron.agent.securitygroups_rpc [None req-fc93531f-0f28-40b2-be64-5c8ae1a05f2f 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d_c59c3d50-8922-4eb8-a42f-dbb4854b184b", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:12 np0005538513.localdomain ceph-mon[292954]: osdmap e217: 6 total, 6 up, 6 in
Nov 28 10:08:12 np0005538513.localdomain ceph-mon[292954]: osdmap e218: 6 total, 6 up, 6 in
Nov 28 10:08:12 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:12.425 2 INFO neutron.agent.securitygroups_rpc [None req-950a4a21-ae98-4893-86d1-a721665d75e4 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:12 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:12.616 2 INFO neutron.agent.securitygroups_rpc [None req-70e65ce9-39cb-464e-830b-530df9be0aa7 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:12 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:12.928 2 INFO neutron.agent.securitygroups_rpc [None req-75b9de2d-ffc0-4329-a7f5-226897331b9b 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:13.009 2 INFO neutron.agent.securitygroups_rpc [None req-d8139777-22d3-42a1-ac35-78ef0fdf0858 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:13.254 2 INFO neutron.agent.securitygroups_rpc [None req-08ef0569-f479-4e5d-8704-2ba0f20ecb11 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:13.388 2 INFO neutron.agent.securitygroups_rpc [None req-a98371b3-db2f-4475-8ff8-c8019a0287c5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:13 np0005538513.localdomain ceph-mon[292954]: pgmap v432: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 76 KiB/s wr, 126 op/s
Nov 28 10:08:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1770151037' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1770151037' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:13.575 2 INFO neutron.agent.securitygroups_rpc [None req-86361f15-5494-4722-ab9a-5dacf7fdc04d 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:13.689 2 INFO neutron.agent.securitygroups_rpc [None req-d46ea504-c7ea-4ca3-916c-fe85715f24c7 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:13 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:13.891 2 INFO neutron.agent.securitygroups_rpc [None req-865b6df2-96d7-4677-b689-d1af588b7477 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:14 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:14.008 2 INFO neutron.agent.securitygroups_rpc [None req-447348cf-3ac4-498a-af94-5fdba49716f9 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:14 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:14.121 2 INFO neutron.agent.securitygroups_rpc [None req-e5d09537-743d-43a9-bf9b-ecb2832cda1b 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:14 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:14.890 2 INFO neutron.agent.securitygroups_rpc [None req-185e5877-2263-42be-9a4e-22f963823be4 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['dae70bc2-83a0-4e05-bc5e-659aa86d0528']
Nov 28 10:08:14 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:14.913 2 INFO neutron.agent.securitygroups_rpc [None req-d7a70732-d57d-42ae-a12a-793909186bfd 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d99d3754-6453-4c3f-8498-8ac20a4744c7']
Nov 28 10:08:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e218 do_prune osdmap full prune enabled
Nov 28 10:08:15 np0005538513.localdomain ceph-mon[292954]: pgmap v433: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 39 KiB/s wr, 54 op/s
Nov 28 10:08:15 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6_a8a8759a-ecad-4b03-8e69-d4e30de4d63c", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:15 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e219 e219: 6 total, 6 up, 6 in
Nov 28 10:08:15 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e219: 6 total, 6 up, 6 in
Nov 28 10:08:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:15.910 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e219 do_prune osdmap full prune enabled
Nov 28 10:08:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e220 e220: 6 total, 6 up, 6 in
Nov 28 10:08:16 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e220: 6 total, 6 up, 6 in
Nov 28 10:08:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:16.500 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:16 np0005538513.localdomain ceph-mon[292954]: osdmap e219: 6 total, 6 up, 6 in
Nov 28 10:08:16 np0005538513.localdomain ceph-mon[292954]: osdmap e220: 6 total, 6 up, 6 in
Nov 28 10:08:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:17 np0005538513.localdomain ceph-mon[292954]: pgmap v435: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 44 KiB/s wr, 61 op/s
Nov 28 10:08:17 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:17 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "format": "json"}]: dispatch
Nov 28 10:08:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:08:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:08:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:08:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:08:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:08:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "format": "json"}]: dispatch
Nov 28 10:08:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:19 np0005538513.localdomain ceph-mon[292954]: pgmap v437: 177 pgs: 177 active+clean; 146 MiB data, 924 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 68 KiB/s wr, 64 op/s
Nov 28 10:08:20 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:20.216 2 INFO neutron.agent.securitygroups_rpc [None req-d053b12b-4cea-4da2-a120-7dbb5ec3bd14 b3ad92f082324bf2b498b6ec57fa1994 f4aa6a98849143efbe0d34d745657eb8 - - default default] Security group rule updated ['b905493a-8ebf-4d2f-8822-0b2d1ac4a85c']
Nov 28 10:08:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "format": "json"}]: dispatch
Nov 28 10:08:20 np0005538513.localdomain ceph-mon[292954]: pgmap v438: 177 pgs: 177 active+clean; 146 MiB data, 924 MiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 21 KiB/s wr, 5 op/s
Nov 28 10:08:20 np0005538513.localdomain neutron_sriov_agent[254147]: 2025-11-28 10:08:20.622 2 INFO neutron.agent.securitygroups_rpc [None req-5d5ff3a5-db5c-4c22-9208-8bc209a22601 2d65c21983fa4a008a09c7a8bb7a6484 2603cf17f09846a397a42aba4be9d81b - - default default] Security group rule updated ['90aec1a6-5e99-47c4-8e4c-11b88cdc4ca9']
Nov 28 10:08:20 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:20.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e220 do_prune osdmap full prune enabled
Nov 28 10:08:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e221 e221: 6 total, 6 up, 6 in
Nov 28 10:08:21 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e221: 6 total, 6 up, 6 in
Nov 28 10:08:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:21.502 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:21 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:08:21 np0005538513.localdomain podman[328051]: 2025-11-28 10:08:21.845312825 +0000 UTC m=+0.082255546 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 10:08:21 np0005538513.localdomain podman[328051]: 2025-11-28 10:08:21.862538126 +0000 UTC m=+0.099480777 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 28 10:08:21 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:08:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:22 np0005538513.localdomain ceph-mon[292954]: osdmap e221: 6 total, 6 up, 6 in
Nov 28 10:08:22 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:22 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:23 np0005538513.localdomain ceph-mon[292954]: pgmap v440: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 3.3 MiB/s rd, 55 KiB/s wr, 26 op/s
Nov 28 10:08:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "target_sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:08:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:08:24 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e48: np0005538515.yfkzhl(active, since 10m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:08:25 np0005538513.localdomain sudo[328071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:08:25 np0005538513.localdomain sudo[328071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:25 np0005538513.localdomain sudo[328071]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:25 np0005538513.localdomain sudo[328089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 10:08:25 np0005538513.localdomain sudo[328089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:25 np0005538513.localdomain ceph-mon[292954]: pgmap v441: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 44 KiB/s wr, 20 op/s
Nov 28 10:08:25 np0005538513.localdomain ceph-mon[292954]: mgrmap e48: np0005538515.yfkzhl(active, since 10m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:08:25 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/439989088' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:25 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/439989088' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:25 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:25.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:08:26 np0005538513.localdomain podman[328153]: 2025-11-28 10:08:26.070354978 +0000 UTC m=+0.084622928 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:08:26 np0005538513.localdomain podman[328153]: 2025-11-28 10:08:26.155567053 +0000 UTC m=+0.169835003 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:08:26 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:08:26 np0005538513.localdomain podman[328204]: 2025-11-28 10:08:26.288393397 +0000 UTC m=+0.095083501 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc.)
Nov 28 10:08:26 np0005538513.localdomain podman[328204]: 2025-11-28 10:08:26.431222167 +0000 UTC m=+0.237912252 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e221 do_prune osdmap full prune enabled
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e222 e222: 6 total, 6 up, 6 in
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e222: 6 total, 6 up, 6 in
Nov 28 10:08:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:26.505 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain sudo[328089]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:27 np0005538513.localdomain sudo[328320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:08:27 np0005538513.localdomain sudo[328320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:27 np0005538513.localdomain sudo[328320]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:27 np0005538513.localdomain sudo[328338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:08:27 np0005538513.localdomain sudo[328338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: pgmap v442: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 37 KiB/s wr, 17 op/s
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: osdmap e222: 6 total, 6 up, 6 in
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:08:27 np0005538513.localdomain sudo[328338]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:28 np0005538513.localdomain sudo[328388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:08:28 np0005538513.localdomain sudo[328388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:28 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:08:28 np0005538513.localdomain sudo[328388]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:28 np0005538513.localdomain podman[328406]: 2025-11-28 10:08:28.275629358 +0000 UTC m=+0.083983539 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:08:28 np0005538513.localdomain podman[328406]: 2025-11-28 10:08:28.310432981 +0000 UTC m=+0.118787152 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:08:28 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:28 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e222 do_prune osdmap full prune enabled
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e223 e223: 6 total, 6 up, 6 in
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e223: 6 total, 6 up, 6 in
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: pgmap v444: 177 pgs: 177 active+clean; 193 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 131 op/s
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1301495600' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1301495600' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:30 np0005538513.localdomain ceph-mon[292954]: osdmap e223: 6 total, 6 up, 6 in
Nov 28 10:08:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:30.915 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:08:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:31.510 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:31 np0005538513.localdomain ceph-mon[292954]: pgmap v446: 177 pgs: 177 active+clean; 193 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 2.7 MiB/s wr, 115 op/s
Nov 28 10:08:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:08:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:08:31 np0005538513.localdomain systemd[1]: tmp-crun.QCIAQV.mount: Deactivated successfully.
Nov 28 10:08:31 np0005538513.localdomain podman[328427]: 2025-11-28 10:08:31.865299187 +0000 UTC m=+0.098817357 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:08:31 np0005538513.localdomain podman[328428]: 2025-11-28 10:08:31.904924688 +0000 UTC m=+0.134954641 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:08:31 np0005538513.localdomain podman[328427]: 2025-11-28 10:08:31.926348687 +0000 UTC m=+0.159866857 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:08:31 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:08:31 np0005538513.localdomain podman[328428]: 2025-11-28 10:08:31.97154929 +0000 UTC m=+0.201579303 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 10:08:31 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:08:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e223 do_prune osdmap full prune enabled
Nov 28 10:08:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e224 e224: 6 total, 6 up, 6 in
Nov 28 10:08:32 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e224: 6 total, 6 up, 6 in
Nov 28 10:08:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1386400267' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1386400267' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:32 np0005538513.localdomain ceph-mon[292954]: osdmap e224: 6 total, 6 up, 6 in
Nov 28 10:08:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e224 do_prune osdmap full prune enabled
Nov 28 10:08:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e225 e225: 6 total, 6 up, 6 in
Nov 28 10:08:33 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e225: 6 total, 6 up, 6 in
Nov 28 10:08:33 np0005538513.localdomain ceph-mon[292954]: pgmap v447: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 123 KiB/s rd, 2.7 MiB/s wr, 183 op/s
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e225 do_prune osdmap full prune enabled
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e226 e226: 6 total, 6 up, 6 in
Nov 28 10:08:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:08:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 14K writes, 57K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 14K writes, 4566 syncs, 3.17 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 9397 writes, 34K keys, 9397 commit groups, 1.0 writes per commit group, ingest: 26.84 MB, 0.04 MB/s
                                                          Interval WAL: 9397 writes, 3883 syncs, 2.42 writes per sync, written: 0.03 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e226: 6 total, 6 up, 6 in
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: osdmap e225: 6 total, 6 up, 6 in
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: pgmap v450: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 18 KiB/s wr, 91 op/s
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3132541285' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3132541285' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:34 np0005538513.localdomain ceph-mon[292954]: osdmap e226: 6 total, 6 up, 6 in
Nov 28 10:08:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e226 do_prune osdmap full prune enabled
Nov 28 10:08:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e227 e227: 6 total, 6 up, 6 in
Nov 28 10:08:35 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e227: 6 total, 6 up, 6 in
Nov 28 10:08:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:35.916 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:36.513 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: osdmap e227: 6 total, 6 up, 6 in
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: pgmap v453: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e227 do_prune osdmap full prune enabled
Nov 28 10:08:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e228 e228: 6 total, 6 up, 6 in
Nov 28 10:08:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e228: 6 total, 6 up, 6 in
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: osdmap e228: 6 total, 6 up, 6 in
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: pgmap v455: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 8.2 KiB/s wr, 170 op/s
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:08:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.3 total, 600.0 interval
                                                          Cumulative writes: 16K writes, 61K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 16K writes, 5217 syncs, 3.12 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 10K writes, 36K keys, 10K commit groups, 1.0 writes per commit group, ingest: 23.73 MB, 0.04 MB/s
                                                          Interval WAL: 10K writes, 4356 syncs, 2.39 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:08:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e228 do_prune osdmap full prune enabled
Nov 28 10:08:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e229 e229: 6 total, 6 up, 6 in
Nov 28 10:08:39 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e229: 6 total, 6 up, 6 in
Nov 28 10:08:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:08:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:08:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:08:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:08:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:08:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19272 "" "Go-http-client/1.1"
Nov 28 10:08:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e229 do_prune osdmap full prune enabled
Nov 28 10:08:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e230 e230: 6 total, 6 up, 6 in
Nov 28 10:08:40 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e230: 6 total, 6 up, 6 in
Nov 28 10:08:40 np0005538513.localdomain ceph-mon[292954]: osdmap e229: 6 total, 6 up, 6 in
Nov 28 10:08:40 np0005538513.localdomain ceph-mon[292954]: pgmap v457: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 6.7 KiB/s wr, 138 op/s
Nov 28 10:08:40 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:40.920 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e230 do_prune osdmap full prune enabled
Nov 28 10:08:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e231 e231: 6 total, 6 up, 6 in
Nov 28 10:08:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e231: 6 total, 6 up, 6 in
Nov 28 10:08:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:41.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:41 np0005538513.localdomain ceph-mon[292954]: osdmap e230: 6 total, 6 up, 6 in
Nov 28 10:08:41 np0005538513.localdomain ceph-mon[292954]: osdmap e231: 6 total, 6 up, 6 in
Nov 28 10:08:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:08:41 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:08:41 np0005538513.localdomain podman[328471]: 2025-11-28 10:08:41.864398121 +0000 UTC m=+0.082950297 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:08:41 np0005538513.localdomain podman[328471]: 2025-11-28 10:08:41.877431462 +0000 UTC m=+0.095983698 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:08:41 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:08:41 np0005538513.localdomain podman[328472]: 2025-11-28 10:08:41.969401056 +0000 UTC m=+0.186070704 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:08:42 np0005538513.localdomain podman[328472]: 2025-11-28 10:08:42.007650365 +0000 UTC m=+0.224320063 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:08:42 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e231 do_prune osdmap full prune enabled
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e232 e232: 6 total, 6 up, 6 in
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e232: 6 total, 6 up, 6 in
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: pgmap v460: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 5.3 KiB/s wr, 160 op/s
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:43 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:08:43Z|00470|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 10:08:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:43.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:43 np0005538513.localdomain ceph-mon[292954]: osdmap e232: 6 total, 6 up, 6 in
Nov 28 10:08:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:44.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e232 do_prune osdmap full prune enabled
Nov 28 10:08:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e233 e233: 6 total, 6 up, 6 in
Nov 28 10:08:44 np0005538513.localdomain ceph-mon[292954]: pgmap v462: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 119 KiB/s rd, 5.3 KiB/s wr, 161 op/s
Nov 28 10:08:44 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e233: 6 total, 6 up, 6 in
Nov 28 10:08:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:45.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e233 do_prune osdmap full prune enabled
Nov 28 10:08:45 np0005538513.localdomain ceph-mon[292954]: osdmap e233: 6 total, 6 up, 6 in
Nov 28 10:08:45 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e234 e234: 6 total, 6 up, 6 in
Nov 28 10:08:45 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e234: 6 total, 6 up, 6 in
Nov 28 10:08:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:45.922 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e234 do_prune osdmap full prune enabled
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e235 e235: 6 total, 6 up, 6 in
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e235: 6 total, 6 up, 6 in
Nov 28 10:08:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:46.518 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:46.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: pgmap v464: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 96 KiB/s rd, 4.3 KiB/s wr, 130 op/s
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: osdmap e234: 6 total, 6 up, 6 in
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538513.localdomain ceph-mon[292954]: osdmap e235: 6 total, 6 up, 6 in
Nov 28 10:08:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e235 do_prune osdmap full prune enabled
Nov 28 10:08:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e236 e236: 6 total, 6 up, 6 in
Nov 28 10:08:47 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e236: 6 total, 6 up, 6 in
Nov 28 10:08:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:47.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:08:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:08:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:08:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:08:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:08:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.405 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:48.407 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:08:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:48.409 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:08:48 np0005538513.localdomain ceph-mon[292954]: osdmap e236: 6 total, 6 up, 6 in
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:08:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:48.798 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:08:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:08:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1503239630' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.242 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.309 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.310 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.519 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.520 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11092MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:08:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e236 do_prune osdmap full prune enabled
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.521 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.521 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:08:49 np0005538513.localdomain ceph-mon[292954]: pgmap v468: 177 pgs: 177 active+clean; 147 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 62 KiB/s wr, 112 op/s
Nov 28 10:08:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1503239630' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e237 e237: 6 total, 6 up, 6 in
Nov 28 10:08:49 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e237: 6 total, 6 up, 6 in
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.617 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.618 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.619 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:08:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:49.654 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/780788281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:50.109 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:08:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:50.116 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:08:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:50.132 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:08:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:50.135 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:08:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:50.135 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343", "format": "json"}]: dispatch
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: osdmap e237: 6 total, 6 up, 6 in
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1833255642' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/780788281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:50 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:08:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:08:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:08:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:50.923 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.137 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.138 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.138 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.222 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.222 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.223 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.223 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e237 do_prune osdmap full prune enabled
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e238 e238: 6 total, 6 up, 6 in
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e238: 6 total, 6 up, 6 in
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.522 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: pgmap v470: 177 pgs: 177 active+clean; 147 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 62 KiB/s wr, 112 op/s
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/936247826' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:51 np0005538513.localdomain ceph-mon[292954]: osdmap e238: 6 total, 6 up, 6 in
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.634 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.662 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.663 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:08:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:51.663 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2163301190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2163301190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:52 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:08:52 np0005538513.localdomain podman[328558]: 2025-11-28 10:08:52.868633489 +0000 UTC m=+0.103418158 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Nov 28 10:08:52 np0005538513.localdomain podman[328558]: 2025-11-28 10:08:52.885621862 +0000 UTC m=+0.120406491 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:08:52 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:08:53 np0005538513.localdomain ceph-mon[292954]: pgmap v472: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 70 KiB/s wr, 164 op/s
Nov 28 10:08:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2410619381' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1048746670' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:54 np0005538513.localdomain ceph-mon[292954]: pgmap v473: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 21 KiB/s wr, 68 op/s
Nov 28 10:08:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2986300977' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2986300977' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343_a9f5af9b-568e-446f-a65a-029b17668afd", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:55.928 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e238 do_prune osdmap full prune enabled
Nov 28 10:08:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e239 e239: 6 total, 6 up, 6 in
Nov 28 10:08:56 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e239: 6 total, 6 up, 6 in
Nov 28 10:08:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:08:56.524 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:08:56 np0005538513.localdomain podman[328578]: 2025-11-28 10:08:56.849109858 +0000 UTC m=+0.087486196 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:08:56 np0005538513.localdomain podman[328578]: 2025-11-28 10:08:56.863995097 +0000 UTC m=+0.102371465 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:08:56 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:08:57 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:08:57.410 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:08:57 np0005538513.localdomain ceph-mon[292954]: pgmap v474: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 16 KiB/s wr, 54 op/s
Nov 28 10:08:57 np0005538513.localdomain ceph-mon[292954]: osdmap e239: 6 total, 6 up, 6 in
Nov 28 10:08:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:08:58 np0005538513.localdomain podman[328602]: 2025-11-28 10:08:58.857476802 +0000 UTC m=+0.087615080 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:08:58 np0005538513.localdomain podman[328602]: 2025-11-28 10:08:58.89082317 +0000 UTC m=+0.120961428 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 10:08:58 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:08:59 np0005538513.localdomain ceph-mon[292954]: pgmap v476: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 31 KiB/s wr, 83 op/s
Nov 28 10:09:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e239 do_prune osdmap full prune enabled
Nov 28 10:09:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "format": "json"}]: dispatch
Nov 28 10:09:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e240 e240: 6 total, 6 up, 6 in
Nov 28 10:09:00 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e240: 6 total, 6 up, 6 in
Nov 28 10:09:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:00.966 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:01.527 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:01 np0005538513.localdomain ceph-mon[292954]: pgmap v477: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 29 KiB/s wr, 80 op/s
Nov 28 10:09:01 np0005538513.localdomain ceph-mon[292954]: osdmap e240: 6 total, 6 up, 6 in
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e240 do_prune osdmap full prune enabled
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e241 e241: 6 total, 6 up, 6 in
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e241: 6 total, 6 up, 6 in
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3664164282' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3664164282' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4047621474' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2136588701' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2136588701' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:09:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:09:02 np0005538513.localdomain podman[328620]: 2025-11-28 10:09:02.875843569 +0000 UTC m=+0.104097199 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 10:09:02 np0005538513.localdomain podman[328620]: 2025-11-28 10:09:02.918038249 +0000 UTC m=+0.146291869 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:09:02 np0005538513.localdomain systemd[1]: tmp-crun.yEX5Bd.mount: Deactivated successfully.
Nov 28 10:09:02 np0005538513.localdomain podman[328621]: 2025-11-28 10:09:02.929290796 +0000 UTC m=+0.150829799 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:09:02 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:09:03 np0005538513.localdomain podman[328621]: 2025-11-28 10:09:03.015696098 +0000 UTC m=+0.237235061 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:09:03 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e241 do_prune osdmap full prune enabled
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: pgmap v479: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 29 KiB/s wr, 65 op/s
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: osdmap e241: 6 total, 6 up, 6 in
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/109290016' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/109290016' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e242 e242: 6 total, 6 up, 6 in
Nov 28 10:09:03 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e242: 6 total, 6 up, 6 in
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: osdmap e242: 6 total, 6 up, 6 in
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:04 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e242 do_prune osdmap full prune enabled
Nov 28 10:09:05 np0005538513.localdomain ceph-mon[292954]: pgmap v482: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 19 KiB/s wr, 47 op/s
Nov 28 10:09:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e243 e243: 6 total, 6 up, 6 in
Nov 28 10:09:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e243: 6 total, 6 up, 6 in
Nov 28 10:09:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:06.013 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e243 do_prune osdmap full prune enabled
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e244 e244: 6 total, 6 up, 6 in
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e244: 6 total, 6 up, 6 in
Nov 28 10:09:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:06.531 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: osdmap e243: 6 total, 6 up, 6 in
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952", "format": "json"}]: dispatch
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: pgmap v484: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 22 KiB/s wr, 54 op/s
Nov 28 10:09:06 np0005538513.localdomain ceph-mon[292954]: osdmap e244: 6 total, 6 up, 6 in
Nov 28 10:09:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e244 do_prune osdmap full prune enabled
Nov 28 10:09:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e245 e245: 6 total, 6 up, 6 in
Nov 28 10:09:07 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e245: 6 total, 6 up, 6 in
Nov 28 10:09:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:08 np0005538513.localdomain ceph-mon[292954]: osdmap e245: 6 total, 6 up, 6 in
Nov 28 10:09:08 np0005538513.localdomain ceph-mon[292954]: pgmap v487: 177 pgs: 177 active+clean; 148 MiB data, 944 MiB used, 41 GiB / 42 GiB avail; 142 KiB/s rd, 65 KiB/s wr, 200 op/s
Nov 28 10:09:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3077193037' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:08 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3077193037' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:09 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8", "format": "json"}]: dispatch
Nov 28 10:09:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:09:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:09:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:09:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:09:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:09:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1"
Nov 28 10:09:10 np0005538513.localdomain ceph-mon[292954]: pgmap v488: 177 pgs: 177 active+clean; 148 MiB data, 944 MiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 46 KiB/s wr, 142 op/s
Nov 28 10:09:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:11.042 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e245 do_prune osdmap full prune enabled
Nov 28 10:09:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e246 e246: 6 total, 6 up, 6 in
Nov 28 10:09:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e246: 6 total, 6 up, 6 in
Nov 28 10:09:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:11.534 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1_2561ce9f-a8b9-43a9-9ce8-c591d939acc0", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:12 np0005538513.localdomain ceph-mon[292954]: osdmap e246: 6 total, 6 up, 6 in
Nov 28 10:09:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:09:12 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:09:12 np0005538513.localdomain podman[328663]: 2025-11-28 10:09:12.871317505 +0000 UTC m=+0.097626009 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:09:12 np0005538513.localdomain podman[328664]: 2025-11-28 10:09:12.908825001 +0000 UTC m=+0.133487984 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:09:12 np0005538513.localdomain podman[328664]: 2025-11-28 10:09:12.919547141 +0000 UTC m=+0.144210144 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:09:12 np0005538513.localdomain podman[328663]: 2025-11-28 10:09:12.931541411 +0000 UTC m=+0.157849865 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:09:12 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:09:12 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: pgmap v490: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 85 KiB/s wr, 199 op/s
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/803796492' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/803796492' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e246 do_prune osdmap full prune enabled
Nov 28 10:09:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8_43874ef7-c24e-4d18-b524-79d36cc3d273", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e247 e247: 6 total, 6 up, 6 in
Nov 28 10:09:14 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e247: 6 total, 6 up, 6 in
Nov 28 10:09:15 np0005538513.localdomain ceph-mon[292954]: pgmap v491: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 69 KiB/s wr, 162 op/s
Nov 28 10:09:15 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "format": "json"}]: dispatch
Nov 28 10:09:15 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:15 np0005538513.localdomain ceph-mon[292954]: osdmap e247: 6 total, 6 up, 6 in
Nov 28 10:09:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e247 do_prune osdmap full prune enabled
Nov 28 10:09:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e248 e248: 6 total, 6 up, 6 in
Nov 28 10:09:15 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e248: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:16.070 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e248 do_prune osdmap full prune enabled
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e249 e249: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e249: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:16.536 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: osdmap e248: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: pgmap v494: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 38 KiB/s wr, 57 op/s
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952_0622dbc9-fa1f-43ca-acfe-551c9624e3ef", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: osdmap e249: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3340760179' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:16 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3340760179' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:17 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3754312141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:17 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3754312141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:09:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:09:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:09:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:09:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:09:18 np0005538513.localdomain ceph-mon[292954]: pgmap v496: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 3.6 MiB/s wr, 171 op/s
Nov 28 10:09:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:18.673 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e249 do_prune osdmap full prune enabled
Nov 28 10:09:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e250 e250: 6 total, 6 up, 6 in
Nov 28 10:09:19 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e250: 6 total, 6 up, 6 in
Nov 28 10:09:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "format": "json"}]: dispatch
Nov 28 10:09:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e250 do_prune osdmap full prune enabled
Nov 28 10:09:20 np0005538513.localdomain ceph-mon[292954]: osdmap e250: 6 total, 6 up, 6 in
Nov 28 10:09:20 np0005538513.localdomain ceph-mon[292954]: pgmap v498: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 133 KiB/s rd, 4.1 MiB/s wr, 195 op/s
Nov 28 10:09:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e251 e251: 6 total, 6 up, 6 in
Nov 28 10:09:20 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e251: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:21.107 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e251 do_prune osdmap full prune enabled
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e252 e252: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e252: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:21.540 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: osdmap e251: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/382002900' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/382002900' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: osdmap e252: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.751964) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561752101, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2643, "num_deletes": 281, "total_data_size": 3176462, "memory_usage": 3242064, "flush_reason": "Manual Compaction"}
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561775665, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 3108429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30678, "largest_seqno": 33320, "table_properties": {"data_size": 3096625, "index_size": 7669, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27914, "raw_average_key_size": 22, "raw_value_size": 3072092, "raw_average_value_size": 2520, "num_data_blocks": 320, "num_entries": 1219, "num_filter_entries": 1219, "num_deletions": 281, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324437, "oldest_key_time": 1764324437, "file_creation_time": 1764324561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 23781 microseconds, and 10562 cpu microseconds.
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.775755) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 3108429 bytes OK
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.775797) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.777991) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.778047) EVENT_LOG_v1 {"time_micros": 1764324561778038, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.778113) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3164760, prev total WAL file size 3164760, number of live WAL files 2.
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.779367) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(3035KB)], [54(17MB)]
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561779440, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 20953640, "oldest_snapshot_seqno": -1}
Nov 28 10:09:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:21.789 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13408 keys, 19616465 bytes, temperature: kUnknown
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561896486, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 19616465, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19535722, "index_size": 46146, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33541, "raw_key_size": 357739, "raw_average_key_size": 26, "raw_value_size": 19303510, "raw_average_value_size": 1439, "num_data_blocks": 1754, "num_entries": 13408, "num_filter_entries": 13408, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.896968) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 19616465 bytes
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.899330) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.8 rd, 167.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 17.0 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(13.1) write-amplify(6.3) OK, records in: 13974, records dropped: 566 output_compression: NoCompression
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.899361) EVENT_LOG_v1 {"time_micros": 1764324561899346, "job": 32, "event": "compaction_finished", "compaction_time_micros": 117175, "compaction_time_cpu_micros": 51179, "output_level": 6, "num_output_files": 1, "total_output_size": 19616465, "num_input_records": 13974, "num_output_records": 13408, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561900252, "job": 32, "event": "table_file_deletion", "file_number": 56}
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561903113, "job": 32, "event": "table_file_deletion", "file_number": 54}
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.779278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903226) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e252 do_prune osdmap full prune enabled
Nov 28 10:09:22 np0005538513.localdomain ceph-mon[292954]: pgmap v501: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 234 KiB/s rd, 4.1 MiB/s wr, 332 op/s
Nov 28 10:09:22 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1047822351' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:22 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1047822351' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e253 e253: 6 total, 6 up, 6 in
Nov 28 10:09:22 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e253: 6 total, 6 up, 6 in
Nov 28 10:09:23 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:09:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e253 do_prune osdmap full prune enabled
Nov 28 10:09:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e254 e254: 6 total, 6 up, 6 in
Nov 28 10:09:23 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e254: 6 total, 6 up, 6 in
Nov 28 10:09:23 np0005538513.localdomain ceph-mon[292954]: osdmap e253: 6 total, 6 up, 6 in
Nov 28 10:09:23 np0005538513.localdomain podman[328708]: 2025-11-28 10:09:23.84847391 +0000 UTC m=+0.084910158 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:09:23 np0005538513.localdomain podman[328708]: 2025-11-28 10:09:23.866426063 +0000 UTC m=+0.102862331 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Nov 28 10:09:23 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:09:24 np0005538513.localdomain ceph-mon[292954]: osdmap e254: 6 total, 6 up, 6 in
Nov 28 10:09:24 np0005538513.localdomain ceph-mon[292954]: pgmap v504: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 139 KiB/s rd, 43 KiB/s wr, 189 op/s
Nov 28 10:09:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/314747336' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/314747336' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:25 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3883756345' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:26.139 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e254 do_prune osdmap full prune enabled
Nov 28 10:09:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e255 e255: 6 total, 6 up, 6 in
Nov 28 10:09:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:26.542 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:26 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e255: 6 total, 6 up, 6 in
Nov 28 10:09:27 np0005538513.localdomain ceph-mon[292954]: pgmap v505: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 34 KiB/s wr, 147 op/s
Nov 28 10:09:27 np0005538513.localdomain ceph-mon[292954]: osdmap e255: 6 total, 6 up, 6 in
Nov 28 10:09:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:09:27 np0005538513.localdomain podman[328729]: 2025-11-28 10:09:27.865805985 +0000 UTC m=+0.103731867 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:09:27 np0005538513.localdomain podman[328729]: 2025-11-28 10:09:27.878346851 +0000 UTC m=+0.116272743 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:09:27 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:09:28 np0005538513.localdomain sudo[328751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:09:28 np0005538513.localdomain sudo[328751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:09:28 np0005538513.localdomain sudo[328751]: pam_unix(sudo:session): session closed for user root
Nov 28 10:09:28 np0005538513.localdomain sudo[328769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:09:28 np0005538513.localdomain sudo[328769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:09:29 np0005538513.localdomain sudo[328769]: pam_unix(sudo:session): session closed for user root
Nov 28 10:09:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:09:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:09:29 np0005538513.localdomain ceph-mon[292954]: pgmap v507: 177 pgs: 177 active+clean; 251 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 9.4 MiB/s wr, 140 op/s
Nov 28 10:09:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:09:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:09:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:09:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:09:29 np0005538513.localdomain sudo[328819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:09:29 np0005538513.localdomain sudo[328819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:09:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:09:29 np0005538513.localdomain sudo[328819]: pam_unix(sudo:session): session closed for user root
Nov 28 10:09:29 np0005538513.localdomain podman[328836]: 2025-11-28 10:09:29.762839317 +0000 UTC m=+0.094947816 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:09:29 np0005538513.localdomain podman[328836]: 2025-11-28 10:09:29.799568839 +0000 UTC m=+0.131677288 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:09:29 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:09:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:09:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:31.172 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e255 do_prune osdmap full prune enabled
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e256 e256: 6 total, 6 up, 6 in
Nov 28 10:09:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:31.546 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e256: 6 total, 6 up, 6 in
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: pgmap v508: 177 pgs: 177 active+clean; 251 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 7.9 MiB/s wr, 119 op/s
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "format": "json"}]: dispatch
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:09:31 np0005538513.localdomain ceph-mon[292954]: osdmap e256: 6 total, 6 up, 6 in
Nov 28 10:09:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:33 np0005538513.localdomain ceph-mon[292954]: pgmap v510: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 164 KiB/s rd, 62 MiB/s wr, 258 op/s
Nov 28 10:09:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:33 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8", "format": "json"}]: dispatch
Nov 28 10:09:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:09:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:09:33 np0005538513.localdomain podman[328858]: 2025-11-28 10:09:33.924629674 +0000 UTC m=+0.158019841 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 10:09:33 np0005538513.localdomain podman[328857]: 2025-11-28 10:09:33.877497841 +0000 UTC m=+0.116301535 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 28 10:09:33 np0005538513.localdomain podman[328857]: 2025-11-28 10:09:33.960436456 +0000 UTC m=+0.199240090 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm)
Nov 28 10:09:33 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:09:33 np0005538513.localdomain podman[328858]: 2025-11-28 10:09:33.989345478 +0000 UTC m=+0.222735655 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:09:33 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:09:34 np0005538513.localdomain ceph-mon[292954]: pgmap v511: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 164 KiB/s rd, 62 MiB/s wr, 258 op/s
Nov 28 10:09:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e256 do_prune osdmap full prune enabled
Nov 28 10:09:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e257 e257: 6 total, 6 up, 6 in
Nov 28 10:09:35 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e257: 6 total, 6 up, 6 in
Nov 28 10:09:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:36.212 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:36.549 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "format": "json"}]: dispatch
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: osdmap e257: 6 total, 6 up, 6 in
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: pgmap v513: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 90 KiB/s rd, 55 MiB/s wr, 153 op/s
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8_bfb7e13e-e3ef-4568-80b6-62eac53f1d23", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:36 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e49: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:09:37 np0005538513.localdomain ceph-mon[292954]: mgrmap e49: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:09:38 np0005538513.localdomain ceph-mon[292954]: pgmap v514: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 178 KiB/s rd, 105 MiB/s wr, 304 op/s
Nov 28 10:09:38 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "format": "json"}]: dispatch
Nov 28 10:09:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1662012236' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1662012236' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:09:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:09:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:09:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:09:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:09:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19269 "" "Go-http-client/1.1"
Nov 28 10:09:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e257 do_prune osdmap full prune enabled
Nov 28 10:09:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "format": "json"}]: dispatch
Nov 28 10:09:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:41 np0005538513.localdomain ceph-mon[292954]: pgmap v515: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 171 KiB/s rd, 101 MiB/s wr, 292 op/s
Nov 28 10:09:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e258 e258: 6 total, 6 up, 6 in
Nov 28 10:09:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e258: 6 total, 6 up, 6 in
Nov 28 10:09:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:41.248 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:41.556 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: osdmap e258: 6 total, 6 up, 6 in
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1184348018' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1184348018' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e258 do_prune osdmap full prune enabled
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e259 e259: 6 total, 6 up, 6 in
Nov 28 10:09:42 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e259: 6 total, 6 up, 6 in
Nov 28 10:09:43 np0005538513.localdomain ceph-mon[292954]: pgmap v517: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 200 KiB/s rd, 67 MiB/s wr, 343 op/s
Nov 28 10:09:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "format": "json"}]: dispatch
Nov 28 10:09:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:43 np0005538513.localdomain ceph-mon[292954]: osdmap e259: 6 total, 6 up, 6 in
Nov 28 10:09:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:09:43 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:09:43 np0005538513.localdomain systemd[1]: tmp-crun.Ou3pfO.mount: Deactivated successfully.
Nov 28 10:09:43 np0005538513.localdomain podman[328900]: 2025-11-28 10:09:43.838753504 +0000 UTC m=+0.079404339 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:09:43 np0005538513.localdomain podman[328900]: 2025-11-28 10:09:43.851788395 +0000 UTC m=+0.092439260 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:09:43 np0005538513.localdomain systemd[1]: tmp-crun.zLpI3u.mount: Deactivated successfully.
Nov 28 10:09:43 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:09:43 np0005538513.localdomain podman[328901]: 2025-11-28 10:09:43.861918957 +0000 UTC m=+0.094980428 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:09:43 np0005538513.localdomain podman[328901]: 2025-11-28 10:09:43.942374327 +0000 UTC m=+0.175435838 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 28 10:09:43 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:09:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e259 do_prune osdmap full prune enabled
Nov 28 10:09:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e260 e260: 6 total, 6 up, 6 in
Nov 28 10:09:44 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e260: 6 total, 6 up, 6 in
Nov 28 10:09:44 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:45 np0005538513.localdomain ceph-mon[292954]: pgmap v519: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 200 KiB/s rd, 67 MiB/s wr, 343 op/s
Nov 28 10:09:45 np0005538513.localdomain ceph-mon[292954]: osdmap e260: 6 total, 6 up, 6 in
Nov 28 10:09:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:45.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:46.283 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e260 do_prune osdmap full prune enabled
Nov 28 10:09:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e261 e261: 6 total, 6 up, 6 in
Nov 28 10:09:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:46.559 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:46 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e261: 6 total, 6 up, 6 in
Nov 28 10:09:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:46.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:09:47 np0005538513.localdomain ceph-mon[292954]: pgmap v521: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 149 KiB/s rd, 21 MiB/s wr, 256 op/s
Nov 28 10:09:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6640fd55-9a20-4243-a340-7d5b72774834", "format": "json"}]: dispatch
Nov 28 10:09:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:47 np0005538513.localdomain ceph-mon[292954]: osdmap e261: 6 total, 6 up, 6 in
Nov 28 10:09:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:09:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:09:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:09:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:09:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:48.695 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:09:48.696 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:09:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:09:48.697 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:09:49 np0005538513.localdomain ceph-mon[292954]: pgmap v523: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 93 KiB/s wr, 72 op/s
Nov 28 10:09:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1498043172' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1498043172' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:49.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e261 do_prune osdmap full prune enabled
Nov 28 10:09:50 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e262 e262: 6 total, 6 up, 6 in
Nov 28 10:09:50 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e262: 6 total, 6 up, 6 in
Nov 28 10:09:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:50.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:09:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:50.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:09:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:09:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:09:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:09:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:09:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:09:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:09:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:50.850 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:09:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:50.851 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:09:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:50.851 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:09:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:50.852 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.319 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.323 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.345 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.346 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.347 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.347 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.366 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.367 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.367 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.368 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.368 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e262 do_prune osdmap full prune enabled
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e263 e263: 6 total, 6 up, 6 in
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.565 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e263: 6 total, 6 up, 6 in
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.585043) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591585095, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 756, "num_deletes": 255, "total_data_size": 900264, "memory_usage": 914792, "flush_reason": "Manual Compaction"}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591592538, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 812869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33321, "largest_seqno": 34076, "table_properties": {"data_size": 809169, "index_size": 1427, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10137, "raw_average_key_size": 21, "raw_value_size": 801282, "raw_average_value_size": 1734, "num_data_blocks": 62, "num_entries": 462, "num_filter_entries": 462, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324562, "oldest_key_time": 1764324562, "file_creation_time": 1764324591, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 7544 microseconds, and 3065 cpu microseconds.
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.592587) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 812869 bytes OK
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.592616) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.595498) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.595535) EVENT_LOG_v1 {"time_micros": 1764324591595526, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.595561) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 896217, prev total WAL file size 896217, number of live WAL files 2.
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.596259) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303131' seq:72057594037927935, type:22 .. '6D6772737461740034323632' seq:0, type:0; will stop at (end)
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(793KB)], [57(18MB)]
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591596307, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 20429334, "oldest_snapshot_seqno": -1}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: pgmap v524: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 77 KiB/s wr, 59 op/s
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: osdmap e262: 6 total, 6 up, 6 in
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "format": "json"}]: dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: osdmap e263: 6 total, 6 up, 6 in
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13349 keys, 18386709 bytes, temperature: kUnknown
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591692737, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 18386709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18310381, "index_size": 41859, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33413, "raw_key_size": 357091, "raw_average_key_size": 26, "raw_value_size": 18083325, "raw_average_value_size": 1354, "num_data_blocks": 1573, "num_entries": 13349, "num_filter_entries": 13349, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324591, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.693087) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 18386709 bytes
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.695624) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 211.7 rd, 190.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 18.7 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(47.8) write-amplify(22.6) OK, records in: 13870, records dropped: 521 output_compression: NoCompression
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.695653) EVENT_LOG_v1 {"time_micros": 1764324591695640, "job": 34, "event": "compaction_finished", "compaction_time_micros": 96522, "compaction_time_cpu_micros": 49995, "output_level": 6, "num_output_files": 1, "total_output_size": 18386709, "num_input_records": 13870, "num_output_records": 13349, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591695903, "job": 34, "event": "table_file_deletion", "file_number": 59}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591698383, "job": 34, "event": "table_file_deletion", "file_number": 57}
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.596179) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698487) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:09:51Z|00471|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1135745932' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.837 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.908 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:09:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:51.909 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.105 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.107 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11060MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.107 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.108 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.184 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.185 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.185 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.231 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2570637429' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1135745932' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3487273525' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:09:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2713521814' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.711 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.716 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.775 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.778 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:09:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:52.778 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:09:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:53.203 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "format": "json"}]: dispatch
Nov 28 10:09:53 np0005538513.localdomain ceph-mon[292954]: pgmap v527: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.7 MiB/s wr, 283 op/s
Nov 28 10:09:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2713521814' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1704733216' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1704733216' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:53 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:09:53.699 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:09:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e263 do_prune osdmap full prune enabled
Nov 28 10:09:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e264 e264: 6 total, 6 up, 6 in
Nov 28 10:09:54 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e264: 6 total, 6 up, 6 in
Nov 28 10:09:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "format": "json"}]: dispatch
Nov 28 10:09:54 np0005538513.localdomain ceph-mon[292954]: pgmap v528: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.0 MiB/s wr, 232 op/s
Nov 28 10:09:54 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:09:54 np0005538513.localdomain podman[328983]: 2025-11-28 10:09:54.845548463 +0000 UTC m=+0.080624605 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:09:54 np0005538513.localdomain podman[328983]: 2025-11-28 10:09:54.855379415 +0000 UTC m=+0.090455557 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible)
Nov 28 10:09:54 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:09:55 np0005538513.localdomain ceph-mon[292954]: osdmap e264: 6 total, 6 up, 6 in
Nov 28 10:09:55 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/729085219' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "format": "json"}]: dispatch
Nov 28 10:09:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:56.355 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e264 do_prune osdmap full prune enabled
Nov 28 10:09:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:09:56.568 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e265 e265: 6 total, 6 up, 6 in
Nov 28 10:09:56 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e265: 6 total, 6 up, 6 in
Nov 28 10:09:56 np0005538513.localdomain ceph-mon[292954]: pgmap v530: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 151 KiB/s rd, 3.6 MiB/s wr, 211 op/s
Nov 28 10:09:56 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/584414566' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:56 np0005538513.localdomain ceph-mon[292954]: osdmap e265: 6 total, 6 up, 6 in
Nov 28 10:09:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "target_sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/782691183' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:57 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/782691183' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e265 do_prune osdmap full prune enabled
Nov 28 10:09:58 np0005538513.localdomain ceph-mon[292954]: pgmap v532: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 194 KiB/s rd, 3.4 MiB/s wr, 274 op/s
Nov 28 10:09:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "format": "json"}]: dispatch
Nov 28 10:09:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:09:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 e266: 6 total, 6 up, 6 in
Nov 28 10:09:58 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e50: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:09:58 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e266: 6 total, 6 up, 6 in
Nov 28 10:09:58 np0005538513.localdomain podman[329005]: 2025-11-28 10:09:58.864682213 +0000 UTC m=+0.102930692 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:09:58 np0005538513.localdomain podman[329005]: 2025-11-28 10:09:58.876582569 +0000 UTC m=+0.114831258 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:09:58 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:09:59 np0005538513.localdomain ceph-mon[292954]: mgrmap e50: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:09:59 np0005538513.localdomain ceph-mon[292954]: osdmap e266: 6 total, 6 up, 6 in
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.676 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.707 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.707 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9011f07-e60e-48ba-a89d-d0d3cefc1418', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.677590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6671016c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '419097692cfe9929165749bced74accdced2c764cd24e8791e70c16958622380'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.677590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6671136e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'e928876c2606c778b5821f627a7a12b8fbd41aab2cff6afc153872d2de06bdd8'}]}, 'timestamp': '2025-11-28 10:10:00.708248', '_unique_id': '23bd52a48ae6476cbd03af838a69e9c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.710 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.715 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31e0b36c-1b06-459d-871e-51d7bcb25937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.711115', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '6672335c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '414aae5bf3e94d71023af86f035cd9637d055194d9308aee010259272304c28b'}]}, 'timestamp': '2025-11-28 10:10:00.715590', '_unique_id': '0d5485849759441b9af2dcc52889ee55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.717 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab2e34fe-8232-47cd-b0c6-cefd900a0286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.717793', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '66729c8e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '291fddca9e8bb110ac9adc6da21fd96c26126059670d3b300fdadc2e2ca2060c'}]}, 'timestamp': '2025-11-28 10:10:00.718278', '_unique_id': 'fd8fe9e7fb6e4666bde5bf47d3daaf0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.720 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.730 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87b92e0a-9ca6-4aea-a70b-461fae010db9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.720342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667498b8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '812e6acf199b674541656ce843427d983ad6f963eeaf181333cea20ebe8c0896'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.720342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6674a970-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '0c484510df273cb92152fcae5e8f946b4271d40219422462cba26c63498f43eb'}]}, 'timestamp': '2025-11-28 10:10:00.731679', '_unique_id': '09e1cd8e8d72452fab3a99abf0c4d4c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '032e7307-0ab7-4412-a5e6-a84d7fdb5da9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.733885', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '6675120c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '02a92161f158a493d69ebfeb1895bc07024ccef606f2613b03eaeccfc3073bc6'}]}, 'timestamp': '2025-11-28 10:10:00.734389', '_unique_id': '9784fc3e437547f0ad7519899f3f6be5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d62d606-e032-4a16-9b12-9d43b14fcd05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.736449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6675744a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '2a81607ff4610d68136e0f810d12263da45310b7a34c67f7efd9e50310aadd17'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.736449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667584c6-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': 'c767daa6284b97d7d6ac1cfde8f248720233c11a58edff41515dbb6361517f8f'}]}, 'timestamp': '2025-11-28 10:10:00.737295', '_unique_id': 'd7896ec17aca450e8bda61e76c7163bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.739 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.739 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.739 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f272c2ff-7ab4-4890-8bdd-d5cb46ff8d7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.739407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6675e7ea-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '448fe4127945c544adb0ed46b158ffb061c9aa61776feebf12e6ebb1c9e04594'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.739407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6675f884-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '5323c46a0cd66a7192a6f44e40b5bcf9bd867970cbd084c3dcced739d50abba1'}]}, 'timestamp': '2025-11-28 10:10:00.740261', '_unique_id': '9da7d8fbc385468bae7fc37f7820b8ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e488da15-f85f-42d8-81e4-9c30bffb0c65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.742540', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '6676629c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': 'f7d32d2e76725980e937c723e6b11aff75993f570766e3c5f7fe66f8a52cdf52'}]}, 'timestamp': '2025-11-28 10:10:00.743006', '_unique_id': '5215f171cfaf4a7996f742902de4cdba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.745 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.745 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef787469-c1d4-42c4-a691-60d8912cff2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.745382', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '6676d1a0-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': 'c7231e62b4500d3082e887218d929b16f3a8bb79f38c6a9a4f6a1fc35b426b1b'}]}, 'timestamp': '2025-11-28 10:10:00.745846', '_unique_id': '992c64f31472426c9ba1d9a417d7ca2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b334287-72c5-4ea1-a2de-c0642b95ce3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.747887', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '66773474-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '0f39c9b338c3f566e267bbce5a0a4d61e1e951ead1ffacadb995674e56f80332'}]}, 'timestamp': '2025-11-28 10:10:00.748374', '_unique_id': '54c8cae4467840588db6288bc8658a18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80e3f09b-fd19-4b8f-87bc-f17ffac5d061', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.750429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6677964e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '2bf398d9c42f88d7fce2453d34860c58c8e14eb0c7b5163f72901d37d95647ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.750429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6677a724-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'df28661111fab49d3a2664e794afcc62926dcb6fceae553abf30e71c6d28ea47'}]}, 'timestamp': '2025-11-28 10:10:00.751285', '_unique_id': '5a7f0d3df162436a95dd4d8bbf17e183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f1a0ec6-cea2-46b5-8056-3c092ead95c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.753341', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667809bc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': 'ccbc1f3127744f5b3e53986455e78230632e7713a82b688a23fdfdeadc3cb9ac'}]}, 'timestamp': '2025-11-28 10:10:00.753833', '_unique_id': '490804a11b8545b0a5b07a3bcb42fce5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e47a9297-520e-45a6-baab-263d54af9f15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.756008', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667871f4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '33d7553ad018ce0577d0f270abe9175b2303d4ae062424c45aa211eff2800115'}]}, 'timestamp': '2025-11-28 10:10:00.756504', '_unique_id': '8564cb43bd4c4b1e8185c5f955ab2e83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.758 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 18250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '212563e3-bcd1-42ae-8868-add7e1ee6507', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18250000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:10:00.758762', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '667be37a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.94998646, 'message_signature': 'f9988f713a291de4e21eba916626e6e8e4c82de9e932030912a2dc37b313b7a3'}]}, 'timestamp': '2025-11-28 10:10:00.779178', '_unique_id': 'd3936f4c667c4997971f031668ead1cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '471a4ec1-85d0-48d7-8143-9bc999534ca3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:10:00.782089', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '667c747a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.94998646, 'message_signature': 'c5bb31dd042057416d1c26f214e840a183853b021265d03c2d236d44cd0097d6'}]}, 'timestamp': '2025-11-28 10:10:00.782879', '_unique_id': '0ce3e1dd6c6b4dbe8c3ee71474dd3928'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a53c25f-1fdf-4a4c-8151-a2680133bd65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.785730', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667cfb16-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '8beb4c39a40b844d527f87844225ae45c312691579523e4618a10dba6a8faac8'}]}, 'timestamp': '2025-11-28 10:10:00.786304', '_unique_id': '5311457da1f14f9386d91942a4f44ac1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e9eaff2-e92d-4d4d-909a-b0e9e313feb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.788656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667d6c04-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'f629602d9d76724c2e5c2e854dfaa06e6ffd4a887d5c2ea8080b035aea50001d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.788656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667d7d7a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'cebde52f43fbc56cd0a6f783ec6f008e70253ad8ef0309d239da3346cefb2f09'}]}, 'timestamp': '2025-11-28 10:10:00.789602', '_unique_id': 'b8be020b9d17447cb55acf2f96870478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: pgmap v534: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 52 KiB/s wr, 77 op/s
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3825568988' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3825568988' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: overall HEALTH_OK
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b78d2b2e-95b6-4f7a-9148-20eac74d8ba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.791872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667deb02-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '4aadbd540b8d05dfc6f812ffbd72b48a9d74ef44a0395bf59d3bba8af4e0d603'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.791872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667dfb1a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '34e48aeadaf76fa298916d760a258a3824cfe36e0c954a126c3c5df7fa450817'}]}, 'timestamp': '2025-11-28 10:10:00.792890', '_unique_id': '66be5a99b3f346e6afff06ad90d52889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.795 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54162c7b-fff6-439f-9cd9-0ddc1310ec5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.795667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667e7e50-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'ba7981e45ebbe1014637552dd5d93909a8495b2d9f1acb22500cfdabf92f3b31'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.795667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667e8fd0-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '9d7900444cdb7c7ace59be438a6a9213765bddcaa7f222288fc80d1383bfc190'}]}, 'timestamp': '2025-11-28 10:10:00.796623', '_unique_id': 'c0425cb057f94b8f94f6aa75c6d469f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.798 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.799 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f5048e3-2042-46c6-ae4c-f8446b44e9c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.799298', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667f0c08-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '645dc2b0304ca37a6a5d06bdef41aac8e66ee71f133f86bd81728ff8a336ceaa'}]}, 'timestamp': '2025-11-28 10:10:00.799771', '_unique_id': '31976e4151c548f3aac64274407223df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81460575-9d64-4db6-8538-d9e9aba3ef18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.802062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667f75f8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '3adec2fc4bd23868e243f67f12278feed3f5692fefe01b89a1a3c78ede1abcb5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.802062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667f8264-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '7e87dfdb6d455e1b9ee8e0cf9de03bf494762c388b081878b1226f0d3822eb38'}]}, 'timestamp': '2025-11-28 10:10:00.802696', '_unique_id': 'ed6afc7bf9da46cc9ccd2b23dbb85f6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:10:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:00 np0005538513.localdomain podman[329028]: 2025-11-28 10:10:00.856179745 +0000 UTC m=+0.088500307 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 28 10:10:00 np0005538513.localdomain podman[329028]: 2025-11-28 10:10:00.886451698 +0000 UTC m=+0.118772290 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:10:00 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:01.398 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:01.570 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:01.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "format": "json"}]: dispatch
Nov 28 10:10:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:02 np0005538513.localdomain ceph-mon[292954]: pgmap v535: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 128 KiB/s wr, 160 op/s
Nov 28 10:10:03 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:10:03Z|00472|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:10:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:03.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:10:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:10:04 np0005538513.localdomain podman[329046]: 2025-11-28 10:10:04.842842214 +0000 UTC m=+0.082844853 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:10:04 np0005538513.localdomain ceph-mon[292954]: pgmap v536: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 100 KiB/s rd, 115 KiB/s wr, 144 op/s
Nov 28 10:10:04 np0005538513.localdomain podman[329046]: 2025-11-28 10:10:04.853761011 +0000 UTC m=+0.093763640 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute)
Nov 28 10:10:04 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:10:04 np0005538513.localdomain podman[329047]: 2025-11-28 10:10:04.917186625 +0000 UTC m=+0.149673193 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:10:04 np0005538513.localdomain podman[329047]: 2025-11-28 10:10:04.989551455 +0000 UTC m=+0.222038023 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:10:05 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:05 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:06.429 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 do_prune osdmap full prune enabled
Nov 28 10:10:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:06.575 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e267 e267: 6 total, 6 up, 6 in
Nov 28 10:10:06 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e267: 6 total, 6 up, 6 in
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0)
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: pgmap v537: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 99 KiB/s wr, 124 op/s
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: osdmap e267: 6 total, 6 up, 6 in
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Nov 28 10:10:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Nov 28 10:10:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 28 10:10:09 np0005538513.localdomain ceph-mon[292954]: pgmap v539: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 123 KiB/s wr, 81 op/s
Nov 28 10:10:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:10:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:10:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:10:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:10:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:10:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19275 "" "Go-http-client/1.1"
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:10 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:11.468 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:11.577 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538513.localdomain ceph-mon[292954]: pgmap v540: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 112 KiB/s wr, 74 op/s
Nov 28 10:10:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295_c90956cb-9363-45be-b7cc-986dc86d427c", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "format": "json"}]: dispatch
Nov 28 10:10:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: pgmap v541: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 122 KiB/s wr, 13 op/s
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0)
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: pgmap v542: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 122 KiB/s wr, 13 op/s
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Nov 28 10:10:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:10:14 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:10:14 np0005538513.localdomain podman[329090]: 2025-11-28 10:10:14.855246981 +0000 UTC m=+0.089796328 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:10:14 np0005538513.localdomain podman[329090]: 2025-11-28 10:10:14.866406885 +0000 UTC m=+0.100956192 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:10:14 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:10:14 np0005538513.localdomain podman[329091]: 2025-11-28 10:10:14.950064012 +0000 UTC m=+0.178789920 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:10:14 np0005538513.localdomain podman[329091]: 2025-11-28 10:10:14.960551586 +0000 UTC m=+0.189277464 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:10:14 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:10:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e267 do_prune osdmap full prune enabled
Nov 28 10:10:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 e268: 6 total, 6 up, 6 in
Nov 28 10:10:15 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e268: 6 total, 6 up, 6 in
Nov 28 10:10:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:16.499 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:16.580 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:16 np0005538513.localdomain ceph-mon[292954]: osdmap e268: 6 total, 6 up, 6 in
Nov 28 10:10:16 np0005538513.localdomain ceph-mon[292954]: pgmap v544: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 989 B/s rd, 131 KiB/s wr, 14 op/s
Nov 28 10:10:17 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:10:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:10:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:10:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:10:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0)
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: pgmap v545: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 125 KiB/s wr, 14 op/s
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:20 np0005538513.localdomain ceph-mon[292954]: pgmap v546: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 125 KiB/s wr, 14 op/s
Nov 28 10:10:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:21.526 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 do_prune osdmap full prune enabled
Nov 28 10:10:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:21.584 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e269 e269: 6 total, 6 up, 6 in
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e269: 6 total, 6 up, 6 in
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3_22da53bc-1abc-41d9-a746-84c48ea05590", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:21 np0005538513.localdomain ceph-mon[292954]: osdmap e269: 6 total, 6 up, 6 in
Nov 28 10:10:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e269 do_prune osdmap full prune enabled
Nov 28 10:10:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 e270: 6 total, 6 up, 6 in
Nov 28 10:10:22 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e270: 6 total, 6 up, 6 in
Nov 28 10:10:23 np0005538513.localdomain ceph-mon[292954]: pgmap v548: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 141 KiB/s wr, 15 op/s
Nov 28 10:10:23 np0005538513.localdomain ceph-mon[292954]: osdmap e270: 6 total, 6 up, 6 in
Nov 28 10:10:25 np0005538513.localdomain ceph-mon[292954]: pgmap v550: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 141 KiB/s wr, 15 op/s
Nov 28 10:10:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:25 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:10:25 np0005538513.localdomain podman[329132]: 2025-11-28 10:10:25.841639459 +0000 UTC m=+0.078835360 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41)
Nov 28 10:10:25 np0005538513.localdomain podman[329132]: 2025-11-28 10:10:25.862368827 +0000 UTC m=+0.099564728 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Nov 28 10:10:25 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:10:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:26.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:26.587 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "format": "json"}]: dispatch
Nov 28 10:10:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "format": "json"}]: dispatch
Nov 28 10:10:27 np0005538513.localdomain ceph-mon[292954]: pgmap v551: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 73 KiB/s wr, 6 op/s
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: pgmap v552: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 128 KiB/s wr, 11 op/s
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "format": "json"}]: dispatch
Nov 28 10:10:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:10:29 np0005538513.localdomain podman[329152]: 2025-11-28 10:10:29.85109607 +0000 UTC m=+0.088087845 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:10:29 np0005538513.localdomain podman[329152]: 2025-11-28 10:10:29.862408149 +0000 UTC m=+0.099399934 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:10:29 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:10:29 np0005538513.localdomain sudo[329176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:10:29 np0005538513.localdomain sudo[329176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:10:29 np0005538513.localdomain sudo[329176]: pam_unix(sudo:session): session closed for user root
Nov 28 10:10:30 np0005538513.localdomain sudo[329194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:10:30 np0005538513.localdomain sudo[329194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:10:30 np0005538513.localdomain sudo[329194]: pam_unix(sudo:session): session closed for user root
Nov 28 10:10:30 np0005538513.localdomain ceph-mon[292954]: pgmap v553: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 862 B/s rd, 123 KiB/s wr, 11 op/s
Nov 28 10:10:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "format": "json"}]: dispatch
Nov 28 10:10:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:10:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:10:30 np0005538513.localdomain sudo[329243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:10:30 np0005538513.localdomain sudo[329243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:10:30 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:10:30 np0005538513.localdomain sudo[329243]: pam_unix(sudo:session): session closed for user root
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:10:31 np0005538513.localdomain podman[329261]: 2025-11-28 10:10:31.031970306 +0000 UTC m=+0.078667454 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:10:31 np0005538513.localdomain podman[329261]: 2025-11-28 10:10:31.0664978 +0000 UTC m=+0.113195018 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:10:31 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 do_prune osdmap full prune enabled
Nov 28 10:10:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:31.588 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:10:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:31.590 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 e271: 6 total, 6 up, 6 in
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e271: 6 total, 6 up, 6 in
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: osdmap e271: 6 total, 6 up, 6 in
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "tenant_id": "a65552de119e4309a43e9e85b3f7e533", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: pgmap v555: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 441 B/s rd, 100 KiB/s wr, 8 op/s
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "target_sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:10:34 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:10:34Z|00473|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 10:10:34 np0005538513.localdomain ceph-mon[292954]: pgmap v556: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 93 KiB/s wr, 7 op/s
Nov 28 10:10:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:10:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:10:35 np0005538513.localdomain podman[329280]: 2025-11-28 10:10:35.50007341 +0000 UTC m=+0.093980067 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:10:35 np0005538513.localdomain podman[329280]: 2025-11-28 10:10:35.510512132 +0000 UTC m=+0.104418749 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:10:35 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:10:35 np0005538513.localdomain podman[329281]: 2025-11-28 10:10:35.600889597 +0000 UTC m=+0.192350048 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:10:35 np0005538513.localdomain podman[329281]: 2025-11-28 10:10:35.707629626 +0000 UTC m=+0.299090067 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:10:35 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:10:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:35.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:10:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:35.791 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:10:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:36.591 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:36.593 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:37 np0005538513.localdomain ceph-mon[292954]: pgmap v557: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 93 KiB/s wr, 7 op/s
Nov 28 10:10:39 np0005538513.localdomain ceph-mon[292954]: pgmap v558: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 97 KiB/s wr, 8 op/s
Nov 28 10:10:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:10:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:10:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:10:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:10:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:10:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19283 "" "Go-http-client/1.1"
Nov 28 10:10:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:10:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:10:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} v 0)
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"}]': finished
Nov 28 10:10:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:41.594 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:10:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:41.621 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:10:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:41.622 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:10:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:41.622 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:10:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:41.623 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:41.623 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:10:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:41.625 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: pgmap v559: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 97 KiB/s wr, 8 op/s
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"}]': finished
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: pgmap v560: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 695 B/s rd, 193 KiB/s wr, 16 op/s
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:44 np0005538513.localdomain sshd[329325]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:10:44 np0005538513.localdomain ceph-mon[292954]: pgmap v561: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 596 B/s rd, 125 KiB/s wr, 10 op/s
Nov 28 10:10:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:45 np0005538513.localdomain sshd[329325]: Received disconnect from 193.46.255.33 port 41566:11:  [preauth]
Nov 28 10:10:45 np0005538513.localdomain sshd[329325]: Disconnected from authenticating user root 193.46.255.33 port 41566 [preauth]
Nov 28 10:10:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:10:45 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:10:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:10:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:10:45 np0005538513.localdomain podman[329327]: 2025-11-28 10:10:45.124663909 +0000 UTC m=+0.089133356 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:10:45 np0005538513.localdomain podman[329327]: 2025-11-28 10:10:45.16589892 +0000 UTC m=+0.130368407 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:10:45 np0005538513.localdomain podman[329328]: 2025-11-28 10:10:45.176098335 +0000 UTC m=+0.136684193 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:10:45 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:10:45 np0005538513.localdomain podman[329328]: 2025-11-28 10:10:45.191629233 +0000 UTC m=+0.152215101 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:10:45 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:10:45 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:45.790 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.626 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.628 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:10:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.658 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.659 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:46.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:10:47 np0005538513.localdomain ceph-mon[292954]: pgmap v562: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 125 KiB/s wr, 10 op/s
Nov 28 10:10:47 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 28 10:10:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:10:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:10:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:10:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "format": "json"}]: dispatch
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:48 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:49 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538513.localdomain ceph-mon[292954]: pgmap v563: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 167 KiB/s wr, 16 op/s
Nov 28 10:10:49 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05", "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:10:49.587 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:10:49 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:10:49.589 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:10:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:49.590 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:49.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:50.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:10:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:50.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:10:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:50.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:10:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:50.791 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:10:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:50.791 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:10:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:10:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:10:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:10:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:10:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:10:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: pgmap v564: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 128 KiB/s wr, 12 op/s
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1097277396' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.299 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.360 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.360 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.597 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.599 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11049MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.599 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.600 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:10:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.688 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.816 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.887 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.938 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.938 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.952 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:10:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:51.971 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:10:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:52.017 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1097277396' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:10:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4265101259' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:52.486 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:10:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:52.492 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:10:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:52.513 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:10:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:52.516 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:10:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:52.516 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:10:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538513.localdomain ceph-mon[292954]: pgmap v565: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 205 KiB/s wr, 18 op/s
Nov 28 10:10:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4265101259' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.513 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.514 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.514 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.515 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.637 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.637 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.638 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:10:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:53.638 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:10:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:54.253 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:10:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1858069606' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:54.269 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:10:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:54.269 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:10:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:54.269 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: pgmap v566: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 120 KiB/s wr, 11 op/s
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2557878891' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:55 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:55 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:10:55.592 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:10:56 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:56.714 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:10:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:56.715 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:56.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:10:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:56.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:10:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:56.717 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:10:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:10:56.719 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:56 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:10:56 np0005538513.localdomain podman[329413]: 2025-11-28 10:10:56.846103828 +0000 UTC m=+0.079457620 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 10:10:56 np0005538513.localdomain podman[329413]: 2025-11-28 10:10:56.885634636 +0000 UTC m=+0.118988448 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:10:56 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:10:57 np0005538513.localdomain ceph-mon[292954]: pgmap v567: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 120 KiB/s wr, 11 op/s
Nov 28 10:10:57 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/646999688' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:58 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2010999762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:10:58 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:58 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f288b089-d265-4559-9acd-a03615016513", "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: pgmap v568: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 171 KiB/s wr, 16 op/s
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace_609d618f-0ee5-41e8-beae-89d616159cd4", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 do_prune osdmap full prune enabled
Nov 28 10:11:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 e272: 6 total, 6 up, 6 in
Nov 28 10:11:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e272: 6 total, 6 up, 6 in
Nov 28 10:11:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:11:00 np0005538513.localdomain podman[329434]: 2025-11-28 10:11:00.842186898 +0000 UTC m=+0.078846640 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:11:00 np0005538513.localdomain podman[329434]: 2025-11-28 10:11:00.850126023 +0000 UTC m=+0.086785725 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:11:00 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:11:01 np0005538513.localdomain ceph-mon[292954]: pgmap v569: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 128 KiB/s wr, 10 op/s
Nov 28 10:11:01 np0005538513.localdomain ceph-mon[292954]: osdmap e272: 6 total, 6 up, 6 in
Nov 28 10:11:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:01 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:01 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:01.720 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:01.722 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:01.722 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:01.723 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:01.758 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:01.759 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:11:01 np0005538513.localdomain podman[329456]: 2025-11-28 10:11:01.847774583 +0000 UTC m=+0.072102793 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:11:01 np0005538513.localdomain podman[329456]: 2025-11-28 10:11:01.857504812 +0000 UTC m=+0.081833052 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:11:01 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:03 np0005538513.localdomain ceph-mon[292954]: pgmap v571: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "format": "json"}]: dispatch
Nov 28 10:11:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:04 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:04.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:05 np0005538513.localdomain ceph-mon[292954]: pgmap v572: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:05 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:11:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:11:05 np0005538513.localdomain podman[329475]: 2025-11-28 10:11:05.860254808 +0000 UTC m=+0.093639746 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:11:05 np0005538513.localdomain podman[329475]: 2025-11-28 10:11:05.870531835 +0000 UTC m=+0.103916773 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:11:05 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:11:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:05.944 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:05 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:05.945 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:11:05 np0005538513.localdomain systemd[1]: tmp-crun.7l67Fa.mount: Deactivated successfully.
Nov 28 10:11:05 np0005538513.localdomain podman[329476]: 2025-11-28 10:11:05.958453423 +0000 UTC m=+0.188823698 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 10:11:06 np0005538513.localdomain podman[329476]: 2025-11-28 10:11:06.027510262 +0000 UTC m=+0.257880557 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 10:11:06 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 do_prune osdmap full prune enabled
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 e273: 6 total, 6 up, 6 in
Nov 28 10:11:06 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e273: 6 total, 6 up, 6 in
Nov 28 10:11:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:06.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: pgmap v573: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: osdmap e273: 6 total, 6 up, 6 in
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:08 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:08 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:09 np0005538513.localdomain ceph-mon[292954]: pgmap v575: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 199 KiB/s wr, 18 op/s
Nov 28 10:11:09 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:09 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:11:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:11:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:11:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:11:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:11:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19273 "" "Go-http-client/1.1"
Nov 28 10:11:10 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:11:11 np0005538513.localdomain ceph-mon[292954]: pgmap v576: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 167 KiB/s wr, 15 op/s
Nov 28 10:11:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:11.763 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:11.766 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:11.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:11.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:11.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:11 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:11.804 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: pgmap v577: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:13 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "format": "json"}]: dispatch
Nov 28 10:11:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/281311810' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:11:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/281311810' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:11:14 np0005538513.localdomain ceph-mon[292954]: pgmap v578: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:11:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:15 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:15 np0005538513.localdomain podman[329519]: 2025-11-28 10:11:15.852906018 +0000 UTC m=+0.088862799 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:11:15 np0005538513.localdomain podman[329519]: 2025-11-28 10:11:15.860628886 +0000 UTC m=+0.096585647 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:11:15 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:11:15 np0005538513.localdomain podman[329520]: 2025-11-28 10:11:15.951800626 +0000 UTC m=+0.183951730 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:11:15 np0005538513.localdomain podman[329520]: 2025-11-28 10:11:15.965370963 +0000 UTC m=+0.197522067 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:11:15 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:16.805 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:16.835 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:16.835 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:16.835 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:16.837 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:16 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:16.838 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: pgmap v579: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:11:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:11:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:11:18 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:11:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:18 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: pgmap v580: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 640 B/s rd, 170 KiB/s wr, 14 op/s
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:19 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:21 np0005538513.localdomain ceph-mon[292954]: pgmap v581: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 158 KiB/s wr, 13 op/s
Nov 28 10:11:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:21.838 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:21.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:21.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:21.841 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:21.860 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:21.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:22 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Nov 28 10:11:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:22 np0005538513.localdomain sshd[329563]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:11:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: pgmap v582: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 209 KiB/s wr, 18 op/s
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:25 np0005538513.localdomain ceph-mon[292954]: pgmap v583: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 143 KiB/s wr, 12 op/s
Nov 28 10:11:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "format": "json"}]: dispatch
Nov 28 10:11:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:25 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:11:25 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:25 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:26.862 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:26.864 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:26.864 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:26.864 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:26.886 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:26.887 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: pgmap v584: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 143 KiB/s wr, 12 op/s
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05_4a989d3d-660f-4fdb-879a-341eac23ba00", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538513.localdomain sshd[329563]: Connection reset by 205.210.31.108 port 63012 [preauth]
Nov 28 10:11:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:11:27 np0005538513.localdomain podman[329565]: 2025-11-28 10:11:27.718949062 +0000 UTC m=+0.080839452 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:11:27 np0005538513.localdomain podman[329565]: 2025-11-28 10:11:27.731291173 +0000 UTC m=+0.093181563 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 28 10:11:27 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:11:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: pgmap v585: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 241 KiB/s wr, 20 op/s
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:30.064 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:11:30 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:30.065 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:30 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:30.066 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.222447) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690222787, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2161, "num_deletes": 255, "total_data_size": 2093514, "memory_usage": 2159360, "flush_reason": "Manual Compaction"}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690237534, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 2054274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34077, "largest_seqno": 36237, "table_properties": {"data_size": 2044761, "index_size": 5574, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25919, "raw_average_key_size": 22, "raw_value_size": 2023509, "raw_average_value_size": 1764, "num_data_blocks": 240, "num_entries": 1147, "num_filter_entries": 1147, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324591, "oldest_key_time": 1764324591, "file_creation_time": 1764324690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 14878 microseconds, and 6426 cpu microseconds.
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.237589) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 2054274 bytes OK
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.237614) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.239943) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.239968) EVENT_LOG_v1 {"time_micros": 1764324690239961, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.239992) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 2083220, prev total WAL file size 2083220, number of live WAL files 2.
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.241145) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(2006KB)], [60(17MB)]
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690241200, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 20440983, "oldest_snapshot_seqno": -1}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 13959 keys, 18920248 bytes, temperature: kUnknown
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690352819, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 18920248, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18839354, "index_size": 44889, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34949, "raw_key_size": 372388, "raw_average_key_size": 26, "raw_value_size": 18601214, "raw_average_value_size": 1332, "num_data_blocks": 1690, "num_entries": 13959, "num_filter_entries": 13959, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.353217) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 18920248 bytes
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.355220) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.9 rd, 169.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 17.5 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(19.2) write-amplify(9.2) OK, records in: 14496, records dropped: 537 output_compression: NoCompression
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.355250) EVENT_LOG_v1 {"time_micros": 1764324690355236, "job": 36, "event": "compaction_finished", "compaction_time_micros": 111759, "compaction_time_cpu_micros": 53091, "output_level": 6, "num_output_files": 1, "total_output_size": 18920248, "num_input_records": 14496, "num_output_records": 13959, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690355662, "job": 36, "event": "table_file_deletion", "file_number": 62}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690358241, "job": 36, "event": "table_file_deletion", "file_number": 60}
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.241000) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:31 np0005538513.localdomain sudo[329584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:11:31 np0005538513.localdomain sudo[329584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:11:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:11:31 np0005538513.localdomain sudo[329584]: pam_unix(sudo:session): session closed for user root
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 do_prune osdmap full prune enabled
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 e274: 6 total, 6 up, 6 in
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e274: 6 total, 6 up, 6 in
Nov 28 10:11:31 np0005538513.localdomain podman[329601]: 2025-11-28 10:11:31.280002979 +0000 UTC m=+0.064916121 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:11:31 np0005538513.localdomain podman[329601]: 2025-11-28 10:11:31.287874792 +0000 UTC m=+0.072787934 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:11:31 np0005538513.localdomain sudo[329608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:11:31 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:11:31 np0005538513.localdomain sudo[329608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "format": "json"}]: dispatch
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: pgmap v586: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 149 KiB/s wr, 13 op/s
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: osdmap e274: 6 total, 6 up, 6 in
Nov 28 10:11:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:31 np0005538513.localdomain sudo[329608]: pam_unix(sudo:session): session closed for user root
Nov 28 10:11:31 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:31.924 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:11:32 np0005538513.localdomain sudo[329675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:11:32 np0005538513.localdomain sudo[329675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:11:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:11:32 np0005538513.localdomain sudo[329675]: pam_unix(sudo:session): session closed for user root
Nov 28 10:11:32 np0005538513.localdomain podman[329693]: 2025-11-28 10:11:32.298947785 +0000 UTC m=+0.083339099 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:11:32 np0005538513.localdomain podman[329693]: 2025-11-28 10:11:32.309407978 +0000 UTC m=+0.093799252 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:11:32 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: pgmap v588: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:11:34 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:34 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: pgmap v589: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:11:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:36.067 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "format": "json"}]: dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:11:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.763087) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696763133, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 425, "num_deletes": 259, "total_data_size": 187094, "memory_usage": 196712, "flush_reason": "Manual Compaction"}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696766850, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 185011, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36238, "largest_seqno": 36662, "table_properties": {"data_size": 182545, "index_size": 513, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6728, "raw_average_key_size": 18, "raw_value_size": 177115, "raw_average_value_size": 498, "num_data_blocks": 23, "num_entries": 355, "num_filter_entries": 355, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324690, "oldest_key_time": 1764324690, "file_creation_time": 1764324696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 3817 microseconds, and 1397 cpu microseconds.
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.766901) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 185011 bytes OK
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.766926) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.770518) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.770547) EVENT_LOG_v1 {"time_micros": 1764324696770539, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.770568) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 184316, prev total WAL file size 193115, number of live WAL files 2.
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.771242) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323734' seq:72057594037927935, type:22 .. '6C6F676D0034353239' seq:0, type:0; will stop at (end)
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(180KB)], [63(18MB)]
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696771309, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19105259, "oldest_snapshot_seqno": -1}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 13773 keys, 18715383 bytes, temperature: kUnknown
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696873585, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 18715383, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18635881, "index_size": 43945, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34501, "raw_key_size": 369629, "raw_average_key_size": 26, "raw_value_size": 18401113, "raw_average_value_size": 1336, "num_data_blocks": 1643, "num_entries": 13773, "num_filter_entries": 13773, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.873955) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 18715383 bytes
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.876122) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.6 rd, 182.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 18.0 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(204.4) write-amplify(101.2) OK, records in: 14314, records dropped: 541 output_compression: NoCompression
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.876151) EVENT_LOG_v1 {"time_micros": 1764324696876139, "job": 38, "event": "compaction_finished", "compaction_time_micros": 102371, "compaction_time_cpu_micros": 53981, "output_level": 6, "num_output_files": 1, "total_output_size": 18715383, "num_input_records": 14314, "num_output_records": 13773, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696876318, "job": 38, "event": "table_file_deletion", "file_number": 65}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696879174, "job": 38, "event": "table_file_deletion", "file_number": 63}
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.771010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879330) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879349) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538513.localdomain podman[329711]: 2025-11-28 10:11:36.901040296 +0000 UTC m=+0.132494074 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 10:11:36 np0005538513.localdomain podman[329711]: 2025-11-28 10:11:36.915401928 +0000 UTC m=+0.146855686 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Nov 28 10:11:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:36.927 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:36 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:11:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:36.929 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:36.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:36.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:36.967 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:36.968 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:36 np0005538513.localdomain systemd[1]: tmp-crun.FErQAK.mount: Deactivated successfully.
Nov 28 10:11:37 np0005538513.localdomain podman[329712]: 2025-11-28 10:11:37.00177468 +0000 UTC m=+0.230324658 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:11:37 np0005538513.localdomain podman[329712]: 2025-11-28 10:11:37.040417251 +0000 UTC m=+0.268967189 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 10:11:37 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:11:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e51: np0005538515.yfkzhl(active, since 13m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:11:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:37 np0005538513.localdomain ceph-mon[292954]: pgmap v590: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:37 np0005538513.localdomain ceph-mon[292954]: mgrmap e51: np0005538515.yfkzhl(active, since 13m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: pgmap v591: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 210 KiB/s wr, 17 op/s
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:11:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:11:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:11:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:11:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:11:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:11:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19273 "" "Go-http-client/1.1"
Nov 28 10:11:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:41 np0005538513.localdomain ceph-mon[292954]: pgmap v592: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 210 KiB/s wr, 17 op/s
Nov 28 10:11:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 do_prune osdmap full prune enabled
Nov 28 10:11:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e275 e275: 6 total, 6 up, 6 in
Nov 28 10:11:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e275: 6 total, 6 up, 6 in
Nov 28 10:11:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:41.970 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:41.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:41.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:41.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:41.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:41 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:41.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e_d5b0525c-04f8-4742-a7f8-f2e936812088", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: osdmap e275: 6 total, 6 up, 6 in
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:43 np0005538513.localdomain ceph-mon[292954]: pgmap v594: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 189 KiB/s wr, 16 op/s
Nov 28 10:11:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:44 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e275 do_prune osdmap full prune enabled
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 e276: 6 total, 6 up, 6 in
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e276: 6 total, 6 up, 6 in
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: pgmap v595: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 189 KiB/s wr, 16 op/s
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: osdmap e276: 6 total, 6 up, 6 in
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:11:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:11:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:46 np0005538513.localdomain podman[329755]: 2025-11-28 10:11:46.851319531 +0000 UTC m=+0.088920941 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:11:46 np0005538513.localdomain podman[329755]: 2025-11-28 10:11:46.888383463 +0000 UTC m=+0.125984853 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:11:46 np0005538513.localdomain podman[329756]: 2025-11-28 10:11:46.903115646 +0000 UTC m=+0.134821775 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 28 10:11:46 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:11:46 np0005538513.localdomain podman[329756]: 2025-11-28 10:11:46.918441889 +0000 UTC m=+0.150148008 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:11:46 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:11:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:46.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:46.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:11:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:46.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:11:46 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:46.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:47.039 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:47.040 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: pgmap v597: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 89 KiB/s wr, 8 op/s
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:11:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:11:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:11:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:11:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:11:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:11:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:48.414 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:48.414 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:48.415 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:48.415 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:11:48 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:48 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "format": "json"}]: dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: pgmap v598: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 222 KiB/s wr, 20 op/s
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9", "format": "json"}]: dispatch
Nov 28 10:11:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:11:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:50.849 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:11:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:50.849 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:11:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:51.500 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:51 np0005538513.localdomain ceph-mon[292954]: pgmap v599: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 626 B/s rd, 130 KiB/s wr, 12 op/s
Nov 28 10:11:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4", "format": "json"}]: dispatch
Nov 28 10:11:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 do_prune osdmap full prune enabled
Nov 28 10:11:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e277 e277: 6 total, 6 up, 6 in
Nov 28 10:11:51 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e277: 6 total, 6 up, 6 in
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.041 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.043 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: osdmap e277: 6 total, 6 up, 6 in
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: pgmap v601: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 233 KiB/s wr, 20 op/s
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.794 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:11:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:52.795 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:52 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2162333142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.244 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.317 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.317 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.544 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.546 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11036MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.546 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.547 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.624 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.625 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.625 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:11:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:53.703 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2162333142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:11:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2922934843' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:54.186 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:11:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:54.193 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:11:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:54.210 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:11:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:54.213 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:11:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:54.214 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:11:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:54.304 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:54 np0005538513.localdomain ceph-mon[292954]: pgmap v602: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 858 B/s rd, 223 KiB/s wr, 19 op/s
Nov 28 10:11:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2922934843' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4_ff125526-c3a4-469c-b6b2-27c868491137", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1945970975' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:55.215 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:55.216 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:11:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:55.216 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:11:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:56.161 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:11:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:56.161 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:11:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:56.162 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:11:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:56.162 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:11:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:56 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:56 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: pgmap v603: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 186 KiB/s wr, 16 op/s
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/515171710' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:57.071 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:11:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:57.088 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:57.102 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:11:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:57.103 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:11:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:57.104 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854_1a1adf00-c956-42a5-9ff9-ac2b4556603f", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:11:58 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:11:58.770 261084 INFO neutron.agent.linux.ip_lib [None req-c4fa7785-c232-40f8-a7a8-52415563ca7c - - - - - -] Device tapf3135b0f-56 cannot be used as it has no MAC address
Nov 28 10:11:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:58.827 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:58 np0005538513.localdomain kernel: device tapf3135b0f-56 entered promiscuous mode
Nov 28 10:11:58 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324718.8404] manager: (tapf3135b0f-56): new Generic device (/org/freedesktop/NetworkManager/Devices/75)
Nov 28 10:11:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:11:58Z|00474|binding|INFO|Claiming lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d for this chassis.
Nov 28 10:11:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:11:58Z|00475|binding|INFO|f3135b0f-56fc-476c-b4e5-c9e5a120aa9d: Claiming unknown
Nov 28 10:11:58 np0005538513.localdomain podman[329844]: 2025-11-28 10:11:58.843514026 +0000 UTC m=+0.139385035 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 10:11:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:58.848 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:58 np0005538513.localdomain systemd-udevd[329870]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:11:58 np0005538513.localdomain podman[329844]: 2025-11-28 10:11:58.86470125 +0000 UTC m=+0.160572289 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Nov 28 10:11:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:11:58Z|00476|binding|INFO|Setting lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d ovn-installed in OVS
Nov 28 10:11:58 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:11:58Z|00477|binding|INFO|Setting lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d up in Southbound
Nov 28 10:11:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:58.868 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '729a918c0a8248ff9fef91d8e41e340a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10c622-42c8-4119-8cc0-4b51720ab1bd, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=f3135b0f-56fc-476c-b4e5-c9e5a120aa9d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:11:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:58.871 158130 INFO neutron.agent.ovn.metadata.agent [-] Port f3135b0f-56fc-476c-b4e5-c9e5a120aa9d in datapath 1bac4260-cc00-4d23-940f-68536ef7d308 bound to our chassis
Nov 28 10:11:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:58.873 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 875e6dcc-a188-4fba-a15d-1fa76107968e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:11:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:58.873 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1bac4260-cc00-4d23-940f-68536ef7d308, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:11:58 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:11:58.875 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee0c22e-bb81-485e-ba6e-24f1a03aa398]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:11:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:58.877 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:58 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:11:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:58.926 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:11:58.944 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:59 np0005538513.localdomain ceph-mon[292954]: pgmap v604: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 179 KiB/s wr, 14 op/s
Nov 28 10:11:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:11:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:11:59 np0005538513.localdomain podman[329924]: 
Nov 28 10:11:59 np0005538513.localdomain podman[329924]: 2025-11-28 10:11:59.812567486 +0000 UTC m=+0.093217633 container create c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:11:59 np0005538513.localdomain systemd[1]: Started libpod-conmon-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998.scope.
Nov 28 10:11:59 np0005538513.localdomain podman[329924]: 2025-11-28 10:11:59.767976822 +0000 UTC m=+0.048626999 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:11:59 np0005538513.localdomain systemd[1]: tmp-crun.bB7JiY.mount: Deactivated successfully.
Nov 28 10:11:59 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:11:59 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53803e0e67ab6fc96360e1bca7f403c3373d728f0094543b1385d739dc6ce18a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:11:59 np0005538513.localdomain podman[329924]: 2025-11-28 10:11:59.891166978 +0000 UTC m=+0.171817125 container init c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:11:59 np0005538513.localdomain podman[329924]: 2025-11-28 10:11:59.900284809 +0000 UTC m=+0.180934956 container start c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:11:59 np0005538513.localdomain dnsmasq[329943]: started, version 2.85 cachesize 150
Nov 28 10:11:59 np0005538513.localdomain dnsmasq[329943]: DNS service limited to local subnets
Nov 28 10:11:59 np0005538513.localdomain dnsmasq[329943]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:11:59 np0005538513.localdomain dnsmasq[329943]: warning: no upstream servers configured
Nov 28 10:11:59 np0005538513.localdomain dnsmasq-dhcp[329943]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:11:59 np0005538513.localdomain dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 0 addresses
Nov 28 10:11:59 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host
Nov 28 10:11:59 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts
Nov 28 10:11:59 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:11:59.965 261084 INFO neutron.agent.dhcp.agent [None req-8f135ad8-bb8b-4fce-a1d1-2ae3b356bcc8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:11:59Z, description=, device_id=c2d60d6f-83fc-4648-964a-020aeb44c54e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6ee2ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6407160>], id=bcda3fd1-8481-4078-8919-c9d7f3e1c8a6, ip_allocation=immediate, mac_address=fa:16:3e:a4:e8:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:11:56Z, description=, dns_domain=, id=1bac4260-cc00-4d23-940f-68536ef7d308, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1059272471-network, port_security_enabled=True, project_id=729a918c0a8248ff9fef91d8e41e340a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40620, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3688, status=ACTIVE, subnets=['a3e77d57-25ec-4c92-b11d-6f3e73fc2f1e'], tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:57Z, vlan_transparent=None, network_id=1bac4260-cc00-4d23-940f-68536ef7d308, port_security_enabled=False, project_id=729a918c0a8248ff9fef91d8e41e340a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3696, status=DOWN, tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:59Z on network 1bac4260-cc00-4d23-940f-68536ef7d308
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1145513864' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2356686647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:00.141 261084 INFO neutron.agent.dhcp.agent [None req-7f83153c-7f78-4107-98a3-b8ea12f1b2bb - - - - - -] DHCP configuration for ports {'d81fcc1a-8a98-4599-b518-924ca188dfd4'} is completed
Nov 28 10:12:00 np0005538513.localdomain dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 1 addresses
Nov 28 10:12:00 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host
Nov 28 10:12:00 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts
Nov 28 10:12:00 np0005538513.localdomain podman[329961]: 2025-11-28 10:12:00.321468807 +0000 UTC m=+0.063237860 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:12:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:00.603 261084 INFO neutron.agent.dhcp.agent [None req-a33d74a0-428d-4911-848e-e73f0d9734f7 - - - - - -] DHCP configuration for ports {'bcda3fd1-8481-4078-8919-c9d7f3e1c8a6'} is completed
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.676 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.705 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.705 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8222651-4b3f-4f2d-9ce0-38e10022865c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.677867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adf74582-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'eb539f1025d60ed666c71b324c9066b67e062c8b67322c5a1f56fbe778e5134f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.677867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adf75d88-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'cea35c50aba4eae65689a9dd0d36142a335f389c146cdde2dbb86a143ed1e898'}]}, 'timestamp': '2025-11-28 10:12:00.706520', '_unique_id': 'bdb0e93d7bc7434886ac353dd079bb58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.709 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:00.721 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:11:59Z, description=, device_id=c2d60d6f-83fc-4648-964a-020aeb44c54e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6694b80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6694640>], id=bcda3fd1-8481-4078-8919-c9d7f3e1c8a6, ip_allocation=immediate, mac_address=fa:16:3e:a4:e8:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:11:56Z, description=, dns_domain=, id=1bac4260-cc00-4d23-940f-68536ef7d308, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1059272471-network, port_security_enabled=True, project_id=729a918c0a8248ff9fef91d8e41e340a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40620, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3688, status=ACTIVE, subnets=['a3e77d57-25ec-4c92-b11d-6f3e73fc2f1e'], tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:57Z, vlan_transparent=None, network_id=1bac4260-cc00-4d23-940f-68536ef7d308, port_security_enabled=False, project_id=729a918c0a8248ff9fef91d8e41e340a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3696, status=DOWN, tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:59Z on network 1bac4260-cc00-4d23-940f-68536ef7d308
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.728 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 18870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2cffca0-664d-4aff-8554-a281e8f39ff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18870000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:12:00.710100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'adfae2d2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.900646917, 'message_signature': 'a723426fe68d7ea7f0f3229e9877c5d517c184e52713a96c68be562f158c6a12'}]}, 'timestamp': '2025-11-28 10:12:00.729632', '_unique_id': '0ca08b7e0e24427390c2e2216371b462'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.732 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.735 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '908811aa-4974-4c23-b4bf-ee21e6f8c6d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.732450', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfbd8f4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'f3c1b7ed1afa3c39e0ee274293a30e3f120e32b6a5a920162c07b841c761efa4'}]}, 'timestamp': '2025-11-28 10:12:00.735916', '_unique_id': 'ed664a15af2d4286a5557f69a71e01c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.738 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.738 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b72240d-7ab5-416b-8750-0886ecf43113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.738623', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfc55c2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': '770e5e609bba2d4ceea668c4490198db0c5046e6aa500ad898a86def9490bd07'}]}, 'timestamp': '2025-11-28 10:12:00.739161', '_unique_id': '8b2ddb0ec78144c493108e992f7b2177'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.741 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6865841-1b5d-42ef-83e2-12589a12557a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.741833', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfcd560-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'ad92057203af800587dbe9ff4842236dabd12bec5bf6c5dcdd88d7f808164208'}]}, 'timestamp': '2025-11-28 10:12:00.742374', '_unique_id': '34a41996dd2c4d84b3a474d3904cb626'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.744 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.745 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19fad7fe-1959-40c0-9c23-30175d69097e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.745179', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfd5620-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'e053d5f75d7c52e9ae395fd65f78a75ed8c77f852c1ce63228036f6d64ce2668'}]}, 'timestamp': '2025-11-28 10:12:00.745751', '_unique_id': '6aed1976ed504acb9871fb8fbb0f8360'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e52de5b8-65b4-4355-b3c2-d938222eb514', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.748256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adff86ca-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '6819390b05057e84ec68e6a99a7a6aec8cc63bba761177fdefe842357788a680'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.748256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adff9a3e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '0c2f590b69b7dd9aef77d7c3882a6fb95d218118ee1b9a45258073095ce20ec4'}]}, 'timestamp': '2025-11-28 10:12:00.760560', '_unique_id': 'f81dedf2d806441892e04ab8b367b1bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9ca7edd-dc05-448a-9e6c-9b8c1dcc1ba2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.763119', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae0012d4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': '5a8e5d77748387904ab037fca8c15e251077cc65982aac4de44123f35049a602'}]}, 'timestamp': '2025-11-28 10:12:00.763601', '_unique_id': '0fe693c0af6742b6aab4983be1471de3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7f075a7-8e6d-4790-bebd-1d6ef905c41a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.766513', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae009704-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '4a9372c0e7ca3a6fa293e144c2c0a60ddbfe7447fab835b4acc463d20ea9af5f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.766513', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae00ad70-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '833103ade7dcb0f7b56882d5114e30750029b0381979faff5532becd131517fd'}]}, 'timestamp': '2025-11-28 10:12:00.767528', '_unique_id': '6113028193ae42ee8bfb76a1086e48d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44df7095-d35f-4425-ba63-73b5651f6a5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.769983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae01211a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '175e85b6958518ceda7ebd1c3d0aad207adb8275b892ef9d3f67564aef8615c8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.769983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae0131c8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '7ca87da8c69be1ed33ae93ebeaf5b8176a04de428e6d69d70406ead10344a1de'}]}, 'timestamp': '2025-11-28 10:12:00.770913', '_unique_id': '63f9b24eb39d42efbe7f74f7eddd6d80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d4391ef-266a-4a2a-bfaf-840d0a7c48e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.773216', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae019cee-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'd7425e312d571964a8459f68514aa947c3997d96ebb1b34c2185cf7b2bc30778'}]}, 'timestamp': '2025-11-28 10:12:00.773687', '_unique_id': 'e0e483c4774848aa85d644dd36d45381'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01867545-9820-49fd-a7d9-2732570c5ba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.775850', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae0204c2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'ba4fece42357cd5c1ad7a1582e500764ca13d814b4d29debb2dc40ef0645b9c9'}]}, 'timestamp': '2025-11-28 10:12:00.776347', '_unique_id': '9e5579a56a7b49059323d058ba85520d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97a21ad8-b72b-4fa4-b10d-2da2e12df742', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.778532', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae026c96-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'b80d03137edfb7685398709824c5742fafb9d6bdad06ae01ea1ea9480bf8162f'}]}, 'timestamp': '2025-11-28 10:12:00.779004', '_unique_id': 'a513b694b35b4d83bb8a36732df1c733'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1d1024b-f0b2-4a8c-a291-f091c2050b32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.781169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae02d014-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '870355561135e7353bc43eb0388d26374cc75fa26cccbe4cee8fa991b76d6741'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.781169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae02da5a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'c18c27627142a5bfbea8ce7d3cb54be6fed76406afdc30bb7cb89df4d2a81c0f'}]}, 'timestamp': '2025-11-28 10:12:00.781708', '_unique_id': '295e12d31b9c42ac942265c94c02fef8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eed5f2b9-7982-4b53-9a8e-4fccad2fbd9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:12:00.783190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ae031eca-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.900646917, 'message_signature': 'f7f28cff1e07dbaebc6059d369911bd781e8ed4f6fd77f100550d5dc17d2ed2c'}]}, 'timestamp': '2025-11-28 10:12:00.783467', '_unique_id': 'edd3c134ad7748d0b73f51351421b028'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bc3a9bf-4d20-41bc-b6de-669f73717003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.784785', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae035d5e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': '4bf457d53e828a66f29f0b8166efe5ac5d5811685d282befdfc88d074774c7af'}]}, 'timestamp': '2025-11-28 10:12:00.785105', '_unique_id': '3f5cd9ecba4b4f15b46322ed10507431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cd9783d-bc38-4036-b293-844f48de0f5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.786476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae039f44-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': 'de96cfc28c470a39d418e8c093bb1ca438b0e72c38cf8b63e0b357f9649f5937'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.786476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae03a93a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '67b7c4ee0230fbee1444e5a98fb835df35b8964edad5f0c3a2f291c939284c32'}]}, 'timestamp': '2025-11-28 10:12:00.787003', '_unique_id': 'c2fc3bda007f41f493d12dde39edcdfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da204656-c739-4793-8d2b-4f1488f509c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.788404', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae03eabc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': '1882b6e646512a401a622c0915963f7d4fb80afe9cc74465028c0b8d0d2f1fa2'}]}, 'timestamp': '2025-11-28 10:12:00.788718', '_unique_id': '296395c5f14c43d4bb5ecc5d17666c86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37f4adf5-f5c7-4cfa-b29a-c657102ae708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.790218', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae043198-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'a7a4816e418c86249a6a877e29334beda793c046b0a7dac405a77202ae974bae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.790218', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae043ba2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'cb96d6492e1a8bb875417a89ec150933b1cda85252e735dd4319b8357ef782c4'}]}, 'timestamp': '2025-11-28 10:12:00.790751', '_unique_id': '25ca6f66f0a349de8e8eefb87d49bd4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d952e6-6bd2-4071-b113-f08d4688765b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.792214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae047f9a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'eca015218ef70e69391b5ec50d36875fc7f3dbbd8091655cba3aa1c821c621fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.792214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae048c6a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'd8bb8ea6f0f772059aa1280094d6d606c6b585715b20f25b8caf840463c58f97'}]}, 'timestamp': '2025-11-28 10:12:00.792824', '_unique_id': '0e15d9c9ec8b4c3d803f06ae863d8177'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a6c3683-c6a0-4ec0-bc67-305b185c68a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.794269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae04cfae-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '0c04899b68a716087af260afd8f88d6e6900957b64b637afe8cf02bac02db875'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.794269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae04da26-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '019c15abb402b855fdd9fa74f28eb7121edcc069c7741a4cb888be0e79c56bc4'}]}, 'timestamp': '2025-11-28 10:12:00.794809', '_unique_id': '85fe3616559b40c1a3c21cd2297dacbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:12:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.796 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538513.localdomain dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 1 addresses
Nov 28 10:12:00 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host
Nov 28 10:12:00 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts
Nov 28 10:12:00 np0005538513.localdomain podman[329998]: 2025-11-28 10:12:00.969157934 +0000 UTC m=+0.063303291 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e277 do_prune osdmap full prune enabled
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e278 e278: 6 total, 6 up, 6 in
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e278: 6 total, 6 up, 6 in
Nov 28 10:12:01 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:01.245 261084 INFO neutron.agent.dhcp.agent [None req-9c7016f0-fd36-48f0-bfe7-253cb23f517c - - - - - -] DHCP configuration for ports {'bcda3fd1-8481-4078-8919-c9d7f3e1c8a6'} is completed
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: pgmap v605: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 179 KiB/s wr, 14 op/s
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f", "format": "json"}]: dispatch
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: osdmap e278: 6 total, 6 up, 6 in
Nov 28 10:12:01 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:12:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:01 np0005538513.localdomain systemd[1]: tmp-crun.UOkQTy.mount: Deactivated successfully.
Nov 28 10:12:01 np0005538513.localdomain podman[330019]: 2025-11-28 10:12:01.822322953 +0000 UTC m=+0.058069160 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:12:01 np0005538513.localdomain podman[330019]: 2025-11-28 10:12:01.856482375 +0000 UTC m=+0.092228572 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:12:01 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:12:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:02.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "format": "json"}]: dispatch
Nov 28 10:12:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:12:02 np0005538513.localdomain podman[330042]: 2025-11-28 10:12:02.852194395 +0000 UTC m=+0.088601540 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:12:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:02 np0005538513.localdomain podman[330042]: 2025-11-28 10:12:02.886590496 +0000 UTC m=+0.122997631 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:12:02 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:02 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:12:03 np0005538513.localdomain ceph-mon[292954]: pgmap v607: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 13 op/s
Nov 28 10:12:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:03 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:03.655 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:04 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e278 do_prune osdmap full prune enabled
Nov 28 10:12:05 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750", "format": "json"}]: dispatch
Nov 28 10:12:05 np0005538513.localdomain ceph-mon[292954]: pgmap v608: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 13 op/s
Nov 28 10:12:05 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e279 e279: 6 total, 6 up, 6 in
Nov 28 10:12:05 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e279: 6 total, 6 up, 6 in
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e279 do_prune osdmap full prune enabled
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f_69807d00-efbf-4837-b2f8-58c7e5436c8f", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: osdmap e279: 6 total, 6 up, 6 in
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e280 e280: 6 total, 6 up, 6 in
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e280: 6 total, 6 up, 6 in
Nov 28 10:12:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:07.095 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:12:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:07.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:12:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:07.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:12:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:07.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:12:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:07.140 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:07.141 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:12:07 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e52: np0005538515.yfkzhl(active, since 14m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:12:07 np0005538513.localdomain ceph-mon[292954]: pgmap v610: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 89 KiB/s wr, 7 op/s
Nov 28 10:12:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:07 np0005538513.localdomain ceph-mon[292954]: osdmap e280: 6 total, 6 up, 6 in
Nov 28 10:12:07 np0005538513.localdomain ceph-mon[292954]: mgrmap e52: np0005538515.yfkzhl(active, since 14m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:12:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:12:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:12:07 np0005538513.localdomain podman[330061]: 2025-11-28 10:12:07.844540134 +0000 UTC m=+0.080234994 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 10:12:07 np0005538513.localdomain podman[330061]: 2025-11-28 10:12:07.8603255 +0000 UTC m=+0.096020390 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 10:12:07 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:12:07 np0005538513.localdomain podman[330062]: 2025-11-28 10:12:07.951310774 +0000 UTC m=+0.181191424 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:12:08 np0005538513.localdomain podman[330062]: 2025-11-28 10:12:08.018620518 +0000 UTC m=+0.248501158 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:12:08 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:12:08 np0005538513.localdomain ceph-mon[292954]: pgmap v612: 177 pgs: 177 active+clean; 235 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.2 MiB/s wr, 57 op/s
Nov 28 10:12:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750_fac779c6-92c9-42a2-b987-c4a675360397", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa", "format": "json"}]: dispatch
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:09 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:12:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:12:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:12:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157511 "" "Go-http-client/1.1"
Nov 28 10:12:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:12:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19755 "" "Go-http-client/1.1"
Nov 28 10:12:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e280 do_prune osdmap full prune enabled
Nov 28 10:12:10 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e281 e281: 6 total, 6 up, 6 in
Nov 28 10:12:10 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e281: 6 total, 6 up, 6 in
Nov 28 10:12:10 np0005538513.localdomain ceph-mon[292954]: pgmap v613: 177 pgs: 177 active+clean; 235 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 2.7 MiB/s wr, 41 op/s
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e281 do_prune osdmap full prune enabled
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e282 e282: 6 total, 6 up, 6 in
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e282: 6 total, 6 up, 6 in
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: osdmap e281: 6 total, 6 up, 6 in
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7c140f95-3467-4116-9762-d08cfc663c93", "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538513.localdomain ceph-mon[292954]: osdmap e282: 6 total, 6 up, 6 in
Nov 28 10:12:12 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:12Z|00478|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:12:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:12.127 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:12.140 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:12.143 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa_ead2690c-f6f8-4b05-b16e-a6295cb33d61", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:12 np0005538513.localdomain ceph-mon[292954]: pgmap v616: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 3.7 MiB/s wr, 92 op/s
Nov 28 10:12:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:12:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:12 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1973228268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:12:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1973228268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:12:13 np0005538513.localdomain dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 0 addresses
Nov 28 10:12:13 np0005538513.localdomain podman[330123]: 2025-11-28 10:12:13.906238678 +0000 UTC m=+0.065817618 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:12:13 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host
Nov 28 10:12:13 np0005538513.localdomain dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts
Nov 28 10:12:14 np0005538513.localdomain kernel: device tapf3135b0f-56 left promiscuous mode
Nov 28 10:12:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:14Z|00479|binding|INFO|Releasing lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d from this chassis (sb_readonly=0)
Nov 28 10:12:14 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:14Z|00480|binding|INFO|Setting lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d down in Southbound
Nov 28 10:12:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:14.241 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:14.260 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:14.268 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '729a918c0a8248ff9fef91d8e41e340a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10c622-42c8-4119-8cc0-4b51720ab1bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=f3135b0f-56fc-476c-b4e5-c9e5a120aa9d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:12:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:14.269 158130 INFO neutron.agent.ovn.metadata.agent [-] Port f3135b0f-56fc-476c-b4e5-c9e5a120aa9d in datapath 1bac4260-cc00-4d23-940f-68536ef7d308 unbound from our chassis
Nov 28 10:12:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:14.271 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1bac4260-cc00-4d23-940f-68536ef7d308, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:12:14 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:14.271 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6cfbc432-3b9c-4d99-818a-652a8ce2cb69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:12:14 np0005538513.localdomain ceph-mon[292954]: pgmap v617: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 226 KiB/s wr, 42 op/s
Nov 28 10:12:15 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:15Z|00481|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:12:15 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:15.500 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e282 do_prune osdmap full prune enabled
Nov 28 10:12:15 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1", "format": "json"}]: dispatch
Nov 28 10:12:15 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e283 e283: 6 total, 6 up, 6 in
Nov 28 10:12:15 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e283: 6 total, 6 up, 6 in
Nov 28 10:12:15 np0005538513.localdomain systemd[1]: tmp-crun.XCF4Gz.mount: Deactivated successfully.
Nov 28 10:12:15 np0005538513.localdomain dnsmasq[329943]: exiting on receipt of SIGTERM
Nov 28 10:12:15 np0005538513.localdomain podman[330161]: 2025-11-28 10:12:15.932183514 +0000 UTC m=+0.084276089 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:12:15 np0005538513.localdomain systemd[1]: libpod-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998.scope: Deactivated successfully.
Nov 28 10:12:16 np0005538513.localdomain podman[330175]: 2025-11-28 10:12:16.01452102 +0000 UTC m=+0.068097049 container died c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:12:16 np0005538513.localdomain podman[330175]: 2025-11-28 10:12:16.057272677 +0000 UTC m=+0.110848676 container cleanup c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:12:16 np0005538513.localdomain systemd[1]: libpod-conmon-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998.scope: Deactivated successfully.
Nov 28 10:12:16 np0005538513.localdomain podman[330177]: 2025-11-28 10:12:16.10601147 +0000 UTC m=+0.150204540 container remove c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:12:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:16.136 261084 INFO neutron.agent.dhcp.agent [None req-f2b95ac2-9a28-41e2-ba92-89f709e8016d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:12:16 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:16.139 261084 INFO neutron.agent.dhcp.agent [None req-f2b95ac2-9a28-41e2-ba92-89f709e8016d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e283 do_prune osdmap full prune enabled
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e284 e284: 6 total, 6 up, 6 in
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e284: 6 total, 6 up, 6 in
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: osdmap e283: 6 total, 6 up, 6 in
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: pgmap v619: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 141 KiB/s wr, 36 op/s
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:16 np0005538513.localdomain ceph-mon[292954]: osdmap e284: 6 total, 6 up, 6 in
Nov 28 10:12:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-53803e0e67ab6fc96360e1bca7f403c3373d728f0094543b1385d739dc6ce18a-merged.mount: Deactivated successfully.
Nov 28 10:12:16 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998-userdata-shm.mount: Deactivated successfully.
Nov 28 10:12:16 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d1bac4260\x2dcc00\x2d4d23\x2d940f\x2d68536ef7d308.mount: Deactivated successfully.
Nov 28 10:12:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:12:16 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:12:17 np0005538513.localdomain podman[330203]: 2025-11-28 10:12:17.050693188 +0000 UTC m=+0.102781238 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:12:17 np0005538513.localdomain podman[330203]: 2025-11-28 10:12:17.063392139 +0000 UTC m=+0.115480239 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:12:17 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:12:17 np0005538513.localdomain podman[330204]: 2025-11-28 10:12:17.17767497 +0000 UTC m=+0.222444044 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:12:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:17.179 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:17 np0005538513.localdomain podman[330204]: 2025-11-28 10:12:17.192325422 +0000 UTC m=+0.237094446 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:12:17 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:12:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:12:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:12:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:12:19 np0005538513.localdomain ceph-mon[292954]: pgmap v621: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 831 B/s rd, 130 KiB/s wr, 11 op/s
Nov 28 10:12:19 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:12:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:19 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1_3ca8ffc5-e61a-4a5f-afb0-e3f73884eb70", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:20 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:21 np0005538513.localdomain ceph-mon[292954]: pgmap v622: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 100 KiB/s wr, 8 op/s
Nov 28 10:12:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e284 do_prune osdmap full prune enabled
Nov 28 10:12:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e285 e285: 6 total, 6 up, 6 in
Nov 28 10:12:21 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e285: 6 total, 6 up, 6 in
Nov 28 10:12:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:22.183 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:22.186 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:22 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:22.266 261084 INFO neutron.agent.linux.ip_lib [None req-8117c4f7-4e76-4baf-9275-077c0fb7af16 - - - - - -] Device tapdf88dd7c-43 cannot be used as it has no MAC address
Nov 28 10:12:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:22.294 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:22 np0005538513.localdomain kernel: device tapdf88dd7c-43 entered promiscuous mode
Nov 28 10:12:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:22Z|00482|binding|INFO|Claiming lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 for this chassis.
Nov 28 10:12:22 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324742.3045] manager: (tapdf88dd7c-43): new Generic device (/org/freedesktop/NetworkManager/Devices/76)
Nov 28 10:12:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:22Z|00483|binding|INFO|df88dd7c-4397-4078-b7e1-7fbf48d503b7: Claiming unknown
Nov 28 10:12:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:22.302 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:22 np0005538513.localdomain systemd-udevd[330254]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:12:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:22.319 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a5eaeee8da34a54a7b5240b18a0f9b2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb560543-1290-4f49-91f1-518ecd990906, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=df88dd7c-4397-4078-b7e1-7fbf48d503b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:12:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:22.321 158130 INFO neutron.agent.ovn.metadata.agent [-] Port df88dd7c-4397-4078-b7e1-7fbf48d503b7 in datapath 36d3914b-0866-44d3-8d61-e3876a797a40 bound to our chassis
Nov 28 10:12:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:22.323 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port bd3edd05-3839-4526-8e12-85548b39b0b0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:12:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:22.323 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36d3914b-0866-44d3-8d61-e3876a797a40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:12:22 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:22.324 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4a7576-9528-4c42-aa0a-893bb05c7376]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:22Z|00484|binding|INFO|Setting lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 ovn-installed in OVS
Nov 28 10:12:22 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:22Z|00485|binding|INFO|Setting lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 up in Southbound
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:22.341 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device
Nov 28 10:12:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:22.382 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:22.413 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: osdmap e285: 6 total, 6 up, 6 in
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: pgmap v624: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 224 KiB/s wr, 18 op/s
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e285 do_prune osdmap full prune enabled
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e286 e286: 6 total, 6 up, 6 in
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e286: 6 total, 6 up, 6 in
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:22 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:23 np0005538513.localdomain podman[330324]: 
Nov 28 10:12:23 np0005538513.localdomain podman[330324]: 2025-11-28 10:12:23.32576063 +0000 UTC m=+0.090351515 container create b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:12:23 np0005538513.localdomain podman[330324]: 2025-11-28 10:12:23.281567178 +0000 UTC m=+0.046158083 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:12:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:23.385 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:23 np0005538513.localdomain systemd[1]: Started libpod-conmon-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f.scope.
Nov 28 10:12:23 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:12:23 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/398491bd3154902fd56e5c19ffa0b65563dff195d8ba193f4ec0c429240248ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:12:23 np0005538513.localdomain podman[330324]: 2025-11-28 10:12:23.432698305 +0000 UTC m=+0.197289190 container init b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:12:23 np0005538513.localdomain podman[330324]: 2025-11-28 10:12:23.447981895 +0000 UTC m=+0.212572790 container start b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:12:23 np0005538513.localdomain dnsmasq[330342]: started, version 2.85 cachesize 150
Nov 28 10:12:23 np0005538513.localdomain dnsmasq[330342]: DNS service limited to local subnets
Nov 28 10:12:23 np0005538513.localdomain dnsmasq[330342]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:12:23 np0005538513.localdomain dnsmasq[330342]: warning: no upstream servers configured
Nov 28 10:12:23 np0005538513.localdomain dnsmasq-dhcp[330342]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:12:23 np0005538513.localdomain dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 0 addresses
Nov 28 10:12:23 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host
Nov 28 10:12:23 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts
Nov 28 10:12:23 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:23.588 261084 INFO neutron.agent.dhcp.agent [None req-d484a65a-4241-4462-85e1-84ab3923c5e7 - - - - - -] DHCP configuration for ports {'ee15ef05-e21c-4932-8e4b-a1142faa7a40'} is completed
Nov 28 10:12:23 np0005538513.localdomain ceph-mon[292954]: osdmap e286: 6 total, 6 up, 6 in
Nov 28 10:12:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3", "format": "json"}]: dispatch
Nov 28 10:12:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:24.239 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:23Z, description=, device_id=a2e46648-c204-4aa4-852a-22559e830378, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65c5dc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65c5b20>], id=92b58e02-a654-4139-b6b8-8d7bd52e9ded, ip_allocation=immediate, mac_address=fa:16:3e:04:23:47, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:19Z, description=, dns_domain=, id=36d3914b-0866-44d3-8d61-e3876a797a40, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-289637141-network, port_security_enabled=True, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45868, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3753, status=ACTIVE, subnets=['99c9b187-4788-4ff8-bc75-80da88278afe'], tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:20Z, vlan_transparent=None, network_id=36d3914b-0866-44d3-8d61-e3876a797a40, port_security_enabled=False, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3766, status=DOWN, tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:23Z on network 36d3914b-0866-44d3-8d61-e3876a797a40
Nov 28 10:12:24 np0005538513.localdomain systemd[1]: tmp-crun.CByurc.mount: Deactivated successfully.
Nov 28 10:12:24 np0005538513.localdomain dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 1 addresses
Nov 28 10:12:24 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host
Nov 28 10:12:24 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts
Nov 28 10:12:24 np0005538513.localdomain podman[330360]: 2025-11-28 10:12:24.406096906 +0000 UTC m=+0.039420565 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:12:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:24.762 261084 INFO neutron.agent.dhcp.agent [None req-be4a8c81-5fee-47e9-a6d9-ea591aeb002c - - - - - -] DHCP configuration for ports {'92b58e02-a654-4139-b6b8-8d7bd52e9ded'} is completed
Nov 28 10:12:24 np0005538513.localdomain ceph-mon[292954]: pgmap v626: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 286 B/s rd, 77 KiB/s wr, 5 op/s
Nov 28 10:12:25 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:25.883 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:23Z, description=, device_id=a2e46648-c204-4aa4-852a-22559e830378, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6426250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd64268b0>], id=92b58e02-a654-4139-b6b8-8d7bd52e9ded, ip_allocation=immediate, mac_address=fa:16:3e:04:23:47, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:19Z, description=, dns_domain=, id=36d3914b-0866-44d3-8d61-e3876a797a40, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-289637141-network, port_security_enabled=True, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45868, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3753, status=ACTIVE, subnets=['99c9b187-4788-4ff8-bc75-80da88278afe'], tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:20Z, vlan_transparent=None, network_id=36d3914b-0866-44d3-8d61-e3876a797a40, port_security_enabled=False, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3766, status=DOWN, tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:23Z on network 36d3914b-0866-44d3-8d61-e3876a797a40
Nov 28 10:12:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:12:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:26 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:12:26 np0005538513.localdomain dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 1 addresses
Nov 28 10:12:26 np0005538513.localdomain podman[330398]: 2025-11-28 10:12:26.130443507 +0000 UTC m=+0.063252920 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:12:26 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host
Nov 28 10:12:26 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts
Nov 28 10:12:26 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:26.461 261084 INFO neutron.agent.dhcp.agent [None req-84a10848-0264-4341-ba7d-7b4ab3b4b005 - - - - - -] DHCP configuration for ports {'92b58e02-a654-4139-b6b8-8d7bd52e9ded'} is completed
Nov 28 10:12:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:27 np0005538513.localdomain ceph-mon[292954]: pgmap v627: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 69 KiB/s wr, 5 op/s
Nov 28 10:12:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:12:27 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:27.188 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: pgmap v628: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 151 KiB/s wr, 10 op/s
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3_e616749c-df0b-458b-9f06-7163c80539a0", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:29 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:12:29 np0005538513.localdomain podman[330420]: 2025-11-28 10:12:29.847434518 +0000 UTC m=+0.082615868 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 10:12:29 np0005538513.localdomain podman[330420]: 2025-11-28 10:12:29.866532866 +0000 UTC m=+0.101714246 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Nov 28 10:12:29 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:12:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:30 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e286 do_prune osdmap full prune enabled
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e287 e287: 6 total, 6 up, 6 in
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e287: 6 total, 6 up, 6 in
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: pgmap v629: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 80 KiB/s wr, 5 op/s
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: osdmap e287: 6 total, 6 up, 6 in
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e287 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e287 do_prune osdmap full prune enabled
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e288 e288: 6 total, 6 up, 6 in
Nov 28 10:12:31 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e288: 6 total, 6 up, 6 in
Nov 28 10:12:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:32.191 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:12:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:32.192 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:32.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:12:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:32.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:12:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:32.194 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:12:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:32.196 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "format": "json"}]: dispatch
Nov 28 10:12:32 np0005538513.localdomain ceph-mon[292954]: osdmap e288: 6 total, 6 up, 6 in
Nov 28 10:12:32 np0005538513.localdomain sudo[330442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:12:32 np0005538513.localdomain sudo[330442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:12:32 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:12:32 np0005538513.localdomain sudo[330442]: pam_unix(sudo:session): session closed for user root
Nov 28 10:12:32 np0005538513.localdomain sudo[330461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:12:32 np0005538513.localdomain sudo[330461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:12:32 np0005538513.localdomain podman[330460]: 2025-11-28 10:12:32.569049499 +0000 UTC m=+0.096519125 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:12:32 np0005538513.localdomain podman[330460]: 2025-11-28 10:12:32.602665225 +0000 UTC m=+0.130134871 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:12:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:12:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:32 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:12:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:12:33 np0005538513.localdomain sudo[330461]: pam_unix(sudo:session): session closed for user root
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: pgmap v632: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 178 KiB/s wr, 12 op/s
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:12:33 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:12:33 np0005538513.localdomain sudo[330533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:12:33 np0005538513.localdomain sudo[330533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:12:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:12:33 np0005538513.localdomain sudo[330533]: pam_unix(sudo:session): session closed for user root
Nov 28 10:12:33 np0005538513.localdomain podman[330551]: 2025-11-28 10:12:33.545076973 +0000 UTC m=+0.070441561 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:12:33 np0005538513.localdomain podman[330551]: 2025-11-28 10:12:33.555460083 +0000 UTC m=+0.080824701 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:12:33 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:12:34 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:34 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9_0d92dab8-838f-4c0e-9a76-ed8d13f07bb1", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:34 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e288 do_prune osdmap full prune enabled
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: pgmap v633: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 178 KiB/s wr, 12 op/s
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "535c32b0-23e4-4a2c-bbae-552340590760", "format": "json"}]: dispatch
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 e289: 6 total, 6 up, 6 in
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e289: 6 total, 6 up, 6 in
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:35 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:35.952 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:12:35 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:35.953 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:12:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:35.980 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:12:36 np0005538513.localdomain dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 0 addresses
Nov 28 10:12:36 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host
Nov 28 10:12:36 np0005538513.localdomain podman[330584]: 2025-11-28 10:12:36.174183321 +0000 UTC m=+0.063020542 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:12:36 np0005538513.localdomain dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts
Nov 28 10:12:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:36.373 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:36Z|00486|binding|INFO|Releasing lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 from this chassis (sb_readonly=0)
Nov 28 10:12:36 np0005538513.localdomain kernel: device tapdf88dd7c-43 left promiscuous mode
Nov 28 10:12:36 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:36Z|00487|binding|INFO|Setting lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 down in Southbound
Nov 28 10:12:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:36.385 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a5eaeee8da34a54a7b5240b18a0f9b2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb560543-1290-4f49-91f1-518ecd990906, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=df88dd7c-4397-4078-b7e1-7fbf48d503b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:12:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:36.387 158130 INFO neutron.agent.ovn.metadata.agent [-] Port df88dd7c-4397-4078-b7e1-7fbf48d503b7 in datapath 36d3914b-0866-44d3-8d61-e3876a797a40 unbound from our chassis
Nov 28 10:12:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:36.389 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36d3914b-0866-44d3-8d61-e3876a797a40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:12:36 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:36.390 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[850aebea-eb9f-4304-b8d3-5791c71f9687]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:12:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:36.393 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: osdmap e289: 6 total, 6 up, 6 in
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:12:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:37.217 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538513.localdomain ceph-mon[292954]: pgmap v635: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 127 KiB/s wr, 9 op/s
Nov 28 10:12:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:37Z|00488|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:12:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:37.834 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:38 np0005538513.localdomain dnsmasq[330342]: exiting on receipt of SIGTERM
Nov 28 10:12:38 np0005538513.localdomain podman[330623]: 2025-11-28 10:12:38.625488562 +0000 UTC m=+0.059685600 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:12:38 np0005538513.localdomain systemd[1]: libpod-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f.scope: Deactivated successfully.
Nov 28 10:12:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:12:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:12:38 np0005538513.localdomain podman[330644]: 2025-11-28 10:12:38.737961508 +0000 UTC m=+0.084334710 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 10:12:38 np0005538513.localdomain podman[330637]: 2025-11-28 10:12:38.757716226 +0000 UTC m=+0.114550999 container died b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:12:38 np0005538513.localdomain podman[330637]: 2025-11-28 10:12:38.800706031 +0000 UTC m=+0.157540794 container cleanup b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:12:38 np0005538513.localdomain podman[330644]: 2025-11-28 10:12:38.803186948 +0000 UTC m=+0.149560170 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS)
Nov 28 10:12:38 np0005538513.localdomain systemd[1]: libpod-conmon-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f.scope: Deactivated successfully.
Nov 28 10:12:38 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:12:38 np0005538513.localdomain podman[330639]: 2025-11-28 10:12:38.893012326 +0000 UTC m=+0.242230606 container remove b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:12:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:38.928 261084 INFO neutron.agent.dhcp.agent [None req-53dd6a79-7cbd-4646-acaf-a1e9c3ae5257 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:12:38 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:38.928 261084 INFO neutron.agent.dhcp.agent [None req-53dd6a79-7cbd-4646-acaf-a1e9c3ae5257 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:12:38 np0005538513.localdomain podman[330646]: 2025-11-28 10:12:38.949364502 +0000 UTC m=+0.290801922 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:12:39 np0005538513.localdomain podman[330646]: 2025-11-28 10:12:39.05964093 +0000 UTC m=+0.401078290 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:12:39 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 5604 writes, 37K keys, 5604 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s
                                                           Cumulative WAL: 5604 writes, 5604 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2520 writes, 11K keys, 2520 commit groups, 1.0 writes per commit group, ingest: 12.58 MB, 0.02 MB/s
                                                           Interval WAL: 2520 writes, 2520 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    132.5      0.34              0.13        19    0.018       0      0       0.0       0.0
                                                             L6      1/0   17.85 MB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   6.7    172.5    158.3      1.91              0.80        18    0.106    224K   9420       0.0       0.0
                                                            Sum      1/0   17.85 MB   0.0      0.3     0.0      0.3       0.3      0.1       0.0   7.7    146.5    154.4      2.26              0.93        37    0.061    224K   9420       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0  12.0    159.6    162.6      0.82              0.37        14    0.058     96K   3788       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   0.0    172.5    158.3      1.91              0.80        18    0.106    224K   9420       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    134.2      0.34              0.13        18    0.019       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.044, interval 0.011
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.34 GB write, 0.29 MB/s write, 0.32 GB read, 0.28 MB/s read, 2.3 seconds
                                                           Interval compaction: 0.13 GB write, 0.22 MB/s write, 0.13 GB read, 0.22 MB/s read, 0.8 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55b5bb679350#2 capacity: 304.00 MB usage: 55.14 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.00081 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(3716,53.69 MB,17.6605%) FilterBlock(37,645.73 KB,0.207434%) IndexBlock(37,843.55 KB,0.270979%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "format": "json"}]: dispatch
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: pgmap v636: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 280 KiB/s wr, 18 op/s
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-398491bd3154902fd56e5c19ffa0b65563dff195d8ba193f4ec0c429240248ab-merged.mount: Deactivated successfully.
Nov 28 10:12:39 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f-userdata-shm.mount: Deactivated successfully.
Nov 28 10:12:39 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d36d3914b\x2d0866\x2d44d3\x2d8d61\x2de3876a797a40.mount: Deactivated successfully.
Nov 28 10:12:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:12:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:12:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:12:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:12:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:12:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19277 "" "Go-http-client/1.1"
Nov 28 10:12:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:40 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:40 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:40.954 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:12:41 np0005538513.localdomain ceph-mon[292954]: pgmap v637: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 502 B/s rd, 142 KiB/s wr, 8 op/s
Nov 28 10:12:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 do_prune osdmap full prune enabled
Nov 28 10:12:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 e290: 6 total, 6 up, 6 in
Nov 28 10:12:41 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e290: 6 total, 6 up, 6 in
Nov 28 10:12:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:42.220 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5", "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: osdmap e290: 6 total, 6 up, 6 in
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:43 np0005538513.localdomain ceph-mon[292954]: pgmap v639: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 228 KiB/s wr, 13 op/s
Nov 28 10:12:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:45 np0005538513.localdomain ceph-mon[292954]: pgmap v640: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 963 B/s rd, 215 KiB/s wr, 12 op/s
Nov 28 10:12:45 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:45 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 28 10:12:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:12:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:46 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:47.223 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:47.253 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:47 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:47 np0005538513.localdomain ceph-mon[292954]: pgmap v641: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 183 KiB/s wr, 10 op/s
Nov 28 10:12:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:12:47 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:12:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:47.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:47 np0005538513.localdomain systemd[1]: tmp-crun.PrVOQp.mount: Deactivated successfully.
Nov 28 10:12:47 np0005538513.localdomain podman[330708]: 2025-11-28 10:12:47.870244087 +0000 UTC m=+0.098133944 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:12:47 np0005538513.localdomain podman[330708]: 2025-11-28 10:12:47.915774191 +0000 UTC m=+0.143664018 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:12:47 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:12:47 np0005538513.localdomain podman[330707]: 2025-11-28 10:12:47.919195256 +0000 UTC m=+0.150401175 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:12:47 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:47.977 261084 INFO neutron.agent.linux.ip_lib [None req-b40bfd1e-1a73-42c6-a455-63ef18baedf2 - - - - - -] Device tap1738dc88-e5 cannot be used as it has no MAC address
Nov 28 10:12:48 np0005538513.localdomain podman[330707]: 2025-11-28 10:12:48.003626378 +0000 UTC m=+0.234832297 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.050 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:48 np0005538513.localdomain kernel: device tap1738dc88-e5 entered promiscuous mode
Nov 28 10:12:48 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.058 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:48Z|00489|binding|INFO|Claiming lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e for this chassis.
Nov 28 10:12:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:48Z|00490|binding|INFO|1738dc88-e5f5-4680-ab9e-8f550f7bd83e: Claiming unknown
Nov 28 10:12:48 np0005538513.localdomain NetworkManager[5967]: <info>  [1764324768.0643] manager: (tap1738dc88-e5): new Generic device (/org/freedesktop/NetworkManager/Devices/77)
Nov 28 10:12:48 np0005538513.localdomain systemd-udevd[330759]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:12:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:12:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:12:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:12:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:12:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:12:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:48Z|00491|binding|INFO|Setting lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e ovn-installed in OVS
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.102 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain virtnodedevd[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.137 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:48 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:12:48Z|00492|binding|INFO|Setting lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e up in Southbound
Nov 28 10:12:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:48.141 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ce454c94ec74465ac8200d5fe0b153e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1903e82-d4b6-46fe-9759-cfad1c16d3d4, chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=1738dc88-e5f5-4680-ab9e-8f550f7bd83e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:12:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:48.143 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1738dc88-e5f5-4680-ab9e-8f550f7bd83e in datapath 41370116-60b0-4433-ab19-12e9b7026582 bound to our chassis
Nov 28 10:12:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:48.144 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 41370116-60b0-4433-ab19-12e9b7026582 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:12:48 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:48.145 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[931c7037-1d94-437e-9ad6-1dd4ed871bfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.169 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:48 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "32cdd268-82b3-4d40-9663-e0bea21b28c3", "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:48.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:12:48 np0005538513.localdomain systemd[1]: tmp-crun.ayvavn.mount: Deactivated successfully.
Nov 28 10:12:49 np0005538513.localdomain podman[330830]: 
Nov 28 10:12:49 np0005538513.localdomain podman[330830]: 2025-11-28 10:12:49.404583814 +0000 UTC m=+0.089034824 container create fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:12:49 np0005538513.localdomain podman[330830]: 2025-11-28 10:12:49.360556337 +0000 UTC m=+0.045007387 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:12:49 np0005538513.localdomain systemd[1]: Started libpod-conmon-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79.scope.
Nov 28 10:12:49 np0005538513.localdomain systemd[1]: tmp-crun.d64MIh.mount: Deactivated successfully.
Nov 28 10:12:49 np0005538513.localdomain systemd[1]: Started libcrun container.
Nov 28 10:12:49 np0005538513.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f48131d4cd50d23f0a08edebbccc523a267ca2a1360891739e8c240e1e9f0859/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:12:49 np0005538513.localdomain podman[330830]: 2025-11-28 10:12:49.533224898 +0000 UTC m=+0.217675918 container init fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:12:49 np0005538513.localdomain podman[330830]: 2025-11-28 10:12:49.542216295 +0000 UTC m=+0.226667305 container start fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:12:49 np0005538513.localdomain dnsmasq[330849]: started, version 2.85 cachesize 150
Nov 28 10:12:49 np0005538513.localdomain dnsmasq[330849]: DNS service limited to local subnets
Nov 28 10:12:49 np0005538513.localdomain dnsmasq[330849]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:12:49 np0005538513.localdomain dnsmasq[330849]: warning: no upstream servers configured
Nov 28 10:12:49 np0005538513.localdomain dnsmasq-dhcp[330849]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:12:49 np0005538513.localdomain dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 0 addresses
Nov 28 10:12:49 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host
Nov 28 10:12:49 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts
Nov 28 10:12:49 np0005538513.localdomain ceph-mon[292954]: pgmap v642: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 140 KiB/s wr, 8 op/s
Nov 28 10:12:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:49 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:49.738 261084 INFO neutron.agent.dhcp.agent [None req-c8fb3da4-a500-46e8-9b84-1d4e2d342699 - - - - - -] DHCP configuration for ports {'30dc80b5-018b-40de-ae8c-72305a0bb063'} is completed
Nov 28 10:12:50 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:50 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:50.849 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:12:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:50.850 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:12:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:12:50.851 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:12:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:51.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:51 np0005538513.localdomain ceph-mon[292954]: pgmap v643: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 140 KiB/s wr, 8 op/s
Nov 28 10:12:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "32cdd268-82b3-4d40-9663-e0bea21b28c3", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:51.769 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:51.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.228 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:52 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "format": "json"}]: dispatch
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.814 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.815 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.815 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:12:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:52.816 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2706457592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.268 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:12:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:53.294 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:52Z, description=, device_id=c6a3c308-ab33-40a1-9933-91a047698d13, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65c1d30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd65c1d60>], id=5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18, ip_allocation=immediate, mac_address=fa:16:3e:7c:e0:a5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:46Z, description=, dns_domain=, id=41370116-60b0-4433-ab19-12e9b7026582, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1537040395-network, port_security_enabled=True, project_id=8ce454c94ec74465ac8200d5fe0b153e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25437, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3799, status=ACTIVE, subnets=['129cbc91-ddc3-4c68-80c6-e69ac70a7c43'], tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:47Z, vlan_transparent=None, network_id=41370116-60b0-4433-ab19-12e9b7026582, port_security_enabled=False, project_id=8ce454c94ec74465ac8200d5fe0b153e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3807, status=DOWN, tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:52Z on network 41370116-60b0-4433-ab19-12e9b7026582
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.378 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.379 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:12:53 np0005538513.localdomain dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 1 addresses
Nov 28 10:12:53 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host
Nov 28 10:12:53 np0005538513.localdomain podman[330889]: 2025-11-28 10:12:53.568442903 +0000 UTC m=+0.077952992 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:12:53 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.616 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.618 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11010MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.619 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.620 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: pgmap v644: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 302 B/s rd, 130 KiB/s wr, 8 op/s
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2706457592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.701 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.702 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.702 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:12:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:53.740 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:12:53 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:53.907 261084 INFO neutron.agent.dhcp.agent [None req-c295737c-9342-4b7d-80a8-56a817df0419 - - - - - -] DHCP configuration for ports {'5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18'} is completed
Nov 28 10:12:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:12:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2576893307' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:54.196 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:12:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:54.203 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:12:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:54.235 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:12:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:54.238 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:12:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:54.238 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:12:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2576893307' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:55 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:55.315 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:52Z, description=, device_id=c6a3c308-ab33-40a1-9933-91a047698d13, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6f575e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fdcd6f57ac0>], id=5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18, ip_allocation=immediate, mac_address=fa:16:3e:7c:e0:a5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:46Z, description=, dns_domain=, id=41370116-60b0-4433-ab19-12e9b7026582, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1537040395-network, port_security_enabled=True, project_id=8ce454c94ec74465ac8200d5fe0b153e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25437, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3799, status=ACTIVE, subnets=['129cbc91-ddc3-4c68-80c6-e69ac70a7c43'], tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:47Z, vlan_transparent=None, network_id=41370116-60b0-4433-ab19-12e9b7026582, port_security_enabled=False, project_id=8ce454c94ec74465ac8200d5fe0b153e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3807, status=DOWN, tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:52Z on network 41370116-60b0-4433-ab19-12e9b7026582
Nov 28 10:12:55 np0005538513.localdomain systemd[1]: tmp-crun.dN7Ub2.mount: Deactivated successfully.
Nov 28 10:12:55 np0005538513.localdomain dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 1 addresses
Nov 28 10:12:55 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host
Nov 28 10:12:55 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts
Nov 28 10:12:55 np0005538513.localdomain podman[330950]: 2025-11-28 10:12:55.631256604 +0000 UTC m=+0.069428300 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:12:55 np0005538513.localdomain ceph-mon[292954]: pgmap v645: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 110 KiB/s wr, 6 op/s
Nov 28 10:12:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "format": "json"}]: dispatch
Nov 28 10:12:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:55 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:12:55.862 261084 INFO neutron.agent.dhcp.agent [None req-41c17d07-4157-484d-bdc1-1de8e4697e2c - - - - - -] DHCP configuration for ports {'5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18'} is completed
Nov 28 10:12:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:56.239 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:56.240 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:12:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:56.240 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:12:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:56.374 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:12:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:56.376 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:12:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:56.377 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:12:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:56.377 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1162585651' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: pgmap v646: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 110 KiB/s wr, 6 op/s
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/44064503' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.226 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.231 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.231 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.232 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.236 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.341 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.342 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:12:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:12:57.342 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:59 np0005538513.localdomain ceph-mon[292954]: pgmap v647: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 180 KiB/s wr, 11 op/s
Nov 28 10:12:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "format": "json"}]: dispatch
Nov 28 10:12:59 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:13:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:13:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:13:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:13:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:13:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:13:00 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:13:00 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:13:00 np0005538513.localdomain podman[330970]: 2025-11-28 10:13:00.84815166 +0000 UTC m=+0.083582767 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 28 10:13:00 np0005538513.localdomain podman[330970]: 2025-11-28 10:13:00.860763838 +0000 UTC m=+0.096194975 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_id=edpm, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64)
Nov 28 10:13:00 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:13:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:13:01 np0005538513.localdomain ceph-mon[292954]: pgmap v648: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 119 KiB/s wr, 7 op/s
Nov 28 10:13:01 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:13:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:13:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/713742877' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3297457012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:01 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:13:01Z|00493|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:13:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:01.151 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "format": "json"}]: dispatch
Nov 28 10:13:02 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:02.256 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:13:02 np0005538513.localdomain podman[330990]: 2025-11-28 10:13:02.846289107 +0000 UTC m=+0.079936573 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:13:02 np0005538513.localdomain podman[330990]: 2025-11-28 10:13:02.858393451 +0000 UTC m=+0.092040897 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:13:02 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:13:03 np0005538513.localdomain ceph-mon[292954]: pgmap v649: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 184 KiB/s wr, 10 op/s
Nov 28 10:13:03 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:13:03 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:03 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:03 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:13:03 np0005538513.localdomain systemd[1]: tmp-crun.GbSYPS.mount: Deactivated successfully.
Nov 28 10:13:03 np0005538513.localdomain podman[331013]: 2025-11-28 10:13:03.840981306 +0000 UTC m=+0.076237520 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:13:03 np0005538513.localdomain podman[331013]: 2025-11-28 10:13:03.846645511 +0000 UTC m=+0.081901684 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:13:03 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:13:04 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:04 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:04 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:04.406 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:05 np0005538513.localdomain ceph-mon[292954]: pgmap v650: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 135 KiB/s wr, 7 op/s
Nov 28 10:13:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:13:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:07.294 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: pgmap v651: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 135 KiB/s wr, 7 op/s
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:07 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:13:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:09 np0005538513.localdomain ceph-mon[292954]: pgmap v652: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 167 KiB/s wr, 10 op/s
Nov 28 10:13:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:13:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:13:09 np0005538513.localdomain podman[331031]: 2025-11-28 10:13:09.868322335 +0000 UTC m=+0.100242300 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:13:09 np0005538513.localdomain podman[331031]: 2025-11-28 10:13:09.883547893 +0000 UTC m=+0.115467848 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:13:09 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:13:09 np0005538513.localdomain podman[331032]: 2025-11-28 10:13:09.961694851 +0000 UTC m=+0.188314683 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_controller)
Nov 28 10:13:10 np0005538513.localdomain podman[331032]: 2025-11-28 10:13:10.026677213 +0000 UTC m=+0.253297055 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 28 10:13:10 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:13:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:13:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:13:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:13:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 10:13:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:13:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19763 "" "Go-http-client/1.1"
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: pgmap v653: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 96 KiB/s wr, 5 op/s
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:12.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:13:12 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:13:13 np0005538513.localdomain ceph-mon[292954]: pgmap v654: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 159 KiB/s wr, 8 op/s
Nov 28 10:13:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/236213188' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:13:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/236213188' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:13:15 np0005538513.localdomain ceph-mon[292954]: pgmap v655: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 94 KiB/s wr, 6 op/s
Nov 28 10:13:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:17.301 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:13:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:17.303 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:13:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:17.303 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:13:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:17.303 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:13:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:17.340 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:17.341 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:13:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:17.344 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: pgmap v656: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 94 KiB/s wr, 6 op/s
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:17 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:13:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:13:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:13:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:13:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:13:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:13:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:18.753 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:13:18 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:13:18 np0005538513.localdomain podman[331076]: 2025-11-28 10:13:18.863671411 +0000 UTC m=+0.083114682 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:13:18 np0005538513.localdomain podman[331076]: 2025-11-28 10:13:18.90224865 +0000 UTC m=+0.121691901 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:13:18 np0005538513.localdomain podman[331075]: 2025-11-28 10:13:18.919421939 +0000 UTC m=+0.142722178 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:13:18 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:13:18 np0005538513.localdomain podman[331075]: 2025-11-28 10:13:18.958463632 +0000 UTC m=+0.181763861 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:13:18 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:13:19 np0005538513.localdomain ceph-mon[292954]: pgmap v657: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 96 KiB/s wr, 6 op/s
Nov 28 10:13:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:13:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:20 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:21 np0005538513.localdomain dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 0 addresses
Nov 28 10:13:21 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host
Nov 28 10:13:21 np0005538513.localdomain dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts
Nov 28 10:13:21 np0005538513.localdomain podman[331133]: 2025-11-28 10:13:21.380694947 +0000 UTC m=+0.070767452 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:13:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:21.574 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:21 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:13:21Z|00494|binding|INFO|Releasing lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e from this chassis (sb_readonly=0)
Nov 28 10:13:21 np0005538513.localdomain kernel: device tap1738dc88-e5 left promiscuous mode
Nov 28 10:13:21 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:13:21Z|00495|binding|INFO|Setting lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e down in Southbound
Nov 28 10:13:21 np0005538513.localdomain ceph-mon[292954]: pgmap v658: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:13:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:21.585 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ce454c94ec74465ac8200d5fe0b153e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1903e82-d4b6-46fe-9759-cfad1c16d3d4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>], logical_port=1738dc88-e5f5-4680-ab9e-8f550f7bd83e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fba9777a730>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:13:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:21.593 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1738dc88-e5f5-4680-ab9e-8f550f7bd83e in datapath 41370116-60b0-4433-ab19-12e9b7026582 unbound from our chassis
Nov 28 10:13:21 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:21.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:21.596 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41370116-60b0-4433-ab19-12e9b7026582, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:13:21 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:21.599 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b4807ed4-84b9-456a-bcc6-3671c6802d38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:13:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:22.343 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:22 np0005538513.localdomain ceph-mon[292954]: pgmap v659: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 128 KiB/s wr, 7 op/s
Nov 28 10:13:23 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:13:23Z|00496|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:13:23 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:23.367 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:23 np0005538513.localdomain dnsmasq[330849]: exiting on receipt of SIGTERM
Nov 28 10:13:23 np0005538513.localdomain podman[331172]: 2025-11-28 10:13:23.742094046 +0000 UTC m=+0.062686102 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:13:23 np0005538513.localdomain systemd[1]: libpod-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79.scope: Deactivated successfully.
Nov 28 10:13:23 np0005538513.localdomain podman[331185]: 2025-11-28 10:13:23.817943413 +0000 UTC m=+0.061119833 container died fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:13:23 np0005538513.localdomain systemd[1]: tmp-crun.qG84Du.mount: Deactivated successfully.
Nov 28 10:13:23 np0005538513.localdomain podman[331185]: 2025-11-28 10:13:23.861335721 +0000 UTC m=+0.104512091 container cleanup fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:13:23 np0005538513.localdomain systemd[1]: libpod-conmon-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79.scope: Deactivated successfully.
Nov 28 10:13:23 np0005538513.localdomain podman[331193]: 2025-11-28 10:13:23.909095982 +0000 UTC m=+0.137699204 container remove fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:13:23 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:13:23.939 261084 INFO neutron.agent.dhcp.agent [None req-05e642a3-4c7a-488c-a34c-e73794abd662 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:13:24 np0005538513.localdomain neutron_dhcp_agent[261080]: 2025-11-28 10:13:24.140 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:13:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay-f48131d4cd50d23f0a08edebbccc523a267ca2a1360891739e8c240e1e9f0859-merged.mount: Deactivated successfully.
Nov 28 10:13:24 np0005538513.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79-userdata-shm.mount: Deactivated successfully.
Nov 28 10:13:24 np0005538513.localdomain systemd[1]: run-netns-qdhcp\x2d41370116\x2d60b0\x2d4433\x2dab19\x2d12e9b7026582.mount: Deactivated successfully.
Nov 28 10:13:25 np0005538513.localdomain ceph-mon[292954]: pgmap v660: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 4 op/s
Nov 28 10:13:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:13:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:13:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "format": "json"}]: dispatch
Nov 28 10:13:26 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:13:26Z|00497|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:13:26 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:26.549 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:27 np0005538513.localdomain ceph-mon[292954]: pgmap v661: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 4 op/s
Nov 28 10:13:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:27.345 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:28 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} v 0)
Nov 28 10:13:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch
Nov 28 10:13:28 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]}]': finished
Nov 28 10:13:29 np0005538513.localdomain ceph-mon[292954]: pgmap v662: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 79 KiB/s wr, 5 op/s
Nov 28 10:13:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch
Nov 28 10:13:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch
Nov 28 10:13:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]}]': finished
Nov 28 10:13:29 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:31 np0005538513.localdomain ceph-mon[292954]: pgmap v663: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 77 KiB/s wr, 4 op/s
Nov 28 10:13:31 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:13:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:31 np0005538513.localdomain podman[331215]: 2025-11-28 10:13:31.856468979 +0000 UTC m=+0.087715693 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal)
Nov 28 10:13:31 np0005538513.localdomain podman[331215]: 2025-11-28 10:13:31.874408301 +0000 UTC m=+0.105655025 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Nov 28 10:13:31 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:13:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:32.349 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} v 0)
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]}]': finished
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.672403) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812672426, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 2361, "num_deletes": 259, "total_data_size": 2545114, "memory_usage": 2724472, "flush_reason": "Manual Compaction"}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812685606, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 2498832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36663, "largest_seqno": 39023, "table_properties": {"data_size": 2488792, "index_size": 6097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25107, "raw_average_key_size": 22, "raw_value_size": 2467168, "raw_average_value_size": 2175, "num_data_blocks": 263, "num_entries": 1134, "num_filter_entries": 1134, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324696, "oldest_key_time": 1764324696, "file_creation_time": 1764324812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 13275 microseconds, and 4723 cpu microseconds.
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.685666) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 2498832 bytes OK
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.685695) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689053) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689074) EVENT_LOG_v1 {"time_micros": 1764324812689068, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689093) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 2534388, prev total WAL file size 2534388, number of live WAL files 2.
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689859) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(2440KB)], [66(17MB)]
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812689917, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 21214215, "oldest_snapshot_seqno": -1}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14366 keys, 19606922 bytes, temperature: kUnknown
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812794364, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 19606922, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19522694, "index_size": 47199, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35973, "raw_key_size": 383926, "raw_average_key_size": 26, "raw_value_size": 19276866, "raw_average_value_size": 1341, "num_data_blocks": 1771, "num_entries": 14366, "num_filter_entries": 14366, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.794752) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 19606922 bytes
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.796466) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.9 rd, 187.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 17.8 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(16.3) write-amplify(7.8) OK, records in: 14907, records dropped: 541 output_compression: NoCompression
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.796496) EVENT_LOG_v1 {"time_micros": 1764324812796482, "job": 40, "event": "compaction_finished", "compaction_time_micros": 104558, "compaction_time_cpu_micros": 55838, "output_level": 6, "num_output_files": 1, "total_output_size": 19606922, "num_input_records": 14907, "num_output_records": 14366, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812797044, "job": 40, "event": "table_file_deletion", "file_number": 68}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812799616, "job": 40, "event": "table_file_deletion", "file_number": 66}
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799734) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: pgmap v664: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 112 KiB/s wr, 6 op/s
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch
Nov 28 10:13:32 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]}]': finished
Nov 28 10:13:33 np0005538513.localdomain sudo[331234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:13:33 np0005538513.localdomain sudo[331234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:33 np0005538513.localdomain sudo[331234]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:33 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:13:33 np0005538513.localdomain sudo[331252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 10:13:33 np0005538513.localdomain sudo[331252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:33 np0005538513.localdomain podman[331253]: 2025-11-28 10:13:33.823152147 +0000 UTC m=+0.088384105 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:13:33 np0005538513.localdomain podman[331253]: 2025-11-28 10:13:33.837482089 +0000 UTC m=+0.102714077 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:13:33 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:13:33 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:13:34 np0005538513.localdomain sudo[331252]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:13:34 np0005538513.localdomain systemd[1]: tmp-crun.ilbtCv.mount: Deactivated successfully.
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:34 np0005538513.localdomain podman[331308]: 2025-11-28 10:13:34.261988179 +0000 UTC m=+0.087595831 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:13:34 np0005538513.localdomain podman[331308]: 2025-11-28 10:13:34.270363017 +0000 UTC m=+0.095970729 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:34 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:13:34 np0005538513.localdomain sudo[331332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:13:34 np0005538513.localdomain sudo[331332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:34 np0005538513.localdomain sudo[331332]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:34 np0005538513.localdomain sudo[331350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:13:34 np0005538513.localdomain sudo[331350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:35 np0005538513.localdomain sudo[331350]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: pgmap v665: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538513.localdomain sudo[331399]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:13:35 np0005538513.localdomain sudo[331399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:35 np0005538513.localdomain sudo[331399]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0)
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Nov 28 10:13:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:36 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:36.548 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:37 np0005538513.localdomain ceph-mon[292954]: pgmap v666: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:13:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:37.352 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:37.635 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:13:37 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:37.636 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:13:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:37.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:38 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:38.638 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:13:39 np0005538513.localdomain ceph-mon[292954]: pgmap v667: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 101 KiB/s wr, 5 op/s
Nov 28 10:13:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:13:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:13:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:13:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:13:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:13:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19275 "" "Go-http-client/1.1"
Nov 28 10:13:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:13:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:13:40 np0005538513.localdomain podman[331417]: 2025-11-28 10:13:40.854262143 +0000 UTC m=+0.092570433 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:13:40 np0005538513.localdomain podman[331418]: 2025-11-28 10:13:40.903391846 +0000 UTC m=+0.137961541 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:13:40 np0005538513.localdomain podman[331417]: 2025-11-28 10:13:40.918469391 +0000 UTC m=+0.156777721 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:13:40 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:13:40 np0005538513.localdomain podman[331418]: 2025-11-28 10:13:40.976481658 +0000 UTC m=+0.211051343 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:13:40 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:13:41 np0005538513.localdomain ceph-mon[292954]: pgmap v668: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 87 KiB/s wr, 4 op/s
Nov 28 10:13:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1f8e6c04-5771-4f46-846b-71f913803117", "format": "json"}]: dispatch
Nov 28 10:13:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:42.355 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:42.504 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:42 np0005538513.localdomain ceph-mon[292954]: pgmap v669: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 114 KiB/s wr, 5 op/s
Nov 28 10:13:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "format": "json"}]: dispatch
Nov 28 10:13:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:44 np0005538513.localdomain ceph-mon[292954]: pgmap v670: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s wr, 3 op/s
Nov 28 10:13:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:47 np0005538513.localdomain ceph-mon[292954]: pgmap v671: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s wr, 3 op/s
Nov 28 10:13:47 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:13:47Z|00498|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 10:13:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:47.117 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:47.358 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:47.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:13:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:13:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:13:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:13:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:13:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:13:49 np0005538513.localdomain ceph-mon[292954]: pgmap v672: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 112 KiB/s wr, 5 op/s
Nov 28 10:13:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:13:49 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:13:49 np0005538513.localdomain podman[331465]: 2025-11-28 10:13:49.882996242 +0000 UTC m=+0.115356775 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:13:49 np0005538513.localdomain podman[331465]: 2025-11-28 10:13:49.923483489 +0000 UTC m=+0.155844042 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 10:13:49 np0005538513.localdomain systemd[1]: tmp-crun.TJpQ4m.mount: Deactivated successfully.
Nov 28 10:13:49 np0005538513.localdomain podman[331464]: 2025-11-28 10:13:49.939112271 +0000 UTC m=+0.173705743 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:13:49 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:13:49 np0005538513.localdomain podman[331464]: 2025-11-28 10:13:49.973411538 +0000 UTC m=+0.208004990 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:13:49 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:13:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:50.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:50.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:13:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:50.850 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:13:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:50.851 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:13:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:13:50.851 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:13:51 np0005538513.localdomain ceph-mon[292954]: pgmap v673: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 60 KiB/s wr, 2 op/s
Nov 28 10:13:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:51.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:52.361 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:52.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:52.893 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:13:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:52.894 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:13:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:52.894 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:13:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:52.895 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:13:52 np0005538513.localdomain ceph-mon[292954]: pgmap v674: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 60 KiB/s wr, 3 op/s
Nov 28 10:13:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:52.895 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:13:53 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:13:53 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3360059162' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.375 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.507 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.508 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.741 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.743 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11004MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.744 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.744 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.809 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.810 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.810 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:13:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:53.841 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:13:53 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3360059162' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:13:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2441468294' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:54.314 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:13:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:54.319 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:13:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:54.459 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:13:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:54.462 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:13:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:54.462 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:13:54 np0005538513.localdomain ceph-mon[292954]: pgmap v675: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 33 KiB/s wr, 2 op/s
Nov 28 10:13:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2441468294' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:55.458 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:55.459 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:55.459 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:55.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:55.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:13:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:55.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:13:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:13:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "format": "json"}]: dispatch
Nov 28 10:13:55 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:13:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:56.189 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:13:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:56.189 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:13:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:56.190 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:13:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:56.190 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:13:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:56.589 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:13:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:56.609 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:13:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:56.609 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:13:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:56 np0005538513.localdomain ceph-mon[292954]: pgmap v676: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 33 KiB/s wr, 2 op/s
Nov 28 10:13:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:13:57.364 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:57 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2582566830' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:58 np0005538513.localdomain ceph-mon[292954]: pgmap v677: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 46 KiB/s wr, 3 op/s
Nov 28 10:13:58 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1398638247' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:14:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:00 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "format": "json"}]: dispatch
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.677 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.682 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d868a28-b4de-498a-8330-bd01da89c264', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.678540', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f57a4cd8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '0e9a66ed60666ebaf463052fa4504659386944ba1e65b876c2943beeb9fd3049'}]}, 'timestamp': '2025-11-28 10:14:00.682969', '_unique_id': '946597200112445c85504c9b03a2796a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.685 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.704 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcb010ef-e19f-488e-bd2a-34ff4b0a5bf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:14:00.685940', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f57dc110-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.876722411, 'message_signature': 'dc8a7112a8a0d9344ae9363e88d07d63f9445151167f97a97e468037086db9b9'}]}, 'timestamp': '2025-11-28 10:14:00.705544', '_unique_id': 'c5fe267d912345e0a97fb2f51af99957'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.707 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.707 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 19480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef96c593-ef44-4bfb-b761-7da0285c98c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19480000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:14:00.707843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f57e34ce-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.876722411, 'message_signature': 'e8abcbf78fee2041a703fc228c5c8601872cf45d9fe16a07135f972aa47a2ffe'}]}, 'timestamp': '2025-11-28 10:14:00.708552', '_unique_id': 'e4b4cf76f4134883bf345453ecb2a92b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ee8c098-8428-4ef1-98cc-ad85a1927530', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.711907', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f5834a4a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'df459dfa72d4ee6510592829158712171d7bc183b306600b5d308864c68cf2d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.711907', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5835a94-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'f8eed7b15760973bc9fdaceeba11be1733395805a67793569d75bf220029f93c'}]}, 'timestamp': '2025-11-28 10:14:00.742214', '_unique_id': 'b529ff80d4774d528e162e819294c3b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.744 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.744 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.744 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3c0c199-94fd-467f-8cd5-15697eee34c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.744459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f583c538-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '4960ce9d0bde6ea33ca11c76502c36567dd206dc949cd08a97256621dd70a4e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.744459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f583d708-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '2dd80774e285e25a823cabf8b39fd95ec14d60df7e09bfd272caf3628e26b782'}]}, 'timestamp': '2025-11-28 10:14:00.745342', '_unique_id': 'a16028973ab64946ae5ab6361f3eff8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a5178e3-9e3e-41c9-b5fa-77fc2c918466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.747452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58439aa-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '3e3b18363832cb2dcc32acef006134de0c57968da36a978a7373e551228e967f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.747452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5844a58-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'a1a95370a42b972887813a30e03d4e995ec8805ec0ccc9231b4e2da4be62895a'}]}, 'timestamp': '2025-11-28 10:14:00.748292', '_unique_id': 'bae1250ddee3415a9139fed83b1f8395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cde4e7c7-c2aa-4500-a861-30e005d177ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.750359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f584ac28-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'b6de245ec2c1e9a3a1f6a27e085966367ef5fec1afeab7ebe4db50d0ad3bdf5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.750359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f584bbaa-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '8d2609cb5b0b57c0bff55bc4c168a3a959819d2e1dabd3fb54fdc495fd552d9b'}]}, 'timestamp': '2025-11-28 10:14:00.751223', '_unique_id': 'c654cee9c45d4755bb576398b3b7a16b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64fd8631-5e19-4aeb-9352-c309f26b16d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.753270', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5851d0c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '40735ca1ac35e761c263355040fa23096262f8eb4bbbdf8401a4670f1a6cedb0'}]}, 'timestamp': '2025-11-28 10:14:00.753713', '_unique_id': '9bda825ea1d443c3860453da8da30387'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.755 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73ddfabd-c2a6-4105-85ab-e42afdf5a2e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.755791', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5858152-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '12a5a2c6bc181f52b91e118db65d480d3b0586abd5fd39092492f9232c66658e'}]}, 'timestamp': '2025-11-28 10:14:00.756301', '_unique_id': '4bba39267bb447a798e520e5116ca556'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.758 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7b788b5-4031-4b4e-a8f8-31cdf42d7293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.758503', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f585ea3e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': 'a5b2fbdfeebecb3e5bebf4e90fc4d076749ba6e3a9e0eef8e17762ce381013d7'}]}, 'timestamp': '2025-11-28 10:14:00.758967', '_unique_id': '5d00f9acb34b4589b50acb4edfbb1d50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaa95f89-a1bb-4f85-8a36-3cf3872371f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.760993', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5864bdc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '886e8b00e874328799d7b8b509f751c77eb32e039466b24fb1ae23bb3eb6acae'}]}, 'timestamp': '2025-11-28 10:14:00.761463', '_unique_id': '785626421b6e42d494ceda490f07bfde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.763 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acfb2f54-dc0f-43e2-8c02-40343f70b959', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.763499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f586ac4e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '33be5fb1b45144f6528e2447108167a385830d7f8ea9c4ecf0af090d956300d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.763499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f586bd2e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '0d2c040b1ef393fc8b35d8c8da5ab99839ce239ef9022c50cb1dc5fe017e274e'}]}, 'timestamp': '2025-11-28 10:14:00.764339', '_unique_id': '93acf28e807f49d796f5fa71dd05cb6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57499fb7-7a3c-43b8-86c9-3158e5ced404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.766384', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5871ef4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '8b29bb967bbb3a95f6a42471df4199b54c3fbc0775639748ade3611d2e9ff807'}]}, 'timestamp': '2025-11-28 10:14:00.766872', '_unique_id': '57bbc126314f44919b156bd4133fbecd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23dae271-5a5a-4254-b98e-a0e527836da2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.768871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f5892870-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': 'd9701a2cf6e20e5218867b33b6424adfad31e39c8f01c0dd29dfebe528552a14'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.768871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5893b1c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': 'a4d6aca46b55527bdfed4b898eae3814cfb54de4c7cd3c4014879d6526ffa889'}]}, 'timestamp': '2025-11-28 10:14:00.780676', '_unique_id': '63242bc9266e400f8a45e9cc1a65f369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0f41099-8891-4996-a3f3-33dc114138f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.782876', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f589a372-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '8f3763223d28785812c9c2c9a9dc8f0b0b96042ea351d2c07fe5d0601f7f0331'}]}, 'timestamp': '2025-11-28 10:14:00.783371', '_unique_id': '2423626ab29a4db4b2747c50e53331e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d568b5c-c0b4-45a5-876c-0e05d2255955', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.785385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58a038a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': 'c95c6e8131729527a1a9f8a5a9fc24eb1ee6bea3f4c86c77bb9ddc556c13e913'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.785385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f58a1320-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': '474212aaafd6b616a1b82196d6e8615e1ff6889c831c949a6b5350dd878a672c'}]}, 'timestamp': '2025-11-28 10:14:00.786228', '_unique_id': '4f3e46ef535643e3915f9e2f658fc0eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aff9107c-d3cf-43e0-b851-146423c5049f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.788306', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f58a759a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '19ce452baebe69fe7718333fe163db4002b1284eac99d386cac8e2be294c4c54'}]}, 'timestamp': '2025-11-28 10:14:00.788755', '_unique_id': 'e09cbc05208f40398702bc3f06a16c51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.790 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be1494b7-444d-497c-ac0e-70b6342ff71f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.790920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58add78-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '631f6fa29642b3b956e9a82f8b9613d7e01befc0e4d09509f0950f18a1bcffef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.790920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f58aed68-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '7b6de110ab609e0dd9d3b2870cf3a44ed0f25f807ca9ed188670defc17e55a3a'}]}, 'timestamp': '2025-11-28 10:14:00.791786', '_unique_id': 'c382b1f5c7884190a2cfa3884cafc2c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.793 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b8e0390-9842-4ed9-9a0c-743f007d8cd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.794101', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f58b5848-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': 'f709d8b0a41685d2f2249685b9b051a0ecd71a5564a4965a37179001955e33ca'}]}, 'timestamp': '2025-11-28 10:14:00.794575', '_unique_id': '0f96516ee9b643689d1e5fae3c8728eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2e3558c-9f41-4755-9f8a-901833a3082a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.796590', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f58bb914-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': 'bb693e1c9aa8a036838fcf2885c179c3dcbb9bf64d1884f7fe57e1b9d6d43692'}]}, 'timestamp': '2025-11-28 10:14:00.797061', '_unique_id': '198a2a6f893f41afaa2be52c61c716a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '262a87b5-9948-4ca8-8ff6-b742ec38acc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.799285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58c2232-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': '90d09fd1a461ff47df31fa875283422a3a7740a4b7ab4c002a58866890ab78cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.799285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f58c31dc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': '6ff5ace67115f77e36d3a1775e09907d6d54827fda0150a07193f3ccf16c1f92'}]}, 'timestamp': '2025-11-28 10:14:00.800122', '_unique_id': '3fb142321a1e4e63ba4818e2b2be853f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:14:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:14:01 np0005538513.localdomain ceph-mon[292954]: pgmap v678: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Nov 28 10:14:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:02.366 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:02.604 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:02 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:14:02 np0005538513.localdomain systemd[1]: tmp-crun.K9V3gP.mount: Deactivated successfully.
Nov 28 10:14:02 np0005538513.localdomain podman[331551]: 2025-11-28 10:14:02.852560746 +0000 UTC m=+0.087278100 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350)
Nov 28 10:14:02 np0005538513.localdomain podman[331551]: 2025-11-28 10:14:02.868820788 +0000 UTC m=+0.103538182 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, distribution-scope=public, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter)
Nov 28 10:14:02 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:14:02 np0005538513.localdomain ceph-mon[292954]: pgmap v679: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s wr, 1 op/s
Nov 28 10:14:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1439035389' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2102988728' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:14:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:14:04 np0005538513.localdomain podman[331573]: 2025-11-28 10:14:04.851626694 +0000 UTC m=+0.084039481 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:14:04 np0005538513.localdomain podman[331573]: 2025-11-28 10:14:04.856892106 +0000 UTC m=+0.089304873 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:14:04 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:14:04 np0005538513.localdomain systemd[1]: tmp-crun.vmWrMA.mount: Deactivated successfully.
Nov 28 10:14:04 np0005538513.localdomain ceph-mon[292954]: pgmap v680: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s wr, 1 op/s
Nov 28 10:14:04 np0005538513.localdomain podman[331572]: 2025-11-28 10:14:04.952152851 +0000 UTC m=+0.188994055 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:14:04 np0005538513.localdomain podman[331572]: 2025-11-28 10:14:04.964389858 +0000 UTC m=+0.201231032 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:14:04 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:14:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:07 np0005538513.localdomain ceph-mon[292954]: pgmap v681: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s wr, 1 op/s
Nov 28 10:14:07 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:14:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:07.368 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:08 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "format": "json"}]: dispatch
Nov 28 10:14:09 np0005538513.localdomain ceph-mon[292954]: pgmap v682: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s wr, 2 op/s
Nov 28 10:14:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:14:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:14:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:14:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:14:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:14:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19283 "" "Go-http-client/1.1"
Nov 28 10:14:11 np0005538513.localdomain ceph-mon[292954]: pgmap v683: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:14:11 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:11 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:14:11 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:14:11 np0005538513.localdomain podman[331612]: 2025-11-28 10:14:11.874879967 +0000 UTC m=+0.108391871 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:14:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:11 np0005538513.localdomain podman[331612]: 2025-11-28 10:14:11.915537431 +0000 UTC m=+0.149049305 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:14:11 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:14:11 np0005538513.localdomain podman[331613]: 2025-11-28 10:14:11.932870094 +0000 UTC m=+0.162352663 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:14:11 np0005538513.localdomain podman[331613]: 2025-11-28 10:14:11.995887236 +0000 UTC m=+0.225369805 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:14:12 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:14:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:12.372 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:12 np0005538513.localdomain ceph-mon[292954]: pgmap v684: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 75 KiB/s wr, 3 op/s
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4282388622' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/4282388622' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:13 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:14 np0005538513.localdomain ceph-mon[292954]: pgmap v685: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:14:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:17 np0005538513.localdomain ceph-mon[292954]: pgmap v686: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:14:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:17.374 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:14:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:14:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:14:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:14:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:14:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:14:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:18 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:18 np0005538513.localdomain ovn_controller[152322]: 2025-11-28T10:14:18Z|00499|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 28 10:14:19 np0005538513.localdomain ceph-mon[292954]: pgmap v687: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 94 KiB/s wr, 3 op/s
Nov 28 10:14:20 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} v 0)
Nov 28 10:14:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch
Nov 28 10:14:20 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"}]': finished
Nov 28 10:14:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:14:20 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:14:20 np0005538513.localdomain podman[331656]: 2025-11-28 10:14:20.851091598 +0000 UTC m=+0.084347879 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:14:20 np0005538513.localdomain podman[331656]: 2025-11-28 10:14:20.884166418 +0000 UTC m=+0.117422679 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:14:20 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:14:20 np0005538513.localdomain podman[331657]: 2025-11-28 10:14:20.904207365 +0000 UTC m=+0.135791805 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 10:14:20 np0005538513.localdomain podman[331657]: 2025-11-28 10:14:20.9397401 +0000 UTC m=+0.171324500 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:14:20 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: pgmap v688: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 55 KiB/s wr, 2 op/s
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "format": "json"}]: dispatch
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"}]': finished
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "format": "json"}]: dispatch
Nov 28 10:14:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:22.378 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:22 np0005538513.localdomain ceph-mon[292954]: pgmap v689: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s wr, 4 op/s
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0)
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Nov 28 10:14:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:24 np0005538513.localdomain ceph-mon[292954]: pgmap v690: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s wr, 3 op/s
Nov 28 10:14:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:27 np0005538513.localdomain ceph-mon[292954]: pgmap v691: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s wr, 3 op/s
Nov 28 10:14:27 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Nov 28 10:14:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:27.380 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:28 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "admin", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:29 np0005538513.localdomain ceph-mon[292954]: pgmap v692: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 100 KiB/s wr, 5 op/s
Nov 28 10:14:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:14:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:31 np0005538513.localdomain ceph-mon[292954]: pgmap v693: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:31 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:31 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:32.384 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:32 np0005538513.localdomain ceph-mon[292954]: pgmap v694: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 98 KiB/s wr, 5 op/s
Nov 28 10:14:34 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:34 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:14:34 np0005538513.localdomain podman[331698]: 2025-11-28 10:14:34.188771923 +0000 UTC m=+0.072389392 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, version=9.6)
Nov 28 10:14:34 np0005538513.localdomain podman[331698]: 2025-11-28 10:14:34.202374152 +0000 UTC m=+0.085991611 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:14:34 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:14:35 np0005538513.localdomain ceph-mon[292954]: pgmap v695: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "format": "json"}]: dispatch
Nov 28 10:14:35 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:14:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:14:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:14:35 np0005538513.localdomain podman[331720]: 2025-11-28 10:14:35.499952894 +0000 UTC m=+0.089781567 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 10:14:35 np0005538513.localdomain podman[331720]: 2025-11-28 10:14:35.532003362 +0000 UTC m=+0.121832045 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 28 10:14:35 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:14:35 np0005538513.localdomain podman[331719]: 2025-11-28 10:14:35.556356592 +0000 UTC m=+0.148979581 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:14:35 np0005538513.localdomain podman[331719]: 2025-11-28 10:14:35.564226675 +0000 UTC m=+0.156849664 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:14:35 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:14:35 np0005538513.localdomain sudo[331759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:14:35 np0005538513.localdomain sudo[331759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:14:35 np0005538513.localdomain sudo[331759]: pam_unix(sudo:session): session closed for user root
Nov 28 10:14:35 np0005538513.localdomain sudo[331777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:14:35 np0005538513.localdomain sudo[331777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:14:36 np0005538513.localdomain sudo[331777]: pam_unix(sudo:session): session closed for user root
Nov 28 10:14:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:14:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:14:36 np0005538513.localdomain sudo[331827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:14:36 np0005538513.localdomain sudo[331827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:14:36 np0005538513.localdomain sudo[331827]: pam_unix(sudo:session): session closed for user root
Nov 28 10:14:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:37 np0005538513.localdomain ceph-mon[292954]: pgmap v696: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:14:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:14:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:14:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:14:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:37 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:37.386 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:14:39 np0005538513.localdomain ceph-mon[292954]: pgmap v697: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 95 KiB/s wr, 5 op/s
Nov 28 10:14:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:14:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:14:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:14:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:14:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:14:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19276 "" "Go-http-client/1.1"
Nov 28 10:14:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:14:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:14:41 np0005538513.localdomain ceph-mon[292954]: pgmap v698: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 59 KiB/s wr, 2 op/s
Nov 28 10:14:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:41 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:14:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:42.390 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:14:42 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:14:42 np0005538513.localdomain podman[331846]: 2025-11-28 10:14:42.850853632 +0000 UTC m=+0.079573752 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 10:14:42 np0005538513.localdomain podman[331846]: 2025-11-28 10:14:42.893895979 +0000 UTC m=+0.122616109 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:14:42 np0005538513.localdomain podman[331845]: 2025-11-28 10:14:42.911220973 +0000 UTC m=+0.141997587 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 10:14:42 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:14:42 np0005538513.localdomain ceph-mon[292954]: pgmap v699: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 68 KiB/s wr, 3 op/s
Nov 28 10:14:42 np0005538513.localdomain podman[331845]: 2025-11-28 10:14:42.95365962 +0000 UTC m=+0.184436294 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:14:42 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0)
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Nov 28 10:14:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:44 np0005538513.localdomain ceph-mon[292954]: pgmap v700: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s wr, 1 op/s
Nov 28 10:14:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:47 np0005538513.localdomain ceph-mon[292954]: pgmap v701: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s wr, 1 op/s
Nov 28 10:14:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:47.393 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:14:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:14:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:14:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:14:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:14:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:14:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:14:48 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "format": "json"}]: dispatch
Nov 28 10:14:48 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:49 np0005538513.localdomain ceph-mon[292954]: pgmap v702: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 67 KiB/s wr, 3 op/s
Nov 28 10:14:49 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:49.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:50.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:14:50.852 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:14:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:14:50.852 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:14:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:14:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:14:51 np0005538513.localdomain ceph-mon[292954]: pgmap v703: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s wr, 2 op/s
Nov 28 10:14:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "format": "json"}]: dispatch
Nov 28 10:14:51 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:14:51 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:14:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:51.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:51.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:14:51 np0005538513.localdomain podman[331887]: 2025-11-28 10:14:51.849623377 +0000 UTC m=+0.081455741 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 10:14:51 np0005538513.localdomain podman[331887]: 2025-11-28 10:14:51.864475114 +0000 UTC m=+0.096307458 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:14:51 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:14:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:51 np0005538513.localdomain podman[331886]: 2025-11-28 10:14:51.945325236 +0000 UTC m=+0.179456261 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:14:51 np0005538513.localdomain podman[331886]: 2025-11-28 10:14:51.978619092 +0000 UTC m=+0.212750117 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:14:51 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:14:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:52.397 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:52 np0005538513.localdomain ceph-mon[292954]: pgmap v704: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 73 KiB/s wr, 3 op/s
Nov 28 10:14:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:53.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:53.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:14:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:53.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:14:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:53.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:14:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:53.793 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:14:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:53.794 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:14:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "format": "json"}]: dispatch
Nov 28 10:14:53 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:54 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:14:54 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2621744934' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.242 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.300 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.301 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.510 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.511 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=10996MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.511 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.511 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.584 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.584 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.585 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:14:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:54.635 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:14:54 np0005538513.localdomain ceph-mon[292954]: pgmap v705: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:14:54 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2621744934' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:55 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:14:55 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3985923543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:55.110 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:14:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:55.117 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:14:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:55.141 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:14:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:55.144 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:14:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:55.144 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:14:56 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3985923543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:57 np0005538513.localdomain ceph-mon[292954]: pgmap v706: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:14:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "admin", "format": "json"}]: dispatch
Nov 28 10:14:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "format": "json"}]: dispatch
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.141 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.142 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.142 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.399 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.401 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.402 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.405 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.773 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:14:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:57.773 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:14:58 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:58.255 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:14:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:58.256 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:14:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:58.256 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:14:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:58.257 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:14:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:58.676 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:14:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:58.691 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:14:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:14:58.691 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:14:59 np0005538513.localdomain ceph-mon[292954]: pgmap v707: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 106 KiB/s wr, 5 op/s
Nov 28 10:15:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/149668244' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:15:00.754 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:15:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:00.754 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:00 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:15:00.755 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:15:01 np0005538513.localdomain ceph-mon[292954]: pgmap v708: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 75 KiB/s wr, 3 op/s
Nov 28 10:15:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1184699795' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:02.402 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:02.406 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:02 np0005538513.localdomain ceph-mon[292954]: pgmap v709: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 77 KiB/s wr, 5 op/s
Nov 28 10:15:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1417585220' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:03 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2960680827' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:04 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:15:04 np0005538513.localdomain systemd[1]: tmp-crun.PUi2x2.mount: Deactivated successfully.
Nov 28 10:15:04 np0005538513.localdomain podman[331972]: 2025-11-28 10:15:04.849601801 +0000 UTC m=+0.084394793 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:15:04 np0005538513.localdomain podman[331972]: 2025-11-28 10:15:04.865517704 +0000 UTC m=+0.100310696 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Nov 28 10:15:04 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:15:04 np0005538513.localdomain ceph-mon[292954]: pgmap v710: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 44 KiB/s wr, 3 op/s
Nov 28 10:15:05 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:15:05.758 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:15:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:15:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:15:05 np0005538513.localdomain systemd[1]: tmp-crun.olZtNb.mount: Deactivated successfully.
Nov 28 10:15:05 np0005538513.localdomain podman[331992]: 2025-11-28 10:15:05.839983307 +0000 UTC m=+0.068316306 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:15:05 np0005538513.localdomain podman[331991]: 2025-11-28 10:15:05.900415157 +0000 UTC m=+0.128611822 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:15:05 np0005538513.localdomain podman[331991]: 2025-11-28 10:15:05.911552782 +0000 UTC m=+0.139749487 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:15:05 np0005538513.localdomain podman[331992]: 2025-11-28 10:15:05.926381182 +0000 UTC m=+0.154714231 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:15:05 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:15:05 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:15:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:07 np0005538513.localdomain ceph-mon[292954]: pgmap v711: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 44 KiB/s wr, 3 op/s
Nov 28 10:15:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:07.404 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:07.408 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:09 np0005538513.localdomain ceph-mon[292954]: pgmap v712: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 60 KiB/s wr, 3 op/s
Nov 28 10:15:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:15:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:15:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:15:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:15:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:15:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19285 "" "Go-http-client/1.1"
Nov 28 10:15:11 np0005538513.localdomain ceph-mon[292954]: pgmap v713: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 19 KiB/s wr, 1 op/s
Nov 28 10:15:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:12.407 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:12 np0005538513.localdomain ceph-mon[292954]: pgmap v714: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 19 KiB/s wr, 1 op/s
Nov 28 10:15:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:15:13 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:15:13 np0005538513.localdomain podman[332031]: 2025-11-28 10:15:13.852791003 +0000 UTC m=+0.087901892 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:15:13 np0005538513.localdomain podman[332032]: 2025-11-28 10:15:13.900942663 +0000 UTC m=+0.131226013 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:15:13 np0005538513.localdomain podman[332031]: 2025-11-28 10:15:13.919391814 +0000 UTC m=+0.154502703 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 10:15:13 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:15:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1275385607' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:15:13 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/1275385607' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:15:13 np0005538513.localdomain podman[332032]: 2025-11-28 10:15:13.994811959 +0000 UTC m=+0.225095249 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 10:15:14 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:15:15 np0005538513.localdomain ceph-mon[292954]: pgmap v715: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:17 np0005538513.localdomain ceph-mon[292954]: pgmap v716: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:17.410 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:15:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:15:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:15:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:15:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:15:19 np0005538513.localdomain ceph-mon[292954]: pgmap v717: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "format": "json"}]: dispatch
Nov 28 10:15:19 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:21 np0005538513.localdomain ceph-mon[292954]: pgmap v718: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:15:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:22 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:22.412 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:15:22 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:15:22 np0005538513.localdomain podman[332075]: 2025-11-28 10:15:22.854661807 +0000 UTC m=+0.082394830 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:15:22 np0005538513.localdomain podman[332075]: 2025-11-28 10:15:22.86829704 +0000 UTC m=+0.096030063 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:15:22 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:15:22 np0005538513.localdomain systemd[1]: tmp-crun.MNzzRV.mount: Deactivated successfully.
Nov 28 10:15:22 np0005538513.localdomain podman[332074]: 2025-11-28 10:15:22.969213743 +0000 UTC m=+0.199908258 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:15:22 np0005538513.localdomain podman[332074]: 2025-11-28 10:15:22.979580114 +0000 UTC m=+0.210274609 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:15:22 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:15:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:23 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "format": "json"}]: dispatch
Nov 28 10:15:23 np0005538513.localdomain ceph-mon[292954]: pgmap v719: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:25 np0005538513.localdomain ceph-mon[292954]: pgmap v720: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "format": "json"}]: dispatch
Nov 28 10:15:26 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:27 np0005538513.localdomain ceph-mon[292954]: pgmap v721: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:27.415 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:29 np0005538513.localdomain ceph-mon[292954]: pgmap v722: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:15:29 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:30 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "format": "json"}]: dispatch
Nov 28 10:15:31 np0005538513.localdomain ceph-mon[292954]: pgmap v723: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:15:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:32.418 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:32 np0005538513.localdomain ceph-mon[292954]: pgmap v724: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s wr, 3 op/s
Nov 28 10:15:33 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "format": "json"}]: dispatch
Nov 28 10:15:33 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:35 np0005538513.localdomain ceph-mon[292954]: pgmap v725: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:15:35 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:15:35 np0005538513.localdomain systemd[1]: tmp-crun.LMuaek.mount: Deactivated successfully.
Nov 28 10:15:35 np0005538513.localdomain podman[332117]: 2025-11-28 10:15:35.530674031 +0000 UTC m=+0.127469246 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 10:15:35 np0005538513.localdomain podman[332117]: 2025-11-28 10:15:35.542509588 +0000 UTC m=+0.139304783 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 10:15:35 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:15:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:35.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:15:35 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:35.791 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:15:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:15:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:15:36 np0005538513.localdomain podman[332138]: 2025-11-28 10:15:36.850175734 +0000 UTC m=+0.086642563 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:15:36 np0005538513.localdomain podman[332138]: 2025-11-28 10:15:36.858239234 +0000 UTC m=+0.094706113 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:15:36 np0005538513.localdomain sudo[332156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:15:36 np0005538513.localdomain sudo[332156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:15:36 np0005538513.localdomain sudo[332156]: pam_unix(sudo:session): session closed for user root
Nov 28 10:15:36 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:15:36 np0005538513.localdomain systemd[1]: tmp-crun.xIyuWZ.mount: Deactivated successfully.
Nov 28 10:15:36 np0005538513.localdomain podman[332139]: 2025-11-28 10:15:36.925740103 +0000 UTC m=+0.157352571 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:15:36 np0005538513.localdomain sudo[332190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:15:36 np0005538513.localdomain sudo[332190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:15:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:36 np0005538513.localdomain podman[332139]: 2025-11-28 10:15:36.958476767 +0000 UTC m=+0.190089195 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:15:36 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:15:37 np0005538513.localdomain ceph-mon[292954]: pgmap v726: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:15:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "format": "json"}]: dispatch
Nov 28 10:15:37 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:37.422 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:37 np0005538513.localdomain sudo[332190]: pam_unix(sudo:session): session closed for user root
Nov 28 10:15:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:15:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:15:37 np0005538513.localdomain sudo[332246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:15:37 np0005538513.localdomain sudo[332246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:15:37 np0005538513.localdomain sudo[332246]: pam_unix(sudo:session): session closed for user root
Nov 28 10:15:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:15:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:15:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:15:38 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:15:39 np0005538513.localdomain ceph-mon[292954]: pgmap v727: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 107 KiB/s wr, 4 op/s
Nov 28 10:15:39 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "format": "json"}]: dispatch
Nov 28 10:15:39 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:15:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:15:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:15:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:15:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:15:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19276 "" "Go-http-client/1.1"
Nov 28 10:15:40 np0005538513.localdomain ceph-mon[292954]: pgmap v728: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 66 KiB/s wr, 2 op/s
Nov 28 10:15:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:15:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:15:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:15:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:42.423 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:42.427 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:43 np0005538513.localdomain ceph-mon[292954]: pgmap v729: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 90 KiB/s wr, 4 op/s
Nov 28 10:15:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "format": "json"}]: dispatch
Nov 28 10:15:43 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:15:44 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:15:44 np0005538513.localdomain systemd[1]: tmp-crun.W9VFKS.mount: Deactivated successfully.
Nov 28 10:15:44 np0005538513.localdomain podman[332264]: 2025-11-28 10:15:44.862219304 +0000 UTC m=+0.097571191 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 28 10:15:44 np0005538513.localdomain podman[332265]: 2025-11-28 10:15:44.906045981 +0000 UTC m=+0.138279021 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 28 10:15:44 np0005538513.localdomain podman[332264]: 2025-11-28 10:15:44.929489356 +0000 UTC m=+0.164841233 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:15:44 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:15:45 np0005538513.localdomain podman[332265]: 2025-11-28 10:15:45.016595273 +0000 UTC m=+0.248828353 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 28 10:15:45 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:15:45 np0005538513.localdomain ceph-mon[292954]: pgmap v730: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 70 KiB/s wr, 3 op/s
Nov 28 10:15:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:47 np0005538513.localdomain ceph-mon[292954]: pgmap v731: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 70 KiB/s wr, 3 op/s
Nov 28 10:15:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:47.426 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:15:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:15:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:15:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:15:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:15:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:15:49 np0005538513.localdomain ceph-mon[292954]: pgmap v732: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 95 KiB/s wr, 4 op/s
Nov 28 10:15:50 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:50.791 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:15:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:15:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:15:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:15:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:15:50.854 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:15:51 np0005538513.localdomain ceph-mon[292954]: pgmap v733: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 49 KiB/s wr, 3 op/s
Nov 28 10:15:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:51.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:51.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:15:51 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:52.429 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:52 np0005538513.localdomain ceph-mon[292954]: pgmap v734: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 56 KiB/s wr, 3 op/s
Nov 28 10:15:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:15:53 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:15:53 np0005538513.localdomain podman[332307]: 2025-11-28 10:15:53.850887727 +0000 UTC m=+0.081272307 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:15:53 np0005538513.localdomain podman[332307]: 2025-11-28 10:15:53.858601736 +0000 UTC m=+0.088986346 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:15:53 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:15:53 np0005538513.localdomain podman[332308]: 2025-11-28 10:15:53.904595759 +0000 UTC m=+0.132915344 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS)
Nov 28 10:15:53 np0005538513.localdomain podman[332308]: 2025-11-28 10:15:53.914657961 +0000 UTC m=+0.142977556 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:15:53 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:15:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "format": "json"}]: dispatch
Nov 28 10:15:54 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:55 np0005538513.localdomain ceph-mon[292954]: pgmap v735: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 32 KiB/s wr, 1 op/s
Nov 28 10:15:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:55.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:55.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:55.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:15:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:55.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:15:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:55.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:15:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:55.795 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:15:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:55.795 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:15:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:15:56 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2526196820' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.254 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.314 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.315 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.521 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.522 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=10992MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.523 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.523 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.892 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.953 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.954 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:15:56 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.971 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:15:56 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:56.992 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:15:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:57.026 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:15:57 np0005538513.localdomain ceph-mon[292954]: pgmap v736: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 32 KiB/s wr, 1 op/s
Nov 28 10:15:57 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f", "format": "json"}]: dispatch
Nov 28 10:15:57 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2526196820' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:57.432 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:15:57 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1329289336' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:57.485 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:15:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:57.492 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:15:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:57.510 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:15:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:57.512 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:15:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:57.513 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:15:58 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1329289336' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:58.510 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:58.511 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:58.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:58.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:15:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:58.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:15:59 np0005538513.localdomain ceph-mon[292954]: pgmap v737: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 50 KiB/s wr, 2 op/s
Nov 28 10:15:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:59.301 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:15:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:59.302 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:15:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:59.302 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:15:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:59.302 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:15:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:59.685 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:15:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:59.698 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:15:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:15:59.698 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:16:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1387439949' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.677 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.708 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.708 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc7d0c19-01b9-4524-8b87-517a2a802529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.678858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d04cab0-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'eca5a7a586c4edd19f76eb9c8913fabf4522b7b3edc775edfb6b87309eae0824'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.678858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d04dd52-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '331a25e551b887fdb90f2147290296d02eb3d7bfecbbc3efd1132329218da339'}]}, 'timestamp': '2025-11-28 10:16:00.709196', '_unique_id': 'd339e3c9c9e543b8ae6fb9c564c3d260'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.712 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.712 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fded22d5-182f-4dc6-b1d9-1f7b988d2828', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.711996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d055e3a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '637546bc4910fd054ea3c18438bd2dca00f9cfaef776125a7d767658d371d652'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.711996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d056e98-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'b558d7ce00df5bc49d73538847cb0bfd20bbfedc5a55d1454209a07355314691'}]}, 'timestamp': '2025-11-28 10:16:00.712869', '_unique_id': 'c4a7a21e2a7a43188dd024dd2460c450'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd81ef56a-01c8-4d7c-875c-10d2a32b95ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.715006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d075f28-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '0d029be738a6979fa441af14a373feddb4b74a524008c7606a3105f5b8a17e39'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.715006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d076f7c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '5cb98b17d841b7686c1cf4b977cd5926734db8309ee05103010245fdd79bbfcb'}]}, 'timestamp': '2025-11-28 10:16:00.725998', '_unique_id': 'b86cb7c01ad6412cbbce2c41775c6585'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.728 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.728 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.728 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab2d6e60-f0db-4017-9571-5689127754d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.728199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d07d5c0-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': 'a5b76cb247d66104c0d5f8d68219792f2ae8aaeb34e3ba72f11dab3d994db871'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.728199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d07e556-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '3fe3126f4ceda88b6a7072abe378c88131512799ddb00e3cf53e815c98491336'}]}, 'timestamp': '2025-11-28 10:16:00.729014', '_unique_id': '322dfced9e63495e816977d19dd3b1f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.730 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.731 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28631402-744a-4228-be43-4f5f9daa1534', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.731285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d084e56-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': 'b381f6b939e23b24a826a4806a1b456316535c4783de18b39891531801043373'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.731285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d085db0-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '4118cab07d29c2a4dd063cb6113d115122c89208379f5e8734ee578747a37b10'}]}, 'timestamp': '2025-11-28 10:16:00.732126', '_unique_id': 'e55258cff5f549529ba9c790638c62f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.734 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '098746a9-6c96-4743-b70f-50f62d64f6b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.734183', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d09463a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'c0f89fa1cac461ac72652cde0bd7d4476feca91ef6d41dcb0e4c0f0c7be64379'}]}, 'timestamp': '2025-11-28 10:16:00.738160', '_unique_id': '82c0a3cfa7ed425a9561e3edd1235f83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.740 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faeeb054-4a48-469d-920b-86b6ec48f3f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.740418', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d09b688-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '0ab9be3ee90259c65525c095e22f315d8803adaf0fd0274ca7e9adad883a4f74'}]}, 'timestamp': '2025-11-28 10:16:00.741051', '_unique_id': '640ce0e520b64d1c98008f3222b8ac05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.743 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cc957d3-394f-4cb3-b47e-c5e71fea0563', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.743467', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d0a2da2-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '76e6eb89f430c0081a884d814bde2d15b0460dc8b3fdb4af362afa3eeb05bb0c'}]}, 'timestamp': '2025-11-28 10:16:00.744011', '_unique_id': 'e3fdccba48514d1e81e9e39e1cab6721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df59347-ed21-4bbf-93fa-707c89fdc4aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:16:00.746261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '3d0d5504-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.935910592, 'message_signature': '16a8c63f13aab10aa6a373bbcdc2990c3c6ed8c61b002385cf11f12055d83cb0'}]}, 'timestamp': '2025-11-28 10:16:00.764698', '_unique_id': '3c6d426a501149a09ade2dd46355bd7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92d48e41-1f81-4084-9a72-f0a0a64c2430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.767143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d0dcb24-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '6132aa63d6cf8921f8b342e45a7236dc5d7d71ca52e957eac8ddc7277e492f4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.767143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d0ddea2-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '7f32c0ffe4930dd91c9c44d65ea275dadcefbc4e6d077258adb463a8b76b2a10'}]}, 'timestamp': '2025-11-28 10:16:00.768238', '_unique_id': 'ad5b4f07ac90417fae8078a8f08f8565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a8dad1c-be27-4d6e-838d-e2f07b5f5ba1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.770595', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d0e51a2-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '8bf6f78d1e7490a2ffc622f813b8ed9248beb28b989ed80d7657d2b01273e659'}]}, 'timestamp': '2025-11-28 10:16:00.771232', '_unique_id': '7454c1b1ed9743b198e0bd666fe02579'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.773 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '618379f5-2955-4139-9f6f-82e310344c48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.773625', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d0ec7ae-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'ae30a0b2e1df20793d913045a24944be2375f0f649d88cd912f50c1b87c732a5'}]}, 'timestamp': '2025-11-28 10:16:00.774242', '_unique_id': '288d44acce4142c5bbf64b936d86dd22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a52fc2ef-6b96-43fe-bf79-f96457505af7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.776625', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d0f3c8e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'de63d32005b88ccc38c10f8a8b10387590f135aafd29e40bc8c0e2c8215812b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.776625', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d0f519c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '9020f88c14f12332ce58186904a2b66dfed85c7c909573863eee0bff38394b81'}]}, 'timestamp': '2025-11-28 10:16:00.777681', '_unique_id': 'a16a7b29940d4f699a6312cc2bd250ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fbe4e89-56a2-458d-9d75-6443ca9350f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.779959', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d0fc0dc-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '7ab5108a01831e368abfc4a97f71ce0dbb902538550cbf87813e219c84a122d4'}]}, 'timestamp': '2025-11-28 10:16:00.780604', '_unique_id': 'd2e65598e93d488bab9870c5a5277478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c68a2e24-d218-47a4-a79c-0e07345ec148', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.782836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d10309e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '55d2e584586ef470696742971139677f66f42d3748501f052e5296cde288bea3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.782836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d10444e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'c475628eadabe3990fb634701ce2493be77a231c8ede16fd56c5806cb8762e7e'}]}, 'timestamp': '2025-11-28 10:16:00.783890', '_unique_id': '8bbed43fb368475e8f217b865f8ff7c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f45d0660-7364-4690-ba49-72a141289b7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.786145', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d10b1ea-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '447719a389abf3a3e3d3559343c61cad8b3bfe868138c6ec1ba9ca9010ed85b4'}]}, 'timestamp': '2025-11-28 10:16:00.786767', '_unique_id': '1ff1e02d31e346f49f9aa6ad49a7be37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 20070000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f622abe7-fc12-4452-8ad4-da81f3273ecc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20070000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:16:00.789083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '3d1123be-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.935910592, 'message_signature': 'd06a1e8d8c364151a0a69f6d08e5a300cd758a7f2a6ffe842425da42f66e5825'}]}, 'timestamp': '2025-11-28 10:16:00.789675', '_unique_id': 'b262a6bfdee84cb591bb279124909473'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.791 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f38c37ae-2bd1-411c-b1e7-8fd272225645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.792154', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d119b8c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '97f60b366e2d18daf121861101081c577c57e88447faa146c34f732a5647e6f3'}]}, 'timestamp': '2025-11-28 10:16:00.792699', '_unique_id': '7e0f4c28ef02420f8888f43cc008cda4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8d469f2-72a4-4930-87ef-5bd382c0da55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.795153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d120bbc-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '0d4db8536788fdeebdcd6b01972b6e3d31e7405dd71913cdda70fdf9b3afaa3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.795153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d12188c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '7bc8f61052e82b30fe86cdd2fdc4c3352eede3b45b69660e38549f70a22d6f9c'}]}, 'timestamp': '2025-11-28 10:16:00.795799', '_unique_id': '9b449fdc93744848b05080b0c76ffd6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '818d6f39-f1a7-46e7-b377-c6a5f5da23fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.797176', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d125afe-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'b1611f46a964182e20c01f18bcb3fad57630f3ecbdcdcda479671c37f04e211e'}]}, 'timestamp': '2025-11-28 10:16:00.797561', '_unique_id': '7cd36745769e4aa2bbf337f3558ae700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '621c8aca-6187-490d-b9d4-5936489d8a7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.798917', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d12a04a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'b08e26e254e0846c8d59e5d19cb585a249a1eb3a30b4124e4bd72c773befdaaa'}]}, 'timestamp': '2025-11-28 10:16:00.799320', '_unique_id': '71121c48d8c647f3b51e43b9dedb2cc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:16:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:16:01 np0005538513.localdomain ceph-mon[292954]: pgmap v738: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s wr, 1 op/s
Nov 28 10:16:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f_7b47d2ce-aead-4b15-bdda-8d040cf088ac", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:01 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3856985557' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:01 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:02.435 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:02 np0005538513.localdomain ceph-mon[292954]: pgmap v739: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 51 KiB/s wr, 2 op/s
Nov 28 10:16:04 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "format": "json"}]: dispatch
Nov 28 10:16:04 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:05 np0005538513.localdomain ceph-mon[292954]: pgmap v740: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s wr, 2 op/s
Nov 28 10:16:05 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2845142208' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:05 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:16:05 np0005538513.localdomain podman[332393]: 2025-11-28 10:16:05.854376625 +0000 UTC m=+0.092994309 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 10:16:05 np0005538513.localdomain podman[332393]: 2025-11-28 10:16:05.866544731 +0000 UTC m=+0.105162375 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, release=1755695350, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Nov 28 10:16:05 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:16:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 do_prune osdmap full prune enabled
Nov 28 10:16:06 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/160190658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 e291: 6 total, 6 up, 6 in
Nov 28 10:16:06 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e291: 6 total, 6 up, 6 in
Nov 28 10:16:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:06.694 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:06 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:06.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:06 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:07 np0005538513.localdomain ceph-mon[292954]: pgmap v741: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s wr, 2 op/s
Nov 28 10:16:07 np0005538513.localdomain ceph-mon[292954]: osdmap e291: 6 total, 6 up, 6 in
Nov 28 10:16:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:07.440 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:07.442 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:07.442 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:07.442 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:16:07.445 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:16:07 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:16:07.446 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:16:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:07.467 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:07.467 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:16:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:16:07 np0005538513.localdomain systemd[1]: tmp-crun.98xNFp.mount: Deactivated successfully.
Nov 28 10:16:07 np0005538513.localdomain podman[332413]: 2025-11-28 10:16:07.866428895 +0000 UTC m=+0.100917654 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:16:07 np0005538513.localdomain podman[332413]: 2025-11-28 10:16:07.879647555 +0000 UTC m=+0.114136304 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:16:07 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:16:07 np0005538513.localdomain podman[332414]: 2025-11-28 10:16:07.95738578 +0000 UTC m=+0.186470632 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:16:07 np0005538513.localdomain podman[332414]: 2025-11-28 10:16:07.99548463 +0000 UTC m=+0.224569462 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 28 10:16:08 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:16:09 np0005538513.localdomain ceph-mon[292954]: pgmap v743: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 63 KiB/s wr, 3 op/s
Nov 28 10:16:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:16:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:16:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:16:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:16:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:16:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19282 "" "Go-http-client/1.1"
Nov 28 10:16:11 np0005538513.localdomain ceph-mon[292954]: pgmap v744: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 63 KiB/s wr, 3 op/s
Nov 28 10:16:11 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:16:11.448 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:16:11 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:12.467 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:12.469 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:12 np0005538513.localdomain ceph-mon[292954]: pgmap v745: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:16:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3621361723' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:16:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3621361723' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:16:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "format": "json"}]: dispatch
Nov 28 10:16:14 np0005538513.localdomain ceph-mon[292954]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:16:14 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:14.633 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:15 np0005538513.localdomain ceph-mon[292954]: pgmap v746: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:16:15 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:16:15 np0005538513.localdomain systemd[1]: tmp-crun.KOH8q6.mount: Deactivated successfully.
Nov 28 10:16:15 np0005538513.localdomain podman[332455]: 2025-11-28 10:16:15.869499778 +0000 UTC m=+0.099159361 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 10:16:15 np0005538513.localdomain podman[332454]: 2025-11-28 10:16:15.962504077 +0000 UTC m=+0.194393158 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 10:16:15 np0005538513.localdomain podman[332455]: 2025-11-28 10:16:15.98814974 +0000 UTC m=+0.217809323 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 10:16:16 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:16:16 np0005538513.localdomain podman[332454]: 2025-11-28 10:16:16.005089574 +0000 UTC m=+0.236978655 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:16:16 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.156162) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976156204, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2245, "num_deletes": 251, "total_data_size": 2059071, "memory_usage": 2102240, "flush_reason": "Manual Compaction"}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976169322, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 1989878, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39024, "largest_seqno": 41268, "table_properties": {"data_size": 1980900, "index_size": 5423, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 21281, "raw_average_key_size": 21, "raw_value_size": 1961837, "raw_average_value_size": 1971, "num_data_blocks": 233, "num_entries": 995, "num_filter_entries": 995, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324812, "oldest_key_time": 1764324812, "file_creation_time": 1764324976, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 13216 microseconds, and 5762 cpu microseconds.
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.169374) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 1989878 bytes OK
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.169397) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.171688) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.171709) EVENT_LOG_v1 {"time_micros": 1764324976171703, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.171730) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2049537, prev total WAL file size 2049537, number of live WAL files 2.
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.172453) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(1943KB)], [69(18MB)]
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976172524, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 21596800, "oldest_snapshot_seqno": -1}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 14828 keys, 20052075 bytes, temperature: kUnknown
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976277069, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 20052075, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19964297, "index_size": 49597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37125, "raw_key_size": 394575, "raw_average_key_size": 26, "raw_value_size": 19710029, "raw_average_value_size": 1329, "num_data_blocks": 1867, "num_entries": 14828, "num_filter_entries": 14828, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324976, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.277389) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 20052075 bytes
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.279129) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 206.4 rd, 191.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 18.7 +0.0 blob) out(19.1 +0.0 blob), read-write-amplify(20.9) write-amplify(10.1) OK, records in: 15361, records dropped: 533 output_compression: NoCompression
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.279160) EVENT_LOG_v1 {"time_micros": 1764324976279146, "job": 42, "event": "compaction_finished", "compaction_time_micros": 104636, "compaction_time_cpu_micros": 55212, "output_level": 6, "num_output_files": 1, "total_output_size": 20052075, "num_input_records": 15361, "num_output_records": 14828, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976279572, "job": 42, "event": "table_file_deletion", "file_number": 71}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976282571, "job": 42, "event": "table_file_deletion", "file_number": 69}
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.172349) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282675) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 do_prune osdmap full prune enabled
Nov 28 10:16:16 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e292 e292: 6 total, 6 up, 6 in
Nov 28 10:16:17 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e292: 6 total, 6 up, 6 in
Nov 28 10:16:17 np0005538513.localdomain ceph-mon[292954]: pgmap v747: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:17 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76", "format": "json"}]: dispatch
Nov 28 10:16:17 np0005538513.localdomain ceph-mon[292954]: osdmap e292: 6 total, 6 up, 6 in
Nov 28 10:16:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:17.470 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:17.473 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:16:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:16:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:16:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:16:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:16:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:18.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:18 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:18.773 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:16:19 np0005538513.localdomain ceph-mon[292954]: pgmap v749: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s wr, 1 op/s
Nov 28 10:16:21 np0005538513.localdomain ceph-mon[292954]: pgmap v750: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s wr, 1 op/s
Nov 28 10:16:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76_f56b26cb-b1ec-4894-975d-4a0ccc289bb1", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:21 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:21 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:22.475 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:22.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:22.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:22.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:22.488 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:22.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:23 np0005538513.localdomain ceph-mon[292954]: pgmap v751: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:16:24 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:16:24 np0005538513.localdomain podman[332499]: 2025-11-28 10:16:24.853270588 +0000 UTC m=+0.084453216 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:16:24 np0005538513.localdomain podman[332499]: 2025-11-28 10:16:24.864442674 +0000 UTC m=+0.095625262 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:16:24 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:16:24 np0005538513.localdomain podman[332500]: 2025-11-28 10:16:24.920949532 +0000 UTC m=+0.146882578 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:16:24 np0005538513.localdomain podman[332500]: 2025-11-28 10:16:24.933514211 +0000 UTC m=+0.159447227 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 28 10:16:24 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:16:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "format": "json"}]: dispatch
Nov 28 10:16:25 np0005538513.localdomain ceph-mon[292954]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:25 np0005538513.localdomain ceph-mon[292954]: pgmap v752: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e292 do_prune osdmap full prune enabled
Nov 28 10:16:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 e293: 6 total, 6 up, 6 in
Nov 28 10:16:26 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e293: 6 total, 6 up, 6 in
Nov 28 10:16:26 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:27 np0005538513.localdomain ceph-mon[292954]: pgmap v753: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:27 np0005538513.localdomain ceph-mon[292954]: osdmap e293: 6 total, 6 up, 6 in
Nov 28 10:16:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:27.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:27.491 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:27.491 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:27.492 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:27.513 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:27.513 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:29 np0005538513.localdomain ceph-mon[292954]: pgmap v755: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 61 KiB/s wr, 3 op/s
Nov 28 10:16:31 np0005538513.localdomain ceph-mon[292954]: pgmap v756: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 61 KiB/s wr, 3 op/s
Nov 28 10:16:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:32.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:32.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:32.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:32.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:32.516 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:32.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:33 np0005538513.localdomain ceph-mon[292954]: pgmap v757: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:35 np0005538513.localdomain ceph-mon[292954]: pgmap v758: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:36 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:16:36 np0005538513.localdomain systemd[1]: tmp-crun.DVjIWv.mount: Deactivated successfully.
Nov 28 10:16:36 np0005538513.localdomain podman[332541]: 2025-11-28 10:16:36.861763329 +0000 UTC m=+0.098591662 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm)
Nov 28 10:16:36 np0005538513.localdomain podman[332541]: 2025-11-28 10:16:36.904699849 +0000 UTC m=+0.141528152 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 28 10:16:36 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:16:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 do_prune osdmap full prune enabled
Nov 28 10:16:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 e294: 6 total, 6 up, 6 in
Nov 28 10:16:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e294: 6 total, 6 up, 6 in
Nov 28 10:16:37 np0005538513.localdomain ceph-mon[292954]: pgmap v759: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:37 np0005538513.localdomain ceph-mon[292954]: osdmap e294: 6 total, 6 up, 6 in
Nov 28 10:16:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:37.518 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:37.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:37.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:37.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:37.544 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:37.544 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:38 np0005538513.localdomain sudo[332561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:16:38 np0005538513.localdomain sudo[332561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:16:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:16:38 np0005538513.localdomain sudo[332561]: pam_unix(sudo:session): session closed for user root
Nov 28 10:16:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:16:38 np0005538513.localdomain sudo[332584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:16:38 np0005538513.localdomain sudo[332584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:16:38 np0005538513.localdomain systemd[1]: tmp-crun.ux6kgW.mount: Deactivated successfully.
Nov 28 10:16:38 np0005538513.localdomain podman[332580]: 2025-11-28 10:16:38.191156229 +0000 UTC m=+0.077464749 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:16:38 np0005538513.localdomain systemd[1]: tmp-crun.LEI9gn.mount: Deactivated successfully.
Nov 28 10:16:38 np0005538513.localdomain podman[332578]: 2025-11-28 10:16:38.257825103 +0000 UTC m=+0.141290735 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:16:38 np0005538513.localdomain podman[332580]: 2025-11-28 10:16:38.276002645 +0000 UTC m=+0.162311135 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 10:16:38 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:16:38 np0005538513.localdomain podman[332578]: 2025-11-28 10:16:38.296399866 +0000 UTC m=+0.179865518 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:16:38 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:16:38 np0005538513.localdomain sudo[332584]: pam_unix(sudo:session): session closed for user root
Nov 28 10:16:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:16:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:16:39 np0005538513.localdomain sudo[332669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:16:39 np0005538513.localdomain sudo[332669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:16:39 np0005538513.localdomain sudo[332669]: pam_unix(sudo:session): session closed for user root
Nov 28 10:16:39 np0005538513.localdomain ceph-mon[292954]: pgmap v761: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 1 op/s
Nov 28 10:16:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:16:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:16:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:16:39 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:16:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:16:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:16:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:16:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:16:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:16:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19281 "" "Go-http-client/1.1"
Nov 28 10:16:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:16:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:16:41 np0005538513.localdomain ceph-mon[292954]: pgmap v762: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 1 op/s
Nov 28 10:16:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:16:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:42.545 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:42.560 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:42.560 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5015 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:42.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:42.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:42.562 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:43 np0005538513.localdomain ceph-mon[292954]: pgmap v763: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:45 np0005538513.localdomain ceph-mon[292954]: pgmap v764: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:16:46 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:16:46 np0005538513.localdomain podman[332688]: 2025-11-28 10:16:46.866259153 +0000 UTC m=+0.091342429 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:16:46 np0005538513.localdomain podman[332688]: 2025-11-28 10:16:46.94239914 +0000 UTC m=+0.167482426 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:16:46 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:16:47 np0005538513.localdomain podman[332687]: 2025-11-28 10:16:46.949724756 +0000 UTC m=+0.175716040 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:16:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:47 np0005538513.localdomain podman[332687]: 2025-11-28 10:16:47.030252369 +0000 UTC m=+0.256243613 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 10:16:47 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:16:47 np0005538513.localdomain ceph-mon[292954]: pgmap v765: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:47.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:47.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:16:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:16:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:16:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:16:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:16:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:16:49 np0005538513.localdomain ceph-mon[292954]: pgmap v766: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:16:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:16:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:16:50.854 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:16:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:16:50.855 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:16:51 np0005538513.localdomain ceph-mon[292954]: pgmap v767: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:51 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:51.790 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.565 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.595 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.597 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.598 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:52.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:53 np0005538513.localdomain ceph-mon[292954]: pgmap v768: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:53.589 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:53.612 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 10:16:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:53.613 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:16:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:53.613 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:16:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:53.645 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:16:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:53.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:53.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:16:55 np0005538513.localdomain ceph-mon[292954]: pgmap v769: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:16:55 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:16:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:55.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:55 np0005538513.localdomain systemd[1]: tmp-crun.5nCuzO.mount: Deactivated successfully.
Nov 28 10:16:55 np0005538513.localdomain podman[332730]: 2025-11-28 10:16:55.872819116 +0000 UTC m=+0.101872925 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:16:55 np0005538513.localdomain podman[332731]: 2025-11-28 10:16:55.93499874 +0000 UTC m=+0.163839713 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 28 10:16:55 np0005538513.localdomain podman[332731]: 2025-11-28 10:16:55.946602029 +0000 UTC m=+0.175443032 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:16:55 np0005538513.localdomain podman[332730]: 2025-11-28 10:16:55.960726726 +0000 UTC m=+0.189780555 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:16:55 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:16:55 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:16:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:57 np0005538513.localdomain ceph-mon[292954]: pgmap v770: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.600 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.603 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.631 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.632 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.803 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.804 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.804 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.805 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:16:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:57.806 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:16:58 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:16:58 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/775532288' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.296 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.362 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.363 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.573 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.574 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=10991MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.574 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.575 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.664 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.664 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.665 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:16:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:58.721 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:16:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:16:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1769094031' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:59.187 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:16:59 np0005538513.localdomain sshd[332817]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:16:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:59.196 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:16:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:59.226 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:16:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:59.230 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:16:59 np0005538513.localdomain ceph-mon[292954]: pgmap v771: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:59 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/775532288' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:59 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1769094031' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:16:59.231 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:16:59 np0005538513.localdomain sshd[332817]: Accepted publickey for zuul from 38.102.83.114 port 56522 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:16:59 np0005538513.localdomain systemd-logind[764]: New session 75 of user zuul.
Nov 28 10:16:59 np0005538513.localdomain systemd[1]: Started Session 75 of User zuul.
Nov 28 10:16:59 np0005538513.localdomain sshd[332817]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:16:59 np0005538513.localdomain sudo[332837]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luggrxvycmvcrunwdtrmtyvptyifmdmy ; /usr/bin/python3
Nov 28 10:16:59 np0005538513.localdomain sudo[332837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:16:59 np0005538513.localdomain python3[332839]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=fa163ef9-e89a-49a1-b30e-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:16:59 np0005538513.localdomain sudo[332837]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:00.228 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:00.229 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:00.229 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:17:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:00.229 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:17:01 np0005538513.localdomain ceph-mon[292954]: pgmap v772: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/847218026' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:01.347 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:17:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:01.348 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:17:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:01.348 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:17:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:01.348 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:17:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3631755571' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.587 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.604 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.604 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.605 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.633 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.635 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.636 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.636 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.637 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:02.638 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:03 np0005538513.localdomain ceph-mon[292954]: pgmap v773: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:05 np0005538513.localdomain ceph-mon[292954]: pgmap v774: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:05 np0005538513.localdomain sshd[332817]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:05 np0005538513.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Nov 28 10:17:05 np0005538513.localdomain systemd-logind[764]: Session 75 logged out. Waiting for processes to exit.
Nov 28 10:17:05 np0005538513.localdomain systemd-logind[764]: Removed session 75.
Nov 28 10:17:06 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/489346964' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:07 np0005538513.localdomain ceph-mon[292954]: pgmap v775: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:07 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3700869485' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:07.639 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:07.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:07.734 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5095 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:07.734 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:07.735 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:07.735 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:07 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:17:07 np0005538513.localdomain podman[332842]: 2025-11-28 10:17:07.844172999 +0000 UTC m=+0.083446303 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:17:07 np0005538513.localdomain podman[332842]: 2025-11-28 10:17:07.864117636 +0000 UTC m=+0.103390980 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 28 10:17:07 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:17:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:17:08 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:17:08 np0005538513.localdomain podman[332862]: 2025-11-28 10:17:08.851233571 +0000 UTC m=+0.085079265 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Nov 28 10:17:08 np0005538513.localdomain podman[332862]: 2025-11-28 10:17:08.882547381 +0000 UTC m=+0.116393055 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:17:08 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:17:08 np0005538513.localdomain systemd[1]: tmp-crun.RSk4qD.mount: Deactivated successfully.
Nov 28 10:17:08 np0005538513.localdomain podman[332861]: 2025-11-28 10:17:08.964719424 +0000 UTC m=+0.200971612 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:17:08 np0005538513.localdomain podman[332861]: 2025-11-28 10:17:08.972444733 +0000 UTC m=+0.208696951 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:17:08 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:17:09 np0005538513.localdomain ceph-mon[292954]: pgmap v776: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:17:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:17:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:17:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:17:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:17:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19285 "" "Go-http-client/1.1"
Nov 28 10:17:11 np0005538513.localdomain ceph-mon[292954]: pgmap v777: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:12.736 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:12.738 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:13 np0005538513.localdomain ceph-mon[292954]: pgmap v778: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/218519506' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:17:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/218519506' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:17:15 np0005538513.localdomain ceph-mon[292954]: pgmap v779: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:17 np0005538513.localdomain ceph-mon[292954]: pgmap v780: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:17.739 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:17.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:17.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:17.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:17.773 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:17.774 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:17.775 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:17:17 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:17:17 np0005538513.localdomain systemd[1]: Starting dnf makecache...
Nov 28 10:17:17 np0005538513.localdomain podman[332902]: 2025-11-28 10:17:17.889710491 +0000 UTC m=+0.094222548 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:17:17 np0005538513.localdomain podman[332902]: 2025-11-28 10:17:17.905264982 +0000 UTC m=+0.109777009 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 28 10:17:17 np0005538513.localdomain podman[332903]: 2025-11-28 10:17:17.940219524 +0000 UTC m=+0.144380480 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:17:17 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:17:17 np0005538513.localdomain podman[332903]: 2025-11-28 10:17:17.987546659 +0000 UTC m=+0.191707655 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:17:18 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:17:18 np0005538513.localdomain dnf[332904]: Updating Subscription Management repositories.
Nov 28 10:17:18 np0005538513.localdomain dnf[332904]: Unable to read consumer identity
Nov 28 10:17:18 np0005538513.localdomain dnf[332904]: This system is not registered with an entitlement server. You can use subscription-manager to register.
Nov 28 10:17:18 np0005538513.localdomain dnf[332904]: Metadata cache refreshed recently.
Nov 28 10:17:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:17:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:17:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:17:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:17:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:17:18 np0005538513.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 10:17:18 np0005538513.localdomain systemd[1]: Finished dnf makecache.
Nov 28 10:17:18 np0005538513.localdomain sshd[332945]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:18 np0005538513.localdomain sshd[332945]: Accepted publickey for zuul from 38.102.83.114 port 53186 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:18 np0005538513.localdomain systemd-logind[764]: New session 76 of user zuul.
Nov 28 10:17:18 np0005538513.localdomain systemd[1]: Started Session 76 of User zuul.
Nov 28 10:17:18 np0005538513.localdomain sshd[332945]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:18 np0005538513.localdomain sudo[332949]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Nov 28 10:17:18 np0005538513.localdomain sudo[332949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:19 np0005538513.localdomain ceph-mon[292954]: pgmap v781: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:19 np0005538513.localdomain sudo[332949]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:19 np0005538513.localdomain sshd[332948]: Received disconnect from 38.102.83.114 port 53186:11: disconnected by user
Nov 28 10:17:19 np0005538513.localdomain sshd[332948]: Disconnected from user zuul 38.102.83.114 port 53186
Nov 28 10:17:19 np0005538513.localdomain sshd[332945]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:19 np0005538513.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Nov 28 10:17:19 np0005538513.localdomain systemd-logind[764]: Session 76 logged out. Waiting for processes to exit.
Nov 28 10:17:19 np0005538513.localdomain systemd-logind[764]: Removed session 76.
Nov 28 10:17:19 np0005538513.localdomain sshd[332967]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:20 np0005538513.localdomain sshd[332967]: Accepted publickey for zuul from 38.102.83.114 port 53190 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:20 np0005538513.localdomain systemd-logind[764]: New session 77 of user zuul.
Nov 28 10:17:20 np0005538513.localdomain systemd[1]: Started Session 77 of User zuul.
Nov 28 10:17:20 np0005538513.localdomain sshd[332967]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:20 np0005538513.localdomain sudo[332971]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Nov 28 10:17:20 np0005538513.localdomain sudo[332971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:20 np0005538513.localdomain sudo[332971]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:20 np0005538513.localdomain sshd[332970]: Received disconnect from 38.102.83.114 port 53190:11: disconnected by user
Nov 28 10:17:20 np0005538513.localdomain sshd[332970]: Disconnected from user zuul 38.102.83.114 port 53190
Nov 28 10:17:20 np0005538513.localdomain sshd[332967]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:20 np0005538513.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Nov 28 10:17:20 np0005538513.localdomain systemd-logind[764]: Session 77 logged out. Waiting for processes to exit.
Nov 28 10:17:20 np0005538513.localdomain systemd-logind[764]: Removed session 77.
Nov 28 10:17:20 np0005538513.localdomain sshd[332989]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:20 np0005538513.localdomain sshd[332989]: Accepted publickey for zuul from 38.102.83.114 port 53206 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:20 np0005538513.localdomain systemd-logind[764]: New session 78 of user zuul.
Nov 28 10:17:20 np0005538513.localdomain systemd[1]: Started Session 78 of User zuul.
Nov 28 10:17:20 np0005538513.localdomain sshd[332989]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:20 np0005538513.localdomain sudo[332993]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Nov 28 10:17:20 np0005538513.localdomain sudo[332993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:20 np0005538513.localdomain sudo[332993]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:20 np0005538513.localdomain sshd[332992]: Received disconnect from 38.102.83.114 port 53206:11: disconnected by user
Nov 28 10:17:20 np0005538513.localdomain sshd[332992]: Disconnected from user zuul 38.102.83.114 port 53206
Nov 28 10:17:20 np0005538513.localdomain sshd[332989]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:20 np0005538513.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Nov 28 10:17:20 np0005538513.localdomain systemd-logind[764]: Session 78 logged out. Waiting for processes to exit.
Nov 28 10:17:20 np0005538513.localdomain systemd-logind[764]: Removed session 78.
Nov 28 10:17:21 np0005538513.localdomain sshd[333011]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:21 np0005538513.localdomain ceph-mon[292954]: pgmap v782: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:21 np0005538513.localdomain sshd[333011]: Accepted publickey for zuul from 38.102.83.114 port 53212 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:21 np0005538513.localdomain systemd-logind[764]: New session 79 of user zuul.
Nov 28 10:17:21 np0005538513.localdomain systemd[1]: Started Session 79 of User zuul.
Nov 28 10:17:21 np0005538513.localdomain sshd[333011]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:21 np0005538513.localdomain sudo[333015]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Nov 28 10:17:21 np0005538513.localdomain sudo[333015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:21 np0005538513.localdomain sudo[333015]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:21 np0005538513.localdomain sshd[333014]: Received disconnect from 38.102.83.114 port 53212:11: disconnected by user
Nov 28 10:17:21 np0005538513.localdomain sshd[333014]: Disconnected from user zuul 38.102.83.114 port 53212
Nov 28 10:17:21 np0005538513.localdomain sshd[333011]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:21 np0005538513.localdomain systemd-logind[764]: Session 79 logged out. Waiting for processes to exit.
Nov 28 10:17:21 np0005538513.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Nov 28 10:17:21 np0005538513.localdomain systemd-logind[764]: Removed session 79.
Nov 28 10:17:21 np0005538513.localdomain sshd[333033]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:21 np0005538513.localdomain sshd[333033]: Accepted publickey for zuul from 38.102.83.114 port 53226 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:21 np0005538513.localdomain systemd-logind[764]: New session 80 of user zuul.
Nov 28 10:17:21 np0005538513.localdomain systemd[1]: Started Session 80 of User zuul.
Nov 28 10:17:21 np0005538513.localdomain sshd[333033]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:21 np0005538513.localdomain sudo[333037]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Nov 28 10:17:21 np0005538513.localdomain sudo[333037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:22 np0005538513.localdomain sudo[333037]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:22 np0005538513.localdomain sshd[333036]: Received disconnect from 38.102.83.114 port 53226:11: disconnected by user
Nov 28 10:17:22 np0005538513.localdomain sshd[333036]: Disconnected from user zuul 38.102.83.114 port 53226
Nov 28 10:17:22 np0005538513.localdomain sshd[333033]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:22 np0005538513.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Nov 28 10:17:22 np0005538513.localdomain systemd-logind[764]: Session 80 logged out. Waiting for processes to exit.
Nov 28 10:17:22 np0005538513.localdomain systemd-logind[764]: Removed session 80.
Nov 28 10:17:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:22 np0005538513.localdomain sshd[333055]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:22 np0005538513.localdomain sshd[333055]: Accepted publickey for zuul from 38.102.83.114 port 53242 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:22 np0005538513.localdomain systemd-logind[764]: New session 81 of user zuul.
Nov 28 10:17:22 np0005538513.localdomain systemd[1]: Started Session 81 of User zuul.
Nov 28 10:17:22 np0005538513.localdomain sshd[333055]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:22 np0005538513.localdomain sudo[333059]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Nov 28 10:17:22 np0005538513.localdomain sudo[333059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:22 np0005538513.localdomain sudo[333059]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:22 np0005538513.localdomain sshd[333058]: Received disconnect from 38.102.83.114 port 53242:11: disconnected by user
Nov 28 10:17:22 np0005538513.localdomain sshd[333058]: Disconnected from user zuul 38.102.83.114 port 53242
Nov 28 10:17:22 np0005538513.localdomain sshd[333055]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:22 np0005538513.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Nov 28 10:17:22 np0005538513.localdomain systemd-logind[764]: Session 81 logged out. Waiting for processes to exit.
Nov 28 10:17:22 np0005538513.localdomain systemd-logind[764]: Removed session 81.
Nov 28 10:17:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:22.775 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:22.784 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:22 np0005538513.localdomain sshd[333077]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:22 np0005538513.localdomain sshd[333077]: Accepted publickey for zuul from 38.102.83.114 port 53250 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:22 np0005538513.localdomain systemd-logind[764]: New session 82 of user zuul.
Nov 28 10:17:23 np0005538513.localdomain systemd[1]: Started Session 82 of User zuul.
Nov 28 10:17:23 np0005538513.localdomain sshd[333077]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:23 np0005538513.localdomain ceph-mon[292954]: pgmap v783: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:23 np0005538513.localdomain sudo[333081]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Nov 28 10:17:23 np0005538513.localdomain sudo[333081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:23 np0005538513.localdomain sudo[333081]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:23 np0005538513.localdomain sshd[333080]: Received disconnect from 38.102.83.114 port 53250:11: disconnected by user
Nov 28 10:17:23 np0005538513.localdomain sshd[333080]: Disconnected from user zuul 38.102.83.114 port 53250
Nov 28 10:17:23 np0005538513.localdomain sshd[333077]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:23 np0005538513.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Nov 28 10:17:23 np0005538513.localdomain systemd-logind[764]: Session 82 logged out. Waiting for processes to exit.
Nov 28 10:17:23 np0005538513.localdomain systemd-logind[764]: Removed session 82.
Nov 28 10:17:23 np0005538513.localdomain sshd[333099]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:23 np0005538513.localdomain sshd[333099]: Accepted publickey for zuul from 38.102.83.114 port 53264 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:23 np0005538513.localdomain systemd-logind[764]: New session 83 of user zuul.
Nov 28 10:17:23 np0005538513.localdomain systemd[1]: Started Session 83 of User zuul.
Nov 28 10:17:23 np0005538513.localdomain sshd[333099]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:23 np0005538513.localdomain sudo[333103]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Nov 28 10:17:23 np0005538513.localdomain sudo[333103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:23 np0005538513.localdomain sudo[333103]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:23 np0005538513.localdomain sshd[333102]: Received disconnect from 38.102.83.114 port 53264:11: disconnected by user
Nov 28 10:17:23 np0005538513.localdomain sshd[333102]: Disconnected from user zuul 38.102.83.114 port 53264
Nov 28 10:17:23 np0005538513.localdomain sshd[333099]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:23 np0005538513.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Nov 28 10:17:23 np0005538513.localdomain systemd-logind[764]: Session 83 logged out. Waiting for processes to exit.
Nov 28 10:17:23 np0005538513.localdomain systemd-logind[764]: Removed session 83.
Nov 28 10:17:23 np0005538513.localdomain sshd[333121]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:24 np0005538513.localdomain sshd[333121]: Accepted publickey for zuul from 38.102.83.114 port 53276 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:24 np0005538513.localdomain systemd-logind[764]: New session 84 of user zuul.
Nov 28 10:17:24 np0005538513.localdomain systemd[1]: Started Session 84 of User zuul.
Nov 28 10:17:24 np0005538513.localdomain sshd[333121]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:24 np0005538513.localdomain sudo[333125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Nov 28 10:17:24 np0005538513.localdomain sudo[333125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:24 np0005538513.localdomain sudo[333125]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:24 np0005538513.localdomain sshd[333124]: Received disconnect from 38.102.83.114 port 53276:11: disconnected by user
Nov 28 10:17:24 np0005538513.localdomain sshd[333124]: Disconnected from user zuul 38.102.83.114 port 53276
Nov 28 10:17:24 np0005538513.localdomain sshd[333121]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:24 np0005538513.localdomain systemd[1]: session-84.scope: Deactivated successfully.
Nov 28 10:17:24 np0005538513.localdomain systemd-logind[764]: Session 84 logged out. Waiting for processes to exit.
Nov 28 10:17:24 np0005538513.localdomain systemd-logind[764]: Removed session 84.
Nov 28 10:17:25 np0005538513.localdomain ceph-mon[292954]: pgmap v784: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:17:26 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:17:26 np0005538513.localdomain podman[333144]: 2025-11-28 10:17:26.867343831 +0000 UTC m=+0.093371730 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:17:26 np0005538513.localdomain podman[333144]: 2025-11-28 10:17:26.880443777 +0000 UTC m=+0.106471716 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:17:26 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:17:26 np0005538513.localdomain podman[333145]: 2025-11-28 10:17:26.972333921 +0000 UTC m=+0.197739151 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=multipathd)
Nov 28 10:17:27 np0005538513.localdomain podman[333145]: 2025-11-28 10:17:27.008519152 +0000 UTC m=+0.233924382 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 10:17:27 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.054719) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047054751, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 981, "num_deletes": 257, "total_data_size": 792412, "memory_usage": 811864, "flush_reason": "Manual Compaction"}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047062556, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 774824, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41269, "largest_seqno": 42249, "table_properties": {"data_size": 770485, "index_size": 2002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10035, "raw_average_key_size": 19, "raw_value_size": 761389, "raw_average_value_size": 1487, "num_data_blocks": 89, "num_entries": 512, "num_filter_entries": 512, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324976, "oldest_key_time": 1764324976, "file_creation_time": 1764325047, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 7881 microseconds, and 3387 cpu microseconds.
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.062596) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 774824 bytes OK
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.062617) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.064687) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.064706) EVENT_LOG_v1 {"time_micros": 1764325047064700, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.064725) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 787724, prev total WAL file size 788048, number of live WAL files 2.
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.065383) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353238' seq:72057594037927935, type:22 .. '6C6F676D0034373739' seq:0, type:0; will stop at (end)
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(756KB)], [72(19MB)]
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047065468, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 20826899, "oldest_snapshot_seqno": -1}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 14806 keys, 20694519 bytes, temperature: kUnknown
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047172995, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 20694519, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20605393, "index_size": 51009, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37061, "raw_key_size": 395167, "raw_average_key_size": 26, "raw_value_size": 20349931, "raw_average_value_size": 1374, "num_data_blocks": 1924, "num_entries": 14806, "num_filter_entries": 14806, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764325047, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.173493) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 20694519 bytes
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.175392) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.5 rd, 192.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 19.1 +0.0 blob) out(19.7 +0.0 blob), read-write-amplify(53.6) write-amplify(26.7) OK, records in: 15340, records dropped: 534 output_compression: NoCompression
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.175423) EVENT_LOG_v1 {"time_micros": 1764325047175410, "job": 44, "event": "compaction_finished", "compaction_time_micros": 107653, "compaction_time_cpu_micros": 54410, "output_level": 6, "num_output_files": 1, "total_output_size": 20694519, "num_input_records": 15340, "num_output_records": 14806, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047175766, "job": 44, "event": "table_file_deletion", "file_number": 74}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047178899, "job": 44, "event": "table_file_deletion", "file_number": 72}
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.065254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.178965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.178973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.178976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.178978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:17:27.178981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538513.localdomain ceph-mon[292954]: pgmap v785: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:27.787 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:27.788 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:27.789 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:27.789 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:27.813 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:27.814 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:29 np0005538513.localdomain ceph-mon[292954]: pgmap v786: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:31 np0005538513.localdomain ceph-mon[292954]: pgmap v787: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:32 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:32.815 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:33 np0005538513.localdomain ceph-mon[292954]: pgmap v788: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:35 np0005538513.localdomain ceph-mon[292954]: pgmap v789: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:37 np0005538513.localdomain ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e53: np0005538515.yfkzhl(active, since 19m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:17:37 np0005538513.localdomain ceph-mon[292954]: pgmap v790: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:37.818 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:37.820 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:37.820 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:37.820 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:37.852 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:37 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:37.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:38 np0005538513.localdomain ceph-mon[292954]: mgrmap e53: np0005538515.yfkzhl(active, since 19m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:17:38 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:17:38 np0005538513.localdomain podman[333184]: 2025-11-28 10:17:38.857125553 +0000 UTC m=+0.093413832 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:17:38 np0005538513.localdomain podman[333184]: 2025-11-28 10:17:38.873727327 +0000 UTC m=+0.110015606 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Nov 28 10:17:38 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:17:39 np0005538513.localdomain ceph-mon[292954]: pgmap v791: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:39 np0005538513.localdomain sudo[333205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:17:39 np0005538513.localdomain sudo[333205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:17:39 np0005538513.localdomain sudo[333205]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:17:39 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:17:39 np0005538513.localdomain sudo[333230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:17:39 np0005538513.localdomain sudo[333230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:17:39 np0005538513.localdomain podman[333224]: 2025-11-28 10:17:39.433538095 +0000 UTC m=+0.093941588 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true)
Nov 28 10:17:39 np0005538513.localdomain podman[333224]: 2025-11-28 10:17:39.441725288 +0000 UTC m=+0.102128781 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:17:39 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:17:39 np0005538513.localdomain podman[333223]: 2025-11-28 10:17:39.532656044 +0000 UTC m=+0.196459783 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:17:39 np0005538513.localdomain podman[333223]: 2025-11-28 10:17:39.569528055 +0000 UTC m=+0.233331774 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:17:39 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:17:40 np0005538513.localdomain sudo[333230]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:17:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:17:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:17:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:17:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:17:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19281 "" "Go-http-client/1.1"
Nov 28 10:17:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:17:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:17:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:17:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:17:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:17:40 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:17:40 np0005538513.localdomain sudo[333313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:17:40 np0005538513.localdomain sudo[333313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:17:40 np0005538513.localdomain sudo[333313]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:17:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:17:41 np0005538513.localdomain ceph-mon[292954]: pgmap v792: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:41 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:17:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:42 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:42.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:43 np0005538513.localdomain ceph-mon[292954]: pgmap v793: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:43 np0005538513.localdomain sshd[333331]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:44 np0005538513.localdomain sshd[333331]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 10:17:44 np0005538513.localdomain sshd[333331]: Connection closed by 80.94.92.182 port 43182
Nov 28 10:17:45 np0005538513.localdomain ceph-mon[292954]: pgmap v794: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:47 np0005538513.localdomain ceph-mon[292954]: pgmap v795: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:47.858 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:47.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:47.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:47.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:47.882 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:47 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:47.883 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:17:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:17:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:17:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:17:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:17:48 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:17:48 np0005538513.localdomain systemd[1]: tmp-crun.xSzk2n.mount: Deactivated successfully.
Nov 28 10:17:48 np0005538513.localdomain podman[333332]: 2025-11-28 10:17:48.877114955 +0000 UTC m=+0.104488845 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:17:48 np0005538513.localdomain podman[333332]: 2025-11-28 10:17:48.891456548 +0000 UTC m=+0.118830448 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true)
Nov 28 10:17:48 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:17:48 np0005538513.localdomain podman[333333]: 2025-11-28 10:17:48.970147234 +0000 UTC m=+0.197562996 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:17:49 np0005538513.localdomain podman[333333]: 2025-11-28 10:17:49.075585858 +0000 UTC m=+0.303001660 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:17:49 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:17:49 np0005538513.localdomain ceph-mon[292954]: pgmap v796: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:17:50.854 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:17:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:17:50.855 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:17:50 np0005538513.localdomain ovn_metadata_agent[158125]: 2025-11-28 10:17:50.855 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:17:51 np0005538513.localdomain ceph-mon[292954]: pgmap v797: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 28 10:17:52 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:52.883 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:52.886 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:52.887 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:52.887 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:52.888 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:52 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:52.890 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:53 np0005538513.localdomain ceph-mon[292954]: pgmap v798: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 28 10:17:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:53.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:53.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:53 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:53.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:17:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:54.773 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:54 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:54.774 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:55 np0005538513.localdomain ceph-mon[292954]: pgmap v799: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:55 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:55.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:57 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:17:57 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:17:57 np0005538513.localdomain podman[333375]: 2025-11-28 10:17:57.19723045 +0000 UTC m=+0.092059120 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible)
Nov 28 10:17:57 np0005538513.localdomain ceph-mon[292954]: pgmap v800: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:57 np0005538513.localdomain podman[333375]: 2025-11-28 10:17:57.217557199 +0000 UTC m=+0.112385879 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:17:57 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:17:57 np0005538513.localdomain podman[333374]: 2025-11-28 10:17:57.291925052 +0000 UTC m=+0.190632332 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:17:57 np0005538513.localdomain podman[333374]: 2025-11-28 10:17:57.330608199 +0000 UTC m=+0.229315479 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:17:57 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:17:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:57.893 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:57.895 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:17:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:57.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:17:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:57.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:57.921 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:57 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:57.922 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:17:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:58.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:58.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:17:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:58.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:17:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:58.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:17:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:58.793 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:17:58 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:58.794 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:17:59 np0005538513.localdomain ceph-mon[292954]: pgmap v801: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:59 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:17:59 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4023789649' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.250 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.313 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.313 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.532 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.533 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=10993MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.534 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.535 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.615 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.615 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.616 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:17:59 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:17:59.660 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:18:00 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:18:00 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2940993655' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:00.112 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:18:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:00.119 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:18:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:00.135 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:18:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:00.137 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:18:00 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:00.138 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:18:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4023789649' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:00 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2940993655' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.678 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.679 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.697 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4704df75-ee2f-4a76-9f8b-154643de4bd8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:18:00.679297', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8489b17a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.868943201, 'message_signature': 'e7984c9761dee7d8098bdc76383974c3ddb4f1b1141e4f9f6768b4b149e41b97'}]}, 'timestamp': '2025-11-28 10:18:00.698073', '_unique_id': '59120e98bffa4ac9a85be39fdee278e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.699 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.700 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.701 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 20650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75efc927-2f7d-4586-b47e-70dc5fe3e9fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20650000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:18:00.701155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '848a4194-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.868943201, 'message_signature': '20ec0fd0a94b453d716eeeed3a01a2bb7af61b767791cfd1105b573a30760c20'}]}, 'timestamp': '2025-11-28 10:18:00.701599', '_unique_id': 'f5bb70a937f14f3a995cea0625f93b12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.702 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.706 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cfa7d09-82ca-4383-af3e-7bf34704dcee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.703708', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '848b1f6a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': 'aeef076fa6a7aa5fa66a8b14cbf0f649dd5bbfdafb4c356c5d3e950364cb4c0a'}]}, 'timestamp': '2025-11-28 10:18:00.707305', '_unique_id': 'a8e8d0c2e63c402fb9b2675a3a62cdc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.708 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.709 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.709 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0eac266-bf66-4211-8b53-9dcc0c83d0e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.709690', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '848b8fa4-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': 'a40732c7c1c2d11d2210d85e2675237f88e37a9ce78b7868b742ff194b54bc01'}]}, 'timestamp': '2025-11-28 10:18:00.710215', '_unique_id': '74629974538b493caac303b7473a4d60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.711 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.739 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.739 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6321a37-875c-4346-a65c-b47c85f86e53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.712528', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '84900f5c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': 'b370210c79e16a896e6bb6795e8c76cd20a2e1a221d6041a6bb55145f8c6aa42'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.712528', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '84901f92-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': 'cebaa640475f8f93eaaa51510959731ae58dd784cf29900dbfcd0b81c25f341f'}]}, 'timestamp': '2025-11-28 10:18:00.740072', '_unique_id': '93aec64dd68c4d12b2e587ee215cd2ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.740 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5ef7869-ec00-4fec-af7e-0b3d3daa7c9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.742218', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '84908572-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': '51b0bf07427c0b9542eb0b2ffef76886f1174c6ebc339ebdb043f5477b009c86'}]}, 'timestamp': '2025-11-28 10:18:00.742668', '_unique_id': '663992f2ff6c4a028097026131d82945'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.743 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.744 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.744 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.745 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '354faba5-4555-4a9e-a68d-d3a06ccac211', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.744731', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8490e760-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': 'ace0ba187afb1125f20bccdd0a2813fd7a647a3fe9a5aa74e00fd6c4b757be82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.744731', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8490f89a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': 'a78875a8b78b41db45c8dc11871a3cb97f39e5d6292bd0c1862b70f32d8f58ad'}]}, 'timestamp': '2025-11-28 10:18:00.745582', '_unique_id': 'dd4ab5df1d0442dfb2959c1a6c3233b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.746 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.748 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '837e4f49-caeb-4895-9739-a34070d89610', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.747676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '84915a56-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': '360362c452d03e20326169840fdcc57aa3463cc24c6e39d7afbc47d19dddc715'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.747676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '84916b36-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': '7acded85b0d81eb9095d53ab35760400dcdc107d5fcb5626ce1b713eb137bdb6'}]}, 'timestamp': '2025-11-28 10:18:00.748520', '_unique_id': '1c69c82d513844179dec8ff3d6263667'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.749 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64e36086-e157-42c4-9bca-4bc87e4c1007', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.750698', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8491d184-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': '28399d8e62120b12878b742e87fa642e48dcfe601da16fea46c90326f47ee59a'}]}, 'timestamp': '2025-11-28 10:18:00.751199', '_unique_id': 'edb86a7e2301456b85755fbf790fab40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.752 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.753 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a78ff20-31de-405a-8c35-ebf8802d5aca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.753215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8493b03a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.92510716, 'message_signature': 'bc23be87d67b045ac4498aa18ec8bcf39ea1e3410d03c305565b7c8556c35895'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.753215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8493c066-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.92510716, 'message_signature': '4b0dc37afa78df3bf28ede7a2cda706bfd64a75872a79c257044595d4a03eba1'}]}, 'timestamp': '2025-11-28 10:18:00.763807', '_unique_id': '6afa6f2d69694eb891baabc88499e75c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.764 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8702fafd-2d96-49c8-b3f1-cccc2e37e276', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.765916', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8494247a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': '775a8ab1bd04b6a80c183c5d1e7436aaed2df291766ef71b0bc67c3006817cfa'}]}, 'timestamp': '2025-11-28 10:18:00.766398', '_unique_id': '77232be9646d4b749f4709bb8b5b182c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.767 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.768 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '602b0f8c-061e-4ec0-bf4b-ca0adafc4c3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.768579', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '84948b54-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': '7affb2f435500609eb7054c3f68b4512d0f339aa54b020453bd4b45423a7f099'}]}, 'timestamp': '2025-11-28 10:18:00.769136', '_unique_id': '3748c6ec0a5c4e2e9b12dc833b17fdbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.769 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9d2496d-2c29-4b0d-86b9-fe3bfe275659', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.771207', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8494f184-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': '9040fd0442306379d8c163f6bc293b3b8a9c385507d87cbe5e63b055a5c98009'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.771207', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8495012e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': '99b39c60cfe80ac0e27d8044804983e15fc9a97aa9caa73233e1ebe2803225a2'}]}, 'timestamp': '2025-11-28 10:18:00.772042', '_unique_id': '6e35337fcc7045639858723b9f9e3fcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.772 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '693ed4e2-2644-454e-9ec5-543309bb99dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.774152', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8495652e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': '1be7d0370412f7d14093f57b1a02167fda20b7f99c452554887444e51de277a0'}]}, 'timestamp': '2025-11-28 10:18:00.774609', '_unique_id': 'b44f0e1f215c4c64981c5a613f14f329'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.775 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85acb547-4bd4-48aa-a84f-b680f7f51a64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.776765', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8495cb2c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': 'cf91a2cc2347b059f8bad441721cff1f85573ee31d13eabd325a85f7a89cbcf8'}]}, 'timestamp': '2025-11-28 10:18:00.777306', '_unique_id': '1f8594bf7a5240e5b260138f58f386bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.778 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74f45647-9c6f-4b21-a1f4-cca8bcf9c7e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.779558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '84963850-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.92510716, 'message_signature': '1e536398f3bb6a271c7ca42aeba82a0bd09d12b692da29fcff386710b729d3a7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.779558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '84964b1a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.92510716, 'message_signature': '1e7c45d4ecce68a8cf9b4d1432d608f170f62b139c6ec0e6c378f7cacb9908e6'}]}, 'timestamp': '2025-11-28 10:18:00.780471', '_unique_id': '2886fd78ce6647c4a649e26ba5604fd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.781 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dde1ff32-ec10-4f04-8d6b-4539558b4116', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.782579', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8496afce-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': '220e07c0a7c8b7b80a4cbf8f184813481b854cc6cae19ec34251f078b37c7242'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.782579', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8496c0fe-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': '58630911e8b8e168785e982cc784eeefbfcde0a35b4a107f27a87a3b93ef02ca'}]}, 'timestamp': '2025-11-28 10:18:00.783483', '_unique_id': '33da6ff79be147e3ae048fe7ff1f8966'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cddbf945-000f-43a5-b377-0c4ebd30a226', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.784867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '849703fc-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': 'd7a6bc2cdea561d4888caeeb9377b7323dd7687951c8de40e3b184050e664d78'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.784867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '84970ed8-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.8844205, 'message_signature': '7f353591aa1408651b1edda3ef077b3201a1f480f06f709d45f543bcc2130659'}]}, 'timestamp': '2025-11-28 10:18:00.785399', '_unique_id': '8b7680b0b7a44030b125e6b85dedb85e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.785 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.786 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14ba0278-edf0-4dd4-abce-0546b3c370fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:18:00.786992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '84975816-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.92510716, 'message_signature': '97bee8dd0b36287a32f85131f10cb7bf4c2d410da922d712a6502f3b1aa05bf0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:18:00.786992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8497623e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.92510716, 'message_signature': '417bc66d5c9368e2c718b96a33e0ea22bcaf379c04998cebc1d98a97160e4fa0'}]}, 'timestamp': '2025-11-28 10:18:00.787537', '_unique_id': '0474fae8c7c84069b5b447aa8a6b2b91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95f1175b-7051-40c2-894a-dcd5c114d7dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.788823', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '84979ede-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': 'ba2741f4acc318626866880e113af4c64932a34936e81c28320affc973f4cd09'}]}, 'timestamp': '2025-11-28 10:18:00.789133', '_unique_id': '25eda5f2f81c420fb6e117b662050927'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.789 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b12c0ca4-69c0-44dd-87f9-35ac38a21378', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:18:00.790431', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8497dd7c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12914.875600117, 'message_signature': 'ea64d1741e35f4bada28262367221980f460185ff75be070db3bdcd4131bc499'}]}, 'timestamp': '2025-11-28 10:18:00.790709', '_unique_id': 'f7e5d8b56caa49689da922edc6c32196'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:18:00 np0005538513.localdomain ceilometer_agent_compute[236072]: 2025-11-28 10:18:00.791 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.139 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.140 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.140 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.141 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:18:01 np0005538513.localdomain ceph-mon[292954]: pgmap v802: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:01 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3704514543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.365 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.366 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.367 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.367 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.694 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.720 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.720 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:18:01 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:01.722 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:18:02 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:02 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/4099343195' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:02.923 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:02.924 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:02.924 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:18:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:02.924 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:02.926 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:02 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:02.928 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:03 np0005538513.localdomain ceph-mon[292954]: pgmap v803: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:05 np0005538513.localdomain ceph-mon[292954]: pgmap v804: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:06 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1023311526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:07 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:07 np0005538513.localdomain ceph-mon[292954]: pgmap v805: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:07 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1297328519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:07.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:07.932 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:07.932 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:18:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:07.932 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:07.957 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:07 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:07.958 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:09 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:09.349 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:18:09 np0005538513.localdomain ceph-mon[292954]: pgmap v806: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:18:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:18:09 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:18:09 np0005538513.localdomain podman[333462]: 2025-11-28 10:18:09.862800391 +0000 UTC m=+0.094331581 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:18:09 np0005538513.localdomain podman[333463]: 2025-11-28 10:18:09.909118625 +0000 UTC m=+0.136175097 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 28 10:18:09 np0005538513.localdomain podman[333462]: 2025-11-28 10:18:09.927268576 +0000 UTC m=+0.158799766 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:18:09 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:18:09 np0005538513.localdomain podman[333463]: 2025-11-28 10:18:09.978445271 +0000 UTC m=+0.205501783 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 10:18:09 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:18:10 np0005538513.localdomain podman[333464]: 2025-11-28 10:18:10.065763894 +0000 UTC m=+0.289986318 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:18:10 np0005538513.localdomain podman[333464]: 2025-11-28 10:18:10.079469428 +0000 UTC m=+0.303691862 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:18:10 np0005538513.localdomain podman[238687]: time="2025-11-28T10:18:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:18:10 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:18:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:18:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:18:10 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:18:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19285 "" "Go-http-client/1.1"
Nov 28 10:18:11 np0005538513.localdomain ceph-mon[292954]: pgmap v807: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.100411) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092100452, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 726, "num_deletes": 250, "total_data_size": 767625, "memory_usage": 780272, "flush_reason": "Manual Compaction"}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092109580, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 621617, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42250, "largest_seqno": 42975, "table_properties": {"data_size": 618670, "index_size": 866, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8404, "raw_average_key_size": 20, "raw_value_size": 612179, "raw_average_value_size": 1507, "num_data_blocks": 40, "num_entries": 406, "num_filter_entries": 406, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764325047, "oldest_key_time": 1764325047, "file_creation_time": 1764325092, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 9228 microseconds, and 2985 cpu microseconds.
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.109634) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 621617 bytes OK
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.109660) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.111856) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.111870) EVENT_LOG_v1 {"time_micros": 1764325092111865, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.111886) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 763913, prev total WAL file size 764237, number of live WAL files 2.
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.112435) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323631' seq:72057594037927935, type:22 .. '6D6772737461740034353132' seq:0, type:0; will stop at (end)
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(607KB)], [75(19MB)]
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092112490, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 21316136, "oldest_snapshot_seqno": -1}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 14708 keys, 19334686 bytes, temperature: kUnknown
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092217928, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 19334686, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19250496, "index_size": 46329, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36805, "raw_key_size": 393339, "raw_average_key_size": 26, "raw_value_size": 19000850, "raw_average_value_size": 1291, "num_data_blocks": 1729, "num_entries": 14708, "num_filter_entries": 14708, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764325092, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.218357) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 19334686 bytes
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.220404) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 201.8 rd, 183.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 19.7 +0.0 blob) out(18.4 +0.0 blob), read-write-amplify(65.4) write-amplify(31.1) OK, records in: 15212, records dropped: 504 output_compression: NoCompression
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.220435) EVENT_LOG_v1 {"time_micros": 1764325092220421, "job": 46, "event": "compaction_finished", "compaction_time_micros": 105617, "compaction_time_cpu_micros": 53107, "output_level": 6, "num_output_files": 1, "total_output_size": 19334686, "num_input_records": 15212, "num_output_records": 14708, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092220698, "job": 46, "event": "table_file_deletion", "file_number": 77}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092223793, "job": 46, "event": "table_file_deletion", "file_number": 75}
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.112306) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.223895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.223903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.223906) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.223909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:12.223912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:12.959 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:12.960 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:12.960 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:18:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:12.960 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:12.961 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:12 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:12.964 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:13 np0005538513.localdomain ceph-mon[292954]: pgmap v808: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3885796551' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:18:14 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.32:0/3885796551' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:18:15 np0005538513.localdomain ceph-mon[292954]: pgmap v809: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:17 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:17 np0005538513.localdomain ceph-mon[292954]: pgmap v810: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:17.964 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:17.966 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:17.967 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:18:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:17.967 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:17.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:17 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:17.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:18:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:18:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:18:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:18:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:18:18 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:18:18 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:18:19 np0005538513.localdomain ceph-mon[292954]: pgmap v811: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 10:18:19 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 10:18:19 np0005538513.localdomain systemd[1]: tmp-crun.YB5NM9.mount: Deactivated successfully.
Nov 28 10:18:19 np0005538513.localdomain podman[333522]: 2025-11-28 10:18:19.859166963 +0000 UTC m=+0.094513057 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:18:19 np0005538513.localdomain podman[333523]: 2025-11-28 10:18:19.903158964 +0000 UTC m=+0.134328369 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:18:19 np0005538513.localdomain podman[333522]: 2025-11-28 10:18:19.925817385 +0000 UTC m=+0.161163499 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:18:19 np0005538513.localdomain systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 10:18:20 np0005538513.localdomain podman[333523]: 2025-11-28 10:18:20.004558963 +0000 UTC m=+0.235728348 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 10:18:20 np0005538513.localdomain systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 10:18:20 np0005538513.localdomain sshd[333565]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:18:20 np0005538513.localdomain sshd[333565]: Accepted publickey for zuul from 192.168.122.10 port 51318 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:18:20 np0005538513.localdomain systemd-logind[764]: New session 85 of user zuul.
Nov 28 10:18:20 np0005538513.localdomain systemd[1]: Started Session 85 of User zuul.
Nov 28 10:18:20 np0005538513.localdomain sshd[333565]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:18:20 np0005538513.localdomain sudo[333569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Nov 28 10:18:20 np0005538513.localdomain sudo[333569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:18:21 np0005538513.localdomain ceph-mon[292954]: pgmap v812: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:22 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:22.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:22 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:22.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:23 np0005538513.localdomain ceph-mon[292954]: pgmap v813: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "status"} v 0)
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1185886869' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: from='client.49284 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: from='client.69335 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: from='client.59317 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: from='client.49290 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: from='client.69341 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1185886869' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:25 np0005538513.localdomain ceph-mon[292954]: from='client.59332 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:25 np0005538513.localdomain ceph-mon[292954]: pgmap v814: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:25 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2941785349' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:25 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1170736032' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:26 np0005538513.localdomain ovs-vsctl[333819]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 28 10:18:27 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:27 np0005538513.localdomain ceph-mon[292954]: pgmap v815: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:27 np0005538513.localdomain virtqemud[201490]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 28 10:18:27 np0005538513.localdomain virtqemud[201490]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 28 10:18:27 np0005538513.localdomain virtqemud[201490]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 10:18:27 np0005538513.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 333974 (lsinitrd)
Nov 28 10:18:27 np0005538513.localdomain systemd[1]: Mounting EFI System Partition Automount...
Nov 28 10:18:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 10:18:27 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 10:18:27 np0005538513.localdomain systemd[1]: Mounted EFI System Partition Automount.
Nov 28 10:18:27 np0005538513.localdomain podman[333978]: 2025-11-28 10:18:27.809178734 +0000 UTC m=+0.083518827 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:18:27 np0005538513.localdomain podman[333979]: 2025-11-28 10:18:27.878033454 +0000 UTC m=+0.151292374 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:18:27 np0005538513.localdomain podman[333978]: 2025-11-28 10:18:27.89370318 +0000 UTC m=+0.168043283 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:18:27 np0005538513.localdomain systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 10:18:27 np0005538513.localdomain podman[333979]: 2025-11-28 10:18:27.91633697 +0000 UTC m=+0.189595840 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:18:27 np0005538513.localdomain systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 10:18:27 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: cache status {prefix=cache status} (starting...)
Nov 28 10:18:27 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:27.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:27.998 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:27.998 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 28 10:18:27 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:27.998 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:28.033 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:28 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:28.034 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: client ls {prefix=client ls} (starting...)
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:28 np0005538513.localdomain lvm[334109]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 10:18:28 np0005538513.localdomain lvm[334109]: VG ceph_vg1 finished
Nov 28 10:18:28 np0005538513.localdomain lvm[334112]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 10:18:28 np0005538513.localdomain lvm[334112]: VG ceph_vg0 finished
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: damage ls {prefix=damage ls} (starting...)
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: dump loads {prefix=dump loads} (starting...)
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 28 10:18:28 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2707254784' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: pgmap v816: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: from='client.59344 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: from='client.69356 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: from='client.59353 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: from='client.49311 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2707254784' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/493760634' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4057740893' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1807372338' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config log"} v 0)
Nov 28 10:18:29 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1177517886' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: ops {prefix=ops} (starting...)
Nov 28 10:18:29 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.69362 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.59362 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.49329 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/6787673' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4057740893' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.69389 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/4066833274' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.59392 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1807372338' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1177517886' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/461455018' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1994531849' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1543263582' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3391289643' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config-key dump"} v 0)
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/207961467' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:30 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: session ls {prefix=session ls} (starting...)
Nov 28 10:18:30 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc Can't run that command on an inactive MDS!
Nov 28 10:18:30 np0005538513.localdomain ceph-mds[282744]: mds.mds.np0005538513.yljthc asok_command: status {prefix=status} (starting...)
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 28 10:18:30 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4251160342' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/998697886' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: pgmap v817: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/783008498' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3391289643' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/207961467' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/357711483' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/4264039640' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1304102523' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.69452 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4251160342' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2610313962' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.49377 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2434183970' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2654470196' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/998697886' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3437019487' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1834711607' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3436769846' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4077617134' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr stat"} v 0)
Nov 28 10:18:31 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2492373822' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4060251766' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.59455 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.49389 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.59473 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1906133235' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3436769846' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4077617134' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.59479 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2752663634' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3827185081' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3594311866' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2492373822' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4060251766' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2621700254' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2252583354' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2663181736' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1900692722' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Nov 28 10:18:32 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/931850460' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:33.035 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: pgmap v818: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.69524 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1900692722' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.49425 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2369810755' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/984640696' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/4097305108' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2044064874' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.49437 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/931850460' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1681934238' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2747186648' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Nov 28 10:18:33 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1517482184' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:18.610006+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 heartbeat osd_stat(store_statfs(0x1b8952000/0x0/0x1bfc00000, data 0x30b72b6/0x313b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1050102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:19.610220+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:20.610459+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 heartbeat osd_stat(store_statfs(0x1b8952000/0x0/0x1bfc00000, data 0x30b72b6/0x313b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:21.610741+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:22.610911+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:23.611248+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:24.611408+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1050102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:25.611554+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:26.611743+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 heartbeat osd_stat(store_statfs(0x1b8952000/0x0/0x1bfc00000, data 0x30b72b6/0x313b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:27.611902+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:28.612113+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 heartbeat osd_stat(store_statfs(0x1b8952000/0x0/0x1bfc00000, data 0x30b72b6/0x313b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:29.612277+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1050102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:30.612447+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 heartbeat osd_stat(store_statfs(0x1b8952000/0x0/0x1bfc00000, data 0x30b72b6/0x313b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:31.612611+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:32.612783+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:33.613090+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:34.613249+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1050102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 heartbeat osd_stat(store_statfs(0x1b8952000/0x0/0x1bfc00000, data 0x30b72b6/0x313b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113664000 unmapped: 1753088 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 41
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now 
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/2760684413
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc reconnect No active mgr available yet
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 ms_handle_reset con 0x55843ef6e400 session 0x55843b9585a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:35.613452+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3de400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 84.032394409s of 84.093727112s, submitted: 12
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113811456 unmapped: 1605632 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 42
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: get_auth_request con 0x55843fcae400 auth_method 0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_configure stats_period=5
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:36.613643+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113811456 unmapped: 1605632 heap: 115417088 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:37.613807+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113844224 unmapped: 2621440 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:38.614054+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 43
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113860608 unmapped: 2605056 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:39.614205+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 113860608 unmapped: 2605056 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:40.614363+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 44
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:41.614531+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:42.614698+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:43.614836+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:44.615087+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:45.615228+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:46.615481+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:47.615620+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:48.615758+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:49.616002+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:50.616208+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:51.616433+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:52.616646+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:53.616840+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:54.617055+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:55.617233+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:56.617393+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:57.617598+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:58.617805+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:59.617998+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:00.618189+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:01.618316+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:02.618456+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:03.618691+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:04.618820+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:05.619043+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:06.619269+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:07.619453+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:08.619593+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.3 total, 600.0 interval
                                                          Cumulative writes: 5847 writes, 25K keys, 5847 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5847 writes, 861 syncs, 6.79 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 165 writes, 340 keys, 165 commit groups, 1.0 writes per commit group, ingest: 0.33 MB, 0.00 MB/s
                                                          Interval WAL: 165 writes, 82 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:09.619741+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:10.619929+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:11.620139+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:12.620350+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:13.620581+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:14.620815+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 301989888 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:15.620995+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:16.621202+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:17.621343+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:18.621511+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:19.621671+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:20.621804+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:21.621995+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:22.622231+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:23.622450+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:24.622593+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:25.622798+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:26.623114+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:27.623282+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:28.623411+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:29.623565+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:30.623783+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:31.623988+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114008064 unmapped: 2457600 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:32.624131+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:33.624544+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:34.624712+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:35.624855+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:36.625070+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:37.625274+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:38.625437+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:39.625621+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:40.625821+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:41.625931+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:42.626106+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:43.626235+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:44.626365+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:45.626519+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:46.626725+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114016256 unmapped: 2449408 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:47.626829+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:48.626974+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:49.627160+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:50.627342+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:51.627476+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:52.627666+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:53.627826+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:54.627998+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:55.628185+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:56.628396+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:57.628577+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:58.628779+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114024448 unmapped: 2441216 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:59.628937+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:00.629084+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:01.629248+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:02.629428+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:03.629576+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:04.629732+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:05.629926+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:06.630163+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114032640 unmapped: 2433024 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 45
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:07.630375+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:08.630648+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:09.630817+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053102 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:10.630977+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b894e000/0x0/0x1bfc00000, data 0x30b967a/0x313f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:11.631148+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:12.631301+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:13.631495+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:14.631638+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114040832 unmapped: 2424832 heap: 116465664 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 99.247543335s of 99.301292419s, submitted: 13
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1055818 data_alloc: 285212672 data_used: 26722304
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:15.631811+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114081792 unmapped: 19169280 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 93 ms_handle_reset con 0x55843e973000 session 0x55843efeef00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:16.631976+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b8148000/0x0/0x1bfc00000, data 0x38bba6b/0x3945000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 114163712 unmapped: 19087360 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:17.632114+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115195904 unmapped: 18055168 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:18.632306+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:19.632477+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:20.632635+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:21.632776+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:22.632942+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:23.633140+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:24.633315+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:25.633476+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:26.633676+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:27.633869+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:28.634386+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:29.634995+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:30.635097+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:31.635255+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:32.635481+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:33.635645+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:34.635881+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:35.636049+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:36.636467+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:37.636619+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:38.636786+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:39.636927+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:40.637382+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:41.637755+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:42.637935+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:43.638268+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:44.638444+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:45.638656+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:46.638929+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:47.639156+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:48.639322+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:49.639525+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:50.639717+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:51.639874+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 17981440 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:52.639979+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:53.640147+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:54.640294+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:55.640469+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:56.640687+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:57.640877+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:58.641068+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:59.641191+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:00.641343+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:01.641481+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:02.641653+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:03.641806+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:04.641942+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:05.642077+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:06.642303+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:07.642453+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115277824 unmapped: 17973248 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:08.642628+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115286016 unmapped: 17965056 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:09.642816+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115286016 unmapped: 17965056 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177487 data_alloc: 285212672 data_used: 26734592
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:10.643058+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7942000/0x0/0x1bfc00000, data 0x40bde6f/0x414b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115286016 unmapped: 17965056 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:11.643195+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 115286016 unmapped: 17965056 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:12.643365+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ef6e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 ms_handle_reset con 0x55843ef6e000 session 0x55843efeed20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 122642432 unmapped: 10608640 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:13.643472+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 58.572074890s of 58.800083160s, submitted: 38
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843faed400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123338752 unmapped: 9912320 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 ms_handle_reset con 0x55843faed400 session 0x55843efee960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:14.643609+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123338752 unmapped: 9912320 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1241673 data_alloc: 301989888 data_used: 35123200
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:15.643754+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7455000/0x0/0x1bfc00000, data 0x45abe6f/0x4639000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123371520 unmapped: 9879552 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3dfc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 ms_handle_reset con 0x55843d3dfc00 session 0x55843efee5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:16.643946+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7455000/0x0/0x1bfc00000, data 0x45abe6f/0x4639000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123371520 unmapped: 9879552 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:17.644135+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7455000/0x0/0x1bfc00000, data 0x45abe6f/0x4639000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123371520 unmapped: 9879552 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 ms_handle_reset con 0x55843e973000 session 0x55843efee3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ef6e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 ms_handle_reset con 0x55843ef6e000 session 0x55843efee1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:18.644296+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ef6e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 ms_handle_reset con 0x55843ef6e400 session 0x55843efee000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7454000/0x0/0x1bfc00000, data 0x45abe7f/0x463a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123387904 unmapped: 9863168 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843faed400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:19.644446+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e732800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123387904 unmapped: 9863168 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1247511 data_alloc: 301989888 data_used: 35651584
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:20.644641+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 123715584 unmapped: 9535488 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:21.644816+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7454000/0x0/0x1bfc00000, data 0x45abe7f/0x463a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127041536 unmapped: 6209536 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7454000/0x0/0x1bfc00000, data 0x45abe7f/0x463a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:22.644967+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127328256 unmapped: 5922816 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:23.645122+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b7454000/0x0/0x1bfc00000, data 0x45abe7f/0x463a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127328256 unmapped: 5922816 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:24.645290+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.041370392s of 11.116180420s, submitted: 11
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127328256 unmapped: 5922816 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1279786 data_alloc: 301989888 data_used: 39845888
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:25.645452+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127344640 unmapped: 5906432 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:26.645647+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127344640 unmapped: 5906432 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 95 heartbeat osd_stat(store_statfs(0x1b744c000/0x0/0x1bfc00000, data 0x45ae293/0x4641000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:27.645797+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 96 heartbeat osd_stat(store_statfs(0x1b744e000/0x0/0x1bfc00000, data 0x45ae270/0x4640000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fcae000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127352832 unmapped: 5898240 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fcd4400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:28.645905+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 96 ms_handle_reset con 0x55843fcae000 session 0x55843c995c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 96 ms_handle_reset con 0x55843e732800 session 0x55843efee780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 96 ms_handle_reset con 0x55843faed400 session 0x55843cd38d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127352832 unmapped: 5898240 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:29.646079+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127352832 unmapped: 5898240 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1286671 data_alloc: 301989888 data_used: 39874560
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:30.646225+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127369216 unmapped: 5881856 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:31.646387+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127533056 unmapped: 5718016 heap: 133251072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:32.646510+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131465216 unmapped: 4931584 heap: 136396800 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 96 heartbeat osd_stat(store_statfs(0x1b6aee000/0x0/0x1bfc00000, data 0x4eff64f/0x4f91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:33.646660+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131620864 unmapped: 4775936 heap: 136396800 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:34.646825+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.466845512s of 10.003924370s, submitted: 165
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843e973000 session 0x55843ab32780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843fcd4400 session 0x55843efef0e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x558439f7b800 session 0x55844084a1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129990656 unmapped: 6406144 heap: 136396800 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1379864 data_alloc: 301989888 data_used: 40108032
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:35.646956+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b6a3a000/0x0/0x1bfc00000, data 0x4fa78f3/0x503b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127393792 unmapped: 9003008 heap: 136396800 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843a294000 session 0x55843c995c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:36.647391+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843b5a4c00 session 0x55844084a3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b7064000/0x0/0x1bfc00000, data 0x40c48f3/0x4158000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127688704 unmapped: 18227200 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:37.647831+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127688704 unmapped: 18227200 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:38.647985+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127688704 unmapped: 18227200 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:39.648106+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843a294000 session 0x55843ced9a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127991808 unmapped: 17924096 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1319931 data_alloc: 301989888 data_used: 35168256
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:40.648401+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127991808 unmapped: 17924096 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:41.648642+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b6bed000/0x0/0x1bfc00000, data 0x4e0c968/0x4ea1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 127762432 unmapped: 18153472 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:42.648917+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:43.649171+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:44.649328+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1349851 data_alloc: 301989888 data_used: 39366656
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:45.649707+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:46.650113+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b6bed000/0x0/0x1bfc00000, data 0x4e0c968/0x4ea1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:47.650336+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:48.650473+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:49.650710+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1349851 data_alloc: 301989888 data_used: 39366656
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:50.650876+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129171456 unmapped: 16744448 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b6bed000/0x0/0x1bfc00000, data 0x4e0c968/0x4ea1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843faed400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:51.651057+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 16.570291519s of 16.853242874s, submitted: 78
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843faed400 session 0x55843cdb8b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 132595712 unmapped: 13320192 heap: 145915904 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:52.651223+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fcd4400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843fcd4400 session 0x55843e607a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 133545984 unmapped: 16572416 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:53.651380+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134184960 unmapped: 15933440 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b56a1000/0x0/0x1bfc00000, data 0x63569ca/0x63ec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:54.651528+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134651904 unmapped: 15466496 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1533846 data_alloc: 301989888 data_used: 39477248
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f145400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843f145400 session 0x55843efeeb40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:55.657946+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134471680 unmapped: 15646720 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:56.658189+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b567d000/0x0/0x1bfc00000, data 0x637a9ed/0x6411000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134471680 unmapped: 15646720 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:57.658360+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134471680 unmapped: 15646720 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:58.658512+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134471680 unmapped: 15646720 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f145400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:59.658784+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134479872 unmapped: 15638528 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1536001 data_alloc: 301989888 data_used: 40108032
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:00.658983+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 137207808 unmapped: 12910592 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:01.659147+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139460608 unmapped: 10657792 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:02.659343+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.602990150s of 11.293771744s, submitted: 128
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b567b000/0x0/0x1bfc00000, data 0x637c9ed/0x6413000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139493376 unmapped: 10625024 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:03.659486+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139493376 unmapped: 10625024 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:04.659631+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x55843e973000 session 0x55844084a780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 ms_handle_reset con 0x558439f7b800 session 0x55843cdb9e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139501568 unmapped: 10616832 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1578601 data_alloc: 301989888 data_used: 46084096
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:05.659764+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139501568 unmapped: 10616832 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:06.659961+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 98 heartbeat osd_stat(store_statfs(0x1b5679000/0x0/0x1bfc00000, data 0x637e9ed/0x6415000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 98 ms_handle_reset con 0x55843a294000 session 0x55843cdbb4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139632640 unmapped: 10485760 heap: 150118400 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a5400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:07.660293+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 98 ms_handle_reset con 0x55843b5a5400 session 0x55843cdbab40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144777216 unmapped: 24338432 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:08.660444+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144801792 unmapped: 24313856 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843faecc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:09.660587+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 handle_osd_map epochs [99,100], i have 100, src has [1,100]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 ms_handle_reset con 0x55843b5a4c00 session 0x55843fca5a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 ms_handle_reset con 0x55843f145400 session 0x55843efef2c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 ms_handle_reset con 0x55843f144000 session 0x55844084be00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144826368 unmapped: 24289280 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1812106 data_alloc: 301989888 data_used: 48566272
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:10.660748+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144826368 unmapped: 24289280 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:11.660979+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144875520 unmapped: 24240128 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 heartbeat osd_stat(store_statfs(0x1b3c2f000/0x0/0x1bfc00000, data 0x7dbe66b/0x7e5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,5,3])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:12.661105+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 148463616 unmapped: 20652032 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:13.661246+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 heartbeat osd_stat(store_statfs(0x1b3159000/0x0/0x1bfc00000, data 0x889666b/0x8935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.316488266s of 11.059107780s, submitted: 159
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 148701184 unmapped: 20414464 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:14.661447+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 ms_handle_reset con 0x55843a294000 session 0x55843d0aa3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143720448 unmapped: 25395200 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1704664 data_alloc: 301989888 data_used: 43634688
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:15.661631+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143720448 unmapped: 25395200 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:16.661822+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143736832 unmapped: 25378816 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:17.661968+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143769600 unmapped: 25346048 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:18.662129+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 heartbeat osd_stat(store_statfs(0x1b4558000/0x0/0x1bfc00000, data 0x708f88a/0x712e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143835136 unmapped: 25280512 heap: 169115648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x55843b5a4c00 session 0x55843efef4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a5400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x55843b5a5400 session 0x55843b958b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x55843a294000 session 0x55843b959860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:19.662276+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x55843b5a4c00 session 0x55843aa85c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x55843f144000 session 0x55843cdd0d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f145400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 heartbeat osd_stat(store_statfs(0x1b45c7000/0x0/0x1bfc00000, data 0x74278ec/0x74c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [0,0,0,2])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162021376 unmapped: 10526720 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1885418 data_alloc: 318767104 data_used: 48934912
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:20.662387+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x55843f145400 session 0x55843d0cdc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 heartbeat osd_stat(store_statfs(0x1b383e000/0x0/0x1bfc00000, data 0x81b08ec/0x8250000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154288128 unmapped: 18259968 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:21.662528+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x558439f7b800 session 0x558440867680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x55843faecc00 session 0x558440867860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154296320 unmapped: 18251776 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:22.662647+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 ms_handle_reset con 0x558439f7b800 session 0x55843fca5e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 148094976 unmapped: 24453120 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:23.662776+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 102 ms_handle_reset con 0x55843b5a4c00 session 0x55843b3b1a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 136093696 unmapped: 36454400 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:24.662917+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135946240 unmapped: 36601856 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1452658 data_alloc: 301989888 data_used: 27721728
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:25.663202+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 102 heartbeat osd_stat(store_statfs(0x1b5633000/0x0/0x1bfc00000, data 0x51f0ba5/0x528d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 136019968 unmapped: 36528128 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:26.663390+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 136019968 unmapped: 36528128 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:27.663507+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 136019968 unmapped: 36528128 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:28.664089+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 102 heartbeat osd_stat(store_statfs(0x1b5633000/0x0/0x1bfc00000, data 0x51f0ba5/0x528d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 136019968 unmapped: 36528128 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:29.664263+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 102 handle_osd_map epochs [102,103], i have 102, src has [1,103]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 14.939451218s of 15.777985573s, submitted: 222
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131399680 unmapped: 41148416 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1440252 data_alloc: 301989888 data_used: 27725824
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:30.664393+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131399680 unmapped: 41148416 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:31.664518+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131399680 unmapped: 41148416 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:32.664672+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131399680 unmapped: 41148416 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:33.664825+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131448832 unmapped: 41099264 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:34.664987+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 103 heartbeat osd_stat(store_statfs(0x1b67fc000/0x0/0x1bfc00000, data 0x51f2e49/0x5291000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131645440 unmapped: 40902656 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:35.665154+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1452816 data_alloc: 301989888 data_used: 29040640
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131670016 unmapped: 40878080 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:36.665350+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 103 heartbeat osd_stat(store_statfs(0x1b67fd000/0x0/0x1bfc00000, data 0x51f2e49/0x5291000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 131702784 unmapped: 40845312 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:37.665462+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 103 ms_handle_reset con 0x55843f144000 session 0x55843f3741e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129056768 unmapped: 43491328 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:38.665640+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 103 ms_handle_reset con 0x55843a294000 session 0x55843c65f0e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 129064960 unmapped: 43483136 heap: 172548096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 103 ms_handle_reset con 0x558439f7b800 session 0x55843d06fa40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:39.665833+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.707607269s of 10.002224922s, submitted: 69
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 104 ms_handle_reset con 0x55843a294000 session 0x55843c6534a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 104 ms_handle_reset con 0x55843b5a4c00 session 0x55843d0cda40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 153395200 unmapped: 26984448 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 104 ms_handle_reset con 0x55843f144000 session 0x55843c90e5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:40.665972+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843faecc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1806333 data_alloc: 301989888 data_used: 34816000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 105 ms_handle_reset con 0x55843faecc00 session 0x55843cd3cb40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138452992 unmapped: 41926656 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:41.666139+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138518528 unmapped: 41861120 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:42.666336+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b3f1e000/0x0/0x1bfc00000, data 0x7ac5b39/0x7b6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b3f1e000/0x0/0x1bfc00000, data 0x7ac5b39/0x7b6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135462912 unmapped: 44916736 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 106 ms_handle_reset con 0x558439f7b800 session 0x55843d06e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:43.666472+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b5039000/0x0/0x1bfc00000, data 0x69a4ad7/0x6a4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135528448 unmapped: 44851200 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:44.666693+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135135232 unmapped: 45244416 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:45.666841+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1686781 data_alloc: 301989888 data_used: 32673792
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135135232 unmapped: 45244416 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:46.667102+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b503d000/0x0/0x1bfc00000, data 0x69a6d7b/0x6a50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143802368 unmapped: 36577280 heap: 180379648 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:47.667282+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 107 ms_handle_reset con 0x55843b5a4c00 session 0x55843b959a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135716864 unmapped: 52363264 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:48.667437+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135782400 unmapped: 52297728 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:49.667603+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.157241821s of 10.135128975s, submitted: 190
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 109 ms_handle_reset con 0x55843f144000 session 0x55843b958b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 109 handle_osd_map epochs [108,109], i have 109, src has [1,109]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 135897088 unmapped: 52183040 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:50.667720+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 1924326 data_alloc: 301989888 data_used: 36204544
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f145400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 109 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 110 ms_handle_reset con 0x55843e973000 session 0x55843efee3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134660096 unmapped: 53420032 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:51.667914+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 110 heartbeat osd_stat(store_statfs(0x1b34ff000/0x0/0x1bfc00000, data 0x84dbc3d/0x858c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 110 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134733824 unmapped: 53346304 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:52.668095+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 134742016 unmapped: 53338112 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 112 handle_osd_map epochs [111,112], i have 112, src has [1,112]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 112 ms_handle_reset con 0x55843f145400 session 0x55843e758780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:53.668245+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 112 heartbeat osd_stat(store_statfs(0x1b4e68000/0x0/0x1bfc00000, data 0x676e42d/0x6825000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 113 ms_handle_reset con 0x558439f7b800 session 0x55843f374780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 137715712 unmapped: 50364416 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:54.668388+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 113 ms_handle_reset con 0x55843b5a4c00 session 0x55843fca4b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139165696 unmapped: 48914432 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:55.668630+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1615261 data_alloc: 301989888 data_used: 32813056
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 114 heartbeat osd_stat(store_statfs(0x1b5e58000/0x0/0x1bfc00000, data 0x5773ae1/0x5830000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141205504 unmapped: 46874624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:56.668854+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 115 ms_handle_reset con 0x55843e973000 session 0x55843f374b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b5e0e000/0x0/0x1bfc00000, data 0x57b1297/0x586f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141279232 unmapped: 46800896 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:57.669215+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 116 ms_handle_reset con 0x55843f144000 session 0x55843efeef00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843faed400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141451264 unmapped: 46628864 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:58.669371+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 117 ms_handle_reset con 0x55843faed400 session 0x55843efee1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141557760 unmapped: 46522368 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:59.669506+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 118 ms_handle_reset con 0x558439f7b800 session 0x55843efee960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 118 handle_osd_map epochs [118,119], i have 118, src has [1,119]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.442638397s of 10.008609772s, submitted: 513
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141590528 unmapped: 46489600 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:00.669772+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1632854 data_alloc: 301989888 data_used: 32854016
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 120 heartbeat osd_stat(store_statfs(0x1b5e10000/0x0/0x1bfc00000, data 0x57bc516/0x587d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 120 ms_handle_reset con 0x55843b5a4c00 session 0x55843cdd03c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141746176 unmapped: 46333952 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:01.669912+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141828096 unmapped: 46252032 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:02.670079+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 121 ms_handle_reset con 0x55843e973000 session 0x55843d0aa000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141836288 unmapped: 46243840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:03.670280+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141836288 unmapped: 46243840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:04.670405+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 121 heartbeat osd_stat(store_statfs(0x1b5e0b000/0x0/0x1bfc00000, data 0x57c01c3/0x5880000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142974976 unmapped: 45105152 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:05.670591+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1637339 data_alloc: 301989888 data_used: 32854016
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142974976 unmapped: 45105152 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:06.670825+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143032320 unmapped: 45047808 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:07.670996+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 122 heartbeat osd_stat(store_statfs(0x1b5e08000/0x0/0x1bfc00000, data 0x57c34a3/0x5885000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143032320 unmapped: 45047808 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:08.671161+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143040512 unmapped: 45039616 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:09.671547+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 122 handle_osd_map epochs [122,123], i have 122, src has [1,123]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.877010345s of 10.119226456s, submitted: 104
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:10.671850+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143048704 unmapped: 45031424 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1640369 data_alloc: 301989888 data_used: 32854016
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:11.672332+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143048704 unmapped: 45031424 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b5e04000/0x0/0x1bfc00000, data 0x57c5747/0x5889000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:12.672626+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143048704 unmapped: 45031424 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 124 ms_handle_reset con 0x55843f144000 session 0x55843d0aa960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 124 heartbeat osd_stat(store_statfs(0x1b5e02000/0x0/0x1bfc00000, data 0x57c5b56/0x588b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:13.672899+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143106048 unmapped: 44974080 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 124 heartbeat osd_stat(store_statfs(0x1b5e02000/0x0/0x1bfc00000, data 0x57c5b56/0x588b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:14.673078+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143122432 unmapped: 44957696 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 124 heartbeat osd_stat(store_statfs(0x1b5dfe000/0x0/0x1bfc00000, data 0x57c7f14/0x588f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:15.673240+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143122432 unmapped: 44957696 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1648130 data_alloc: 301989888 data_used: 32854016
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:16.673644+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143130624 unmapped: 44949504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:17.673864+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143155200 unmapped: 44924928 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 124 heartbeat osd_stat(store_statfs(0x1b5dff000/0x0/0x1bfc00000, data 0x57c7f14/0x588f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fcd4400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 125 heartbeat osd_stat(store_statfs(0x1b5dff000/0x0/0x1bfc00000, data 0x57c7f14/0x588f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 125 ms_handle_reset con 0x55843fcd4400 session 0x55843b5ab4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:18.674297+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 143286272 unmapped: 44793856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 125 ms_handle_reset con 0x55843a294000 session 0x55843efee5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:19.674456+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 125 ms_handle_reset con 0x558439f7b800 session 0x55843aa845a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138133504 unmapped: 49946624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:20.674586+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138133504 unmapped: 49946624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1430736 data_alloc: 301989888 data_used: 25563136
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:21.674896+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138133504 unmapped: 49946624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:22.675066+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138133504 unmapped: 49946624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 125 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4102e95/0x41c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:23.675230+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138133504 unmapped: 49946624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:24.675375+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138133504 unmapped: 49946624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:25.675855+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138133504 unmapped: 49946624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 15.368701935s of 15.700431824s, submitted: 101
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1434762 data_alloc: 301989888 data_used: 25575424
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:26.676080+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138149888 unmapped: 49930240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:27.676394+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138149888 unmapped: 49930240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:28.676602+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138149888 unmapped: 49930240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:29.676913+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138149888 unmapped: 49930240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:30.677059+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138149888 unmapped: 49930240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1434762 data_alloc: 301989888 data_used: 25575424
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:31.677188+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138149888 unmapped: 49930240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:32.677368+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:33.677540+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:34.677978+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:35.678154+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1435562 data_alloc: 301989888 data_used: 25595904
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:36.678347+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:37.678801+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:38.679066+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:39.679256+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138158080 unmapped: 49922048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:40.679398+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1435562 data_alloc: 301989888 data_used: 25595904
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:41.679550+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:42.679704+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:43.679859+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:44.680161+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:45.680338+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1435562 data_alloc: 301989888 data_used: 25595904
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:46.680535+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:47.680719+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138166272 unmapped: 49913856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:48.680909+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138182656 unmapped: 49897472 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:49.681098+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138182656 unmapped: 49897472 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:50.681234+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138182656 unmapped: 49897472 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1435562 data_alloc: 301989888 data_used: 25595904
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:51.681439+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138182656 unmapped: 49897472 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:52.681595+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b74c2000/0x0/0x1bfc00000, data 0x4105139/0x41cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138182656 unmapped: 49897472 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:53.681736+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138182656 unmapped: 49897472 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:54.681941+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138182656 unmapped: 49897472 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 29.084304810s of 29.100522995s, submitted: 20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:55.682095+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138190848 unmapped: 49889280 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1438429 data_alloc: 301989888 data_used: 25595904
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 ms_handle_reset con 0x55843b5a4c00 session 0x55843cddbc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 ms_handle_reset con 0x55843e973000 session 0x55843cf245a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 ms_handle_reset con 0x55843f144000 session 0x55843cd3cd20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:56.682285+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138215424 unmapped: 49864704 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b74bb000/0x0/0x1bfc00000, data 0x410754d/0x41d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:57.682422+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138248192 unmapped: 49831936 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b74ba000/0x0/0x1bfc00000, data 0x410794d/0x41d3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 127 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:58.682584+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138248192 unmapped: 49831936 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:59.682777+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138248192 unmapped: 49831936 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 129 ms_handle_reset con 0x55843f144000 session 0x55843cd3d4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:00.682954+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138305536 unmapped: 49774592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1451379 data_alloc: 301989888 data_used: 25608192
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:01.683133+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138305536 unmapped: 49774592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 130 ms_handle_reset con 0x558439f7b800 session 0x55843b9fa000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:02.683281+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138338304 unmapped: 49741824 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:03.683443+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 130 heartbeat osd_stat(store_statfs(0x1b74b1000/0x0/0x1bfc00000, data 0x410e0d9/0x41db000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138338304 unmapped: 49741824 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:04.683609+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138346496 unmapped: 49733632 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843a294000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 130 ms_handle_reset con 0x55843a294000 session 0x55843eef21e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.439478874s of 10.662603378s, submitted: 70
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:05.683765+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138371072 unmapped: 49709056 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1454892 data_alloc: 301989888 data_used: 25620480
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:06.683932+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 132 heartbeat osd_stat(store_statfs(0x1b74a8000/0x0/0x1bfc00000, data 0x411277b/0x41e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138371072 unmapped: 49709056 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 133 ms_handle_reset con 0x55843b5a4c00 session 0x55843cd3da40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:07.684074+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138485760 unmapped: 49594368 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 134 ms_handle_reset con 0x55843ec5ac00 session 0x55843e7583c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 134 ms_handle_reset con 0x55843e973000 session 0x55843cd381e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:08.684226+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138567680 unmapped: 49512448 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:09.684422+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 134 ms_handle_reset con 0x55843b5a4c00 session 0x55844102e1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 134 ms_handle_reset con 0x558439f7b800 session 0x55843d11fc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 134 ms_handle_reset con 0x55843e973000 session 0x55843d11e1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 138592256 unmapped: 49487872 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:10.684632+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 135 heartbeat osd_stat(store_statfs(0x1b749c000/0x0/0x1bfc00000, data 0x41175e3/0x41f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 147021824 unmapped: 41058304 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1537541 data_alloc: 301989888 data_used: 25620480
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 135 ms_handle_reset con 0x55843f144000 session 0x55844102e3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b40e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets getting new tickets!
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:11.684958+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _finish_auth 0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:11.686157+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139157504 unmapped: 48922624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 135 ms_handle_reset con 0x55843b40e400 session 0x55844102e960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:12.685102+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139182080 unmapped: 48898048 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:13.685250+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 135 ms_handle_reset con 0x558439f7b800 session 0x55844102ed20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139214848 unmapped: 48865280 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:14.685406+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 136 ms_handle_reset con 0x55843b5a4c00 session 0x55844102f2c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139149312 unmapped: 48930816 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 136 ms_handle_reset con 0x55843ec5ac00 session 0x55843d11f4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 136 heartbeat osd_stat(store_statfs(0x1b3c9b000/0x0/0x1bfc00000, data 0x79197ef/0x79f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:15.685585+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139149312 unmapped: 48930816 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1925907 data_alloc: 301989888 data_used: 25632768
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.850202560s of 10.631160736s, submitted: 188
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 136 ms_handle_reset con 0x55843e973000 session 0x55843d11e780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843f144000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:16.685772+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 136 ms_handle_reset con 0x55843f144000 session 0x55844084a3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139190272 unmapped: 48889856 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 137 ms_handle_reset con 0x558439f7b800 session 0x55843fca5a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:17.686006+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139247616 unmapped: 48832512 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843b5a4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 138 ms_handle_reset con 0x55843b5a4c00 session 0x55843f14c000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:18.686190+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139313152 unmapped: 48766976 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 138 heartbeat osd_stat(store_statfs(0x1b748e000/0x0/0x1bfc00000, data 0x41205d7/0x41fe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 138 ms_handle_reset con 0x55843e973000 session 0x55843f14c3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:19.686374+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 138 heartbeat osd_stat(store_statfs(0x1b748f000/0x0/0x1bfc00000, data 0x41201b4/0x41fc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139329536 unmapped: 48750592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:20.686575+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 139 ms_handle_reset con 0x55843ec5ac00 session 0x55843f14c5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139337728 unmapped: 48742400 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1508214 data_alloc: 301989888 data_used: 25649152
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:21.686793+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139337728 unmapped: 48742400 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:22.688299+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139337728 unmapped: 48742400 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:23.689047+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 139337728 unmapped: 48742400 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:24.689230+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 139 heartbeat osd_stat(store_statfs(0x1b748b000/0x0/0x1bfc00000, data 0x41224e7/0x4202000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140386304 unmapped: 47693824 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:25.689486+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140386304 unmapped: 47693824 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1509984 data_alloc: 301989888 data_used: 25649152
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843c62b400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.574018478s of 10.014025688s, submitted: 155
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:26.689711+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140410880 unmapped: 47669248 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 141 ms_handle_reset con 0x55843c62b400 session 0x55843b5aaf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 141 handle_osd_map epochs [140,141], i have 141, src has [1,141]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:27.689892+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140427264 unmapped: 47652864 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 142 ms_handle_reset con 0x558439f7b800 session 0x55843f14d680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:28.690092+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140451840 unmapped: 47628288 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 142 ms_handle_reset con 0x55843e973000 session 0x55843f14da40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:29.690240+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140460032 unmapped: 47620096 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 143 ms_handle_reset con 0x55843ec5ac00 session 0x55843efef860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 143 heartbeat osd_stat(store_statfs(0x1b747e000/0x0/0x1bfc00000, data 0x4128f69/0x420f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 143 ms_handle_reset con 0x55843d010400 session 0x55843c993c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:30.690434+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140484608 unmapped: 47595520 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 144 ms_handle_reset con 0x55843ff12400 session 0x55843efee000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1523573 data_alloc: 301989888 data_used: 25669632
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:31.690634+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140468224 unmapped: 47611904 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 144 ms_handle_reset con 0x558439f7b800 session 0x55843ced94a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 144 ms_handle_reset con 0x55843d010400 session 0x55843cdbbc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:32.690749+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 144 ms_handle_reset con 0x55843e973000 session 0x55843f374d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 144 ms_handle_reset con 0x55843ec5ac00 session 0x55843cdab860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140533760 unmapped: 47546368 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 144 heartbeat osd_stat(store_statfs(0x1b7477000/0x0/0x1bfc00000, data 0x412d72b/0x4216000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:33.690924+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140533760 unmapped: 47546368 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:34.691089+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140533760 unmapped: 47546368 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:35.691525+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 145 ms_handle_reset con 0x55843ff12800 session 0x55843aa85860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140566528 unmapped: 47513600 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1526061 data_alloc: 301989888 data_used: 25677824
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:36.691916+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140574720 unmapped: 47505408 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.470746040s of 10.975565910s, submitted: 159
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 145 heartbeat osd_stat(store_statfs(0x1b7474000/0x0/0x1bfc00000, data 0x412f9be/0x4219000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 146 ms_handle_reset con 0x558439f7b800 session 0x55843cdbad20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:37.692088+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140632064 unmapped: 47448064 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 146 ms_handle_reset con 0x55843d010400 session 0x55843f375e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:38.692246+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140640256 unmapped: 47439872 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 147 ms_handle_reset con 0x55843e973000 session 0x55843f375860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 147 heartbeat osd_stat(store_statfs(0x1b746f000/0x0/0x1bfc00000, data 0x4131d7c/0x421d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:39.692490+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 147 ms_handle_reset con 0x55843ec5ac00 session 0x55844102f680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140656640 unmapped: 47423488 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 147 ms_handle_reset con 0x55843ff12800 session 0x55844084a1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:40.692669+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 147 heartbeat osd_stat(store_statfs(0x1b746e000/0x0/0x1bfc00000, data 0x413412c/0x4220000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140656640 unmapped: 47423488 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1530311 data_alloc: 301989888 data_used: 25677824
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:41.692823+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140656640 unmapped: 47423488 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:42.693043+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140656640 unmapped: 47423488 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:43.693215+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140656640 unmapped: 47423488 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:44.693516+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 147 heartbeat osd_stat(store_statfs(0x1b746e000/0x0/0x1bfc00000, data 0x413412c/0x4220000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140673024 unmapped: 47407104 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:45.693703+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140697600 unmapped: 47382528 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1534513 data_alloc: 301989888 data_used: 25690112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:46.694928+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140697600 unmapped: 47382528 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:47.695336+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140697600 unmapped: 47382528 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:48.695695+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b7469000/0x0/0x1bfc00000, data 0x41363d0/0x4224000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140697600 unmapped: 47382528 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:49.696072+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140697600 unmapped: 47382528 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:50.696232+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140697600 unmapped: 47382528 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1534513 data_alloc: 301989888 data_used: 25690112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:51.696404+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b7469000/0x0/0x1bfc00000, data 0x41363d0/0x4224000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140697600 unmapped: 47382528 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:52.696569+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140705792 unmapped: 47374336 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:53.696734+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140705792 unmapped: 47374336 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:54.697109+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140705792 unmapped: 47374336 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:55.697300+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b7469000/0x0/0x1bfc00000, data 0x41363d0/0x4224000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140705792 unmapped: 47374336 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1534513 data_alloc: 301989888 data_used: 25690112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:56.697549+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140705792 unmapped: 47374336 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 20.235593796s of 20.376049042s, submitted: 64
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 ms_handle_reset con 0x558439f7b800 session 0x55843c652d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:57.697711+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140722176 unmapped: 47357952 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:58.698627+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 ms_handle_reset con 0x55843d010400 session 0x558440867a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b7468000/0x0/0x1bfc00000, data 0x4136432/0x4225000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 ms_handle_reset con 0x55843e973000 session 0x55843aa854a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140738560 unmapped: 47341568 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:59.698929+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140738560 unmapped: 47341568 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:00.699096+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1534542 data_alloc: 301989888 data_used: 25690112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:01.699330+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b746a000/0x0/0x1bfc00000, data 0x41363d0/0x4224000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:02.699511+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:03.699697+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b746a000/0x0/0x1bfc00000, data 0x41363d0/0x4224000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:04.699874+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:05.700105+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1534542 data_alloc: 301989888 data_used: 25690112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:06.700322+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b746a000/0x0/0x1bfc00000, data 0x41363d0/0x4224000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:07.700493+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140763136 unmapped: 47316992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:08.700630+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 47308800 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:09.700837+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b746a000/0x0/0x1bfc00000, data 0x41363d0/0x4224000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 47308800 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:10.701055+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 47308800 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1534542 data_alloc: 301989888 data_used: 25690112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:11.701182+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 47308800 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:12.701327+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 47308800 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 15.926622391s of 15.982488632s, submitted: 13
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:13.701467+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140787712 unmapped: 47292416 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:14.701590+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140787712 unmapped: 47292416 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 149 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 150 ms_handle_reset con 0x55843ec5ac00 session 0x55843d11e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:15.701732+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 150 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413b3a0/0x422d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140812288 unmapped: 47267840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1545575 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:16.701890+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 150 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413aba0/0x422c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140820480 unmapped: 47259648 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:17.702097+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140820480 unmapped: 47259648 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:18.702279+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140820480 unmapped: 47259648 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:19.702453+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 150 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413aba0/0x422c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140820480 unmapped: 47259648 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:20.702599+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1544748 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:21.702855+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:22.703058+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:23.703234+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:24.703391+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:25.703555+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1544748 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:26.703723+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:27.703911+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:28.704082+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:29.704259+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:30.704472+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1544748 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:31.704645+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140828672 unmapped: 47251456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:32.704765+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140836864 unmapped: 47243264 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:33.704926+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140836864 unmapped: 47243264 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:34.705112+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140836864 unmapped: 47243264 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:35.705275+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1544748 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:36.705460+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:37.705602+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:38.705816+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:39.706059+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:40.706199+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1544748 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:41.706341+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:42.706531+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:43.706691+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:44.707414+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:45.708229+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1544748 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:46.708409+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:47.708583+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 47235072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:48.708729+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140853248 unmapped: 47226880 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:49.708959+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 36.106723785s of 36.216259003s, submitted: 48
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843ff12c00 session 0x55843fca5860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140853248 unmapped: 47226880 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:50.709131+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745c000/0x0/0x1bfc00000, data 0x413cea6/0x4231000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x558439f7b800 session 0x55843e7594a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140861440 unmapped: 47218688 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d010400 session 0x55843eef3c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1544905 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:51.709345+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140861440 unmapped: 47218688 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843e973000 session 0x55843b5aa000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:52.709521+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140861440 unmapped: 47218688 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:53.709663+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843ec5ac00 session 0x55843aa85e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d3c5800 session 0x55843efef0e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140861440 unmapped: 47218688 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745d000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:54.709803+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x558439f7b800 session 0x55843ca85a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140877824 unmapped: 47202304 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:55.710000+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d010400 session 0x55843d0cc3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140894208 unmapped: 47185920 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1547209 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d3c5800 session 0x55843d11fe00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:56.710274+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140926976 unmapped: 47153152 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:57.710431+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843e973000 session 0x55843cdbb860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140926976 unmapped: 47153152 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:58.710660+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745e000/0x0/0x1bfc00000, data 0x413ce44/0x4230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ec5ac00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843ec5ac00 session 0x55843f14c960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x558439f7b800 session 0x55843f14d860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140967936 unmapped: 47112192 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:59.710894+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.083531380s of 10.386286736s, submitted: 78
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d010400 session 0x55843e606d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d3c5800 session 0x55843d0c5680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140984320 unmapped: 47095808 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:00.711062+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140984320 unmapped: 47095808 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1553456 data_alloc: 301989888 data_used: 25702400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:01.711222+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843e973000 session 0x55844084b680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d3c4800 session 0x55843d5f1860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 140992512 unmapped: 47087616 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x558439f7b800 session 0x55844084ba40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:02.711512+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d010400 session 0x55843a7823c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b745c000/0x0/0x1bfc00000, data 0x413ceb6/0x4232000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141000704 unmapped: 47079424 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:03.711641+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 ms_handle_reset con 0x55843d3c5800 session 0x55843ca85680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141000704 unmapped: 47079424 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:04.711814+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141008896 unmapped: 47071232 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:05.711987+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 ms_handle_reset con 0x55843e973000 session 0x55844102ef00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141017088 unmapped: 47063040 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1564300 data_alloc: 301989888 data_used: 25714688
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:06.712252+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 heartbeat osd_stat(store_statfs(0x1b7055000/0x0/0x1bfc00000, data 0x413f2e6/0x4238000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141017088 unmapped: 47063040 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:07.712421+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558440484800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 ms_handle_reset con 0x558440484800 session 0x55844102e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 ms_handle_reset con 0x558439f7b800 session 0x55843cd3c3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141058048 unmapped: 47022080 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:08.712591+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 ms_handle_reset con 0x55843d010400 session 0x55843a786b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 ms_handle_reset con 0x55843d3c5800 session 0x55843cf24960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:09.712741+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:10.712945+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1562206 data_alloc: 301989888 data_used: 25714688
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:11.713110+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 heartbeat osd_stat(store_statfs(0x1b7057000/0x0/0x1bfc00000, data 0x413f274/0x4236000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:12.713252+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:13.713424+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 heartbeat osd_stat(store_statfs(0x1b7057000/0x0/0x1bfc00000, data 0x413f274/0x4236000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:14.713591+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 15.047349930s of 15.488705635s, submitted: 105
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:15.719248+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 ms_handle_reset con 0x55843e973000 session 0x55843f3752c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 46997504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558440484c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1563812 data_alloc: 301989888 data_used: 25714688
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:16.719416+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 153 ms_handle_reset con 0x558440484c00 session 0x55843e759e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141099008 unmapped: 46981120 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:17.719573+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 153 ms_handle_reset con 0x558439f7b800 session 0x55843f374960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141099008 unmapped: 46981120 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:18.719743+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 153 heartbeat osd_stat(store_statfs(0x1b7052000/0x0/0x1bfc00000, data 0x4141632/0x423a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 ms_handle_reset con 0x55843d010400 session 0x55843cdd0960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141107200 unmapped: 46972928 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:19.719886+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 ms_handle_reset con 0x55843d3c5800 session 0x55843c994d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 141123584 unmapped: 46956544 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:20.720050+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558440485000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 ms_handle_reset con 0x55843e973000 session 0x55843f375a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 ms_handle_reset con 0x558440485000 session 0x558440867860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142417920 unmapped: 45662208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1569259 data_alloc: 301989888 data_used: 28872704
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:21.720180+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 heartbeat osd_stat(store_statfs(0x1b7050000/0x0/0x1bfc00000, data 0x4143a44/0x423e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142417920 unmapped: 45662208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:22.720322+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142417920 unmapped: 45662208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:23.720527+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142417920 unmapped: 45662208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:24.720778+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x558439f7b800 session 0x55843ca84f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142417920 unmapped: 45662208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:25.720926+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 heartbeat osd_stat(store_statfs(0x1b7049000/0x0/0x1bfc00000, data 0x4145d5a/0x4244000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142417920 unmapped: 45662208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:26.721092+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1578545 data_alloc: 301989888 data_used: 28884992
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.717106819s of 11.062205315s, submitted: 91
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x55843d010400 session 0x55843c652780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x55843d3c5800 session 0x55843efeeb40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x55843e973000 session 0x55843efef2c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558440485800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x558440485800 session 0x55843cdbcb40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 142508032 unmapped: 45572096 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:27.721254+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x55843d010400 session 0x55843cf24b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 heartbeat osd_stat(store_statfs(0x1b7047000/0x0/0x1bfc00000, data 0x4145dcd/0x4246000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x558439f7b800 session 0x55843cdaa5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x55843e973000 session 0x55843cdbc3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x55843d3c5800 session 0x55844102fe00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144359424 unmapped: 43720704 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:28.721406+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558440485800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x558440485800 session 0x55843cab8960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144392192 unmapped: 43687936 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:29.721553+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 ms_handle_reset con 0x558439f7b800 session 0x55843e7592c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144392192 unmapped: 43687936 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:30.721706+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 heartbeat osd_stat(store_statfs(0x1b7049000/0x0/0x1bfc00000, data 0x4145d5b/0x4244000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 155 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 157 ms_handle_reset con 0x55843d3c5800 session 0x55843eef25a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144441344 unmapped: 43638784 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:31.721866+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1602418 data_alloc: 301989888 data_used: 28897280
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 157 handle_osd_map epochs [157,158], i have 157, src has [1,158]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 158 ms_handle_reset con 0x55843e973000 session 0x55843eef2960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144449536 unmapped: 43630592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:32.721975+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 158 ms_handle_reset con 0x55843ca2ec00 session 0x55843eef3c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 159 ms_handle_reset con 0x558441c9f400 session 0x55843c90e5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144490496 unmapped: 43589632 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:33.722086+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 159 ms_handle_reset con 0x55843d010400 session 0x55843c995c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 159 ms_handle_reset con 0x558439f7b800 session 0x55843ca845a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 159 heartbeat osd_stat(store_statfs(0x1b703a000/0x0/0x1bfc00000, data 0x414c987/0x4253000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144490496 unmapped: 43589632 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:34.722259+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 159 ms_handle_reset con 0x55843ca2ec00 session 0x55843d0cdc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144515072 unmapped: 43565056 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:35.722408+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 160 heartbeat osd_stat(store_statfs(0x1b7032000/0x0/0x1bfc00000, data 0x4151057/0x425b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144531456 unmapped: 43548672 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:36.722583+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1613312 data_alloc: 301989888 data_used: 28921856
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.413414001s of 10.288631439s, submitted: 247
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 160 ms_handle_reset con 0x55843d3c5800 session 0x55843cdbda40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144531456 unmapped: 43548672 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:37.722724+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:38.722868+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144556032 unmapped: 43524096 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ef6e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 160 ms_handle_reset con 0x55843ef6e400 session 0x55843cddba40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ef6e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 160 ms_handle_reset con 0x55843ef6e400 session 0x55843efee1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 161 ms_handle_reset con 0x558441c9f400 session 0x55843c994d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 161 ms_handle_reset con 0x55843e973000 session 0x55843eef23c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:39.723076+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144580608 unmapped: 43499520 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 161 heartbeat osd_stat(store_statfs(0x1b702a000/0x0/0x1bfc00000, data 0x41534fa/0x4263000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 161 ms_handle_reset con 0x55843ca2ec00 session 0x55843d06fa40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 161 ms_handle_reset con 0x55843d010400 session 0x55843cf24960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:40.723217+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144556032 unmapped: 43524096 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 162 ms_handle_reset con 0x558439f7b800 session 0x55843efee5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 162 ms_handle_reset con 0x55843ca2ec00 session 0x55843fca4d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:41.723389+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144572416 unmapped: 43507712 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1620149 data_alloc: 301989888 data_used: 28934144
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 162 heartbeat osd_stat(store_statfs(0x1b702b000/0x0/0x1bfc00000, data 0x41557c5/0x4262000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 163 ms_handle_reset con 0x55843d010400 session 0x55843d5f1860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:42.723557+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144596992 unmapped: 43483136 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:43.723745+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144596992 unmapped: 43483136 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 164 ms_handle_reset con 0x55843e973000 session 0x55843fca50e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ef6e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:44.723899+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144613376 unmapped: 43466752 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 164 ms_handle_reset con 0x55843ef6e400 session 0x55843cdd05a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:45.724088+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144621568 unmapped: 43458560 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 165 handle_osd_map epochs [163,165], i have 165, src has [1,165]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 165 ms_handle_reset con 0x558439f7b800 session 0x55843d11fa40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 165 heartbeat osd_stat(store_statfs(0x1b7027000/0x0/0x1bfc00000, data 0x4159f3c/0x4266000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:46.724297+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144621568 unmapped: 43458560 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1629048 data_alloc: 301989888 data_used: 28934144
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 165 heartbeat osd_stat(store_statfs(0x1b7020000/0x0/0x1bfc00000, data 0x415c28a/0x426c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:47.724492+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144621568 unmapped: 43458560 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.180541039s of 10.788432121s, submitted: 178
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 166 ms_handle_reset con 0x55843d010400 session 0x55843d0aa5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:48.724662+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 144637952 unmapped: 43442176 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 166 heartbeat osd_stat(store_statfs(0x1b701d000/0x0/0x1bfc00000, data 0x415e680/0x4270000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 167 ms_handle_reset con 0x55843ca2ec00 session 0x55843cdd14a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:49.724794+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145694720 unmapped: 42385408 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 167 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:50.724941+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145170432 unmapped: 42909696 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:51.725125+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145170432 unmapped: 42909696 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1641945 data_alloc: 301989888 data_used: 28946432
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 169 ms_handle_reset con 0x55843e973000 session 0x55843c65ed20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:52.725252+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145178624 unmapped: 42901504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3c5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 169 handle_osd_map epochs [168,169], i have 169, src has [1,169]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 169 ms_handle_reset con 0x55843d3c5800 session 0x55843b959e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 169 ms_handle_reset con 0x558439f7b800 session 0x55843fca5680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 ms_handle_reset con 0x558441c9f400 session 0x55843d5f1e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:53.725401+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145186816 unmapped: 42893312 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d010400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 ms_handle_reset con 0x55843ca2ec00 session 0x55843cf243c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 ms_handle_reset con 0x55843e973000 session 0x558440867680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 170 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 171 ms_handle_reset con 0x55843d010400 session 0x558440867c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 171 heartbeat osd_stat(store_statfs(0x1b700e000/0x0/0x1bfc00000, data 0x4167530/0x427f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:54.725610+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145195008 unmapped: 42885120 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:55.725781+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 172 ms_handle_reset con 0x55843ca2ec00 session 0x55843ca93a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145195008 unmapped: 42885120 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 172 ms_handle_reset con 0x55843e973000 session 0x55843c65f0e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 173 ms_handle_reset con 0x558439f7b800 session 0x5584408672c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 173 ms_handle_reset con 0x55843ff12c00 session 0x55843f14d4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:56.725967+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145653760 unmapped: 42426368 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ac000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1784255 data_alloc: 301989888 data_used: 28962816
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 173 ms_handle_reset con 0x5584407ac000 session 0x55844102f860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 173 ms_handle_reset con 0x558441c9f400 session 0x55843ced9860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 173 ms_handle_reset con 0x55843ca2ec00 session 0x55843e6063c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:57.726108+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145711104 unmapped: 42369024 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 173 ms_handle_reset con 0x55843e973000 session 0x55843d11f680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.560883522s of 10.187833786s, submitted: 214
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 174 ms_handle_reset con 0x55843ff12c00 session 0x55843cd3d860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:58.726249+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145776640 unmapped: 42303488 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 175 ms_handle_reset con 0x55843ff12000 session 0x55843d5f0000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 175 ms_handle_reset con 0x55843e973000 session 0x55843c653a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:59.726399+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 145842176 unmapped: 42237952 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 175 ms_handle_reset con 0x55843ff12c00 session 0x55843c9934a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 175 heartbeat osd_stat(store_statfs(0x1b5077000/0x0/0x1bfc00000, data 0x60f18ab/0x6216000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 175 handle_osd_map epochs [175,176], i have 175, src has [1,176]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 176 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ac400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 177 handle_osd_map epochs [176,177], i have 177, src has [1,177]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:00.726548+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 147341312 unmapped: 40738816 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 177 ms_handle_reset con 0x558441c9f400 session 0x55843b9592c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407acc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 177 ms_handle_reset con 0x5584407acc00 session 0x55843d11fe00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:01.726731+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 147488768 unmapped: 40591360 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2310644 data_alloc: 301989888 data_used: 28962816
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 178 ms_handle_reset con 0x5584407ac400 session 0x55843ce1da40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 178 ms_handle_reset con 0x55843e973000 session 0x55843aa841e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 178 ms_handle_reset con 0x55843ca2ec00 session 0x55843b9594a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 178 ms_handle_reset con 0x55843ff12c00 session 0x55843cdbc780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:02.726933+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155205632 unmapped: 32874496 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 ms_handle_reset con 0x5584407ad000 session 0x55843d0cc960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ac400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 ms_handle_reset con 0x5584407ac400 session 0x55843f374f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:03.727124+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 147922944 unmapped: 40157184 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 ms_handle_reset con 0x55843ca2ec00 session 0x55843e758f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:04.727257+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 147980288 unmapped: 40099840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 ms_handle_reset con 0x55843e973000 session 0x55843e606960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 ms_handle_reset con 0x5584407ad000 session 0x55843a782780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 ms_handle_reset con 0x55843ff12c00 session 0x55843d0cc3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407acc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:05.727400+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 heartbeat osd_stat(store_statfs(0x1ad7b7000/0x0/0x1bfc00000, data 0xc805ef8/0xc937000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 148144128 unmapped: 39936000 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 ms_handle_reset con 0x558441c9f400 session 0x55843ca85a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 ms_handle_reset con 0x5584407ad800 session 0x55843d0c5c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 ms_handle_reset con 0x558441c9f400 session 0x55843eef3680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 ms_handle_reset con 0x5584407ad000 session 0x55843e759680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407adc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:06.727619+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 ms_handle_reset con 0x5584407adc00 session 0x55843cdbbc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 148553728 unmapped: 39526400 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2954900 data_alloc: 301989888 data_used: 28979200
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 ms_handle_reset con 0x55843ca2ec00 session 0x558440866780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 ms_handle_reset con 0x5584407ad400 session 0x55843ca84d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:07.727771+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 148701184 unmapped: 39378944 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 ms_handle_reset con 0x5584407ad800 session 0x55843cdbde00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 7.736338139s of 10.004708290s, submitted: 321
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 ms_handle_reset con 0x5584407acc00 session 0x55843cdbc1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 ms_handle_reset con 0x5584407ad000 session 0x5584408672c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:08.727941+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158269440 unmapped: 29810688 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407adc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 heartbeat osd_stat(store_statfs(0x1a7efa000/0x0/0x1bfc00000, data 0x120bf562/0x121f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 182 ms_handle_reset con 0x5584407adc00 session 0x55843c994b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407acc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:09.728087+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 150036480 unmapped: 38043648 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 182 ms_handle_reset con 0x5584407acc00 session 0x5584408665a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:10.728256+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158613504 unmapped: 29466624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 182 heartbeat osd_stat(store_statfs(0x1a67ab000/0x0/0x1bfc00000, data 0x1380c901/0x13941000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 182 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:11.728757+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158695424 unmapped: 29384704 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3496075 data_alloc: 301989888 data_used: 29003776
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 ms_handle_reset con 0x5584407ad000 session 0x55843a787860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 ms_handle_reset con 0x5584407ad800 session 0x558440867680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:12.728918+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 183 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 184 ms_handle_reset con 0x558441c9f400 session 0x558440867c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 184 ms_handle_reset con 0x5584407ad400 session 0x558440866b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 150519808 unmapped: 37560320 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407acc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 184 ms_handle_reset con 0x5584407acc00 session 0x55843b958f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:13.729093+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 184 ms_handle_reset con 0x5584407ad000 session 0x55843e606b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158998528 unmapped: 29081600 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 184 ms_handle_reset con 0x558441c9f400 session 0x55843ca85680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:14.729271+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166567936 unmapped: 21512192 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 185 ms_handle_reset con 0x5584407ad800 session 0x558440867a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 ms_handle_reset con 0x55843e973000 session 0x55843cdb92c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:15.729396+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 handle_osd_map epochs [185,186], i have 186, src has [1,186]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 160481280 unmapped: 27598848 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 ms_handle_reset con 0x55843ff12c00 session 0x55843efeef00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:16.729589+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 152109056 unmapped: 35971072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 heartbeat osd_stat(store_statfs(0x1a272a000/0x0/0x1bfc00000, data 0x1718b07f/0x172c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3826652 data_alloc: 301989888 data_used: 29011968
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 ms_handle_reset con 0x55843e973000 session 0x55843d5f0960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:17.729775+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 152190976 unmapped: 35889152 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 heartbeat osd_stat(store_statfs(0x1a272a000/0x0/0x1bfc00000, data 0x1718b07f/0x172c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 7.980698109s of 10.045507431s, submitted: 368
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:18.729972+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 153387008 unmapped: 34693120 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:19.730189+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 153559040 unmapped: 34521088 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:20.730358+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x5584407ad000 session 0x55843b959680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154583040 unmapped: 33497088 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 46
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:21.730505+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154370048 unmapped: 33710080 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4274426 data_alloc: 301989888 data_used: 29024256
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:22.730669+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x5584407ad800 session 0x55843cd390e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154411008 unmapped: 33669120 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:23.730843+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 heartbeat osd_stat(store_statfs(0x19e224000/0x0/0x1bfc00000, data 0x1b98d450/0x1bac9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154419200 unmapped: 33660928 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x558441c9f400 session 0x55843eef3e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:24.731003+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162889728 unmapped: 25190400 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:25.731189+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154624000 unmapped: 33456128 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:26.731415+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154624000 unmapped: 33456128 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4500520 data_alloc: 301989888 data_used: 29024256
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:27.731576+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154689536 unmapped: 33390592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.285512924s of 10.146043777s, submitted: 88
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x55843e973000 session 0x55843d06f4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:28.731704+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 154853376 unmapped: 33226752 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 heartbeat osd_stat(store_statfs(0x19ba23000/0x0/0x1bfc00000, data 0x1e18d54d/0x1e2cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x55843ff12c00 session 0x55843eef2d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:29.731859+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163405824 unmapped: 24674304 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x5584407ad000 session 0x55843e607680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x5584407ad800 session 0x558440866d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 heartbeat osd_stat(store_statfs(0x198ba0000/0x0/0x1bfc00000, data 0x21010514/0x2114e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x558441bf8000 session 0x55843cdb83c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x55843e973000 session 0x55843cdaaf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x55843ff12c00 session 0x55844084a000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:30.731998+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155541504 unmapped: 32538624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:31.732211+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x5584407ad000 session 0x55843b5aa000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164085760 unmapped: 23994368 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5156613 data_alloc: 301989888 data_used: 29028352
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 47
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x5584407ad800 session 0x55843cdaa5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:32.732349+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155811840 unmapped: 32268288 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:33.732513+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 heartbeat osd_stat(store_statfs(0x196b82000/0x0/0x1bfc00000, data 0x240104db/0x2414c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155926528 unmapped: 32153600 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x558441bf8000 session 0x55843cab83c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:34.732649+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 ms_handle_reset con 0x55843e973000 session 0x55843d11f4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164454400 unmapped: 23625728 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:35.732820+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 156180480 unmapped: 31899648 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 heartbeat osd_stat(store_statfs(0x193b85000/0x0/0x1bfc00000, data 0x270103cd/0x27149000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:36.733063+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 156188672 unmapped: 31891456 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5590023 data_alloc: 301989888 data_used: 29024256
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:37.733231+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 31817728 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.493808746s of 10.112232208s, submitted: 161
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:38.733383+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 heartbeat osd_stat(store_statfs(0x193385000/0x0/0x1bfc00000, data 0x278103cd/0x27949000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157360128 unmapped: 30720000 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 187 handle_osd_map epochs [187,188], i have 187, src has [1,188]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 188 ms_handle_reset con 0x55843ff12c00 session 0x55843ced94a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 188 ms_handle_reset con 0x558439f7b800 session 0x55843f14dc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:39.733539+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157368320 unmapped: 30711808 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 188 ms_handle_reset con 0x5584407ad000 session 0x55843e7583c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:40.733691+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157392896 unmapped: 30687232 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 188 ms_handle_reset con 0x558441bf8400 session 0x55843e606f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 handle_osd_map epochs [188,189], i have 189, src has [1,189]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:41.733826+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 ms_handle_reset con 0x558439f7b800 session 0x55843aa84780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155107328 unmapped: 32972800 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3032560 data_alloc: 301989888 data_used: 29040640
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 ms_handle_reset con 0x5584407ad800 session 0x55843cd38960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:42.734003+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155172864 unmapped: 32907264 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 ms_handle_reset con 0x5584407ad000 session 0x55843ca92f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 ms_handle_reset con 0x558441bf8800 session 0x55844084a5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:43.734200+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155172864 unmapped: 32907264 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:44.734328+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 heartbeat osd_stat(store_statfs(0x1b337c000/0x0/0x1bfc00000, data 0x4814b6d/0x4950000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155181056 unmapped: 32899072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 ms_handle_reset con 0x558441bf8c00 session 0x55843d0aad20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:45.734495+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155287552 unmapped: 32792576 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:46.734677+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155287552 unmapped: 32792576 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1929833 data_alloc: 301989888 data_used: 32190464
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 190 ms_handle_reset con 0x558439f7b800 session 0x55843d0cda40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:47.734816+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155295744 unmapped: 32784384 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.631053925s of 10.217371941s, submitted: 167
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 191 ms_handle_reset con 0x5584407ad000 session 0x55843ca84f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:48.734952+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155328512 unmapped: 32751616 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 192 ms_handle_reset con 0x5584407ad800 session 0x55843f14d0e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:49.735095+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155328512 unmapped: 32751616 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 192 heartbeat osd_stat(store_statfs(0x1b636b000/0x0/0x1bfc00000, data 0x481b8bc/0x4960000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 192 handle_osd_map epochs [192,193], i have 192, src has [1,193]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:50.735280+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 193 ms_handle_reset con 0x558441bf8800 session 0x55843cf245a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155369472 unmapped: 32710656 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf9000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:51.735418+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 194 ms_handle_reset con 0x558441bf9000 session 0x55844102e3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 194 heartbeat osd_stat(store_statfs(0x1b6369000/0x0/0x1bfc00000, data 0x481db60/0x4964000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155418624 unmapped: 32661504 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1951449 data_alloc: 301989888 data_used: 32202752
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 194 ms_handle_reset con 0x558439f7b800 session 0x55843ab32780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 195 ms_handle_reset con 0x5584407ad000 session 0x55844102e960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:52.735557+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155426816 unmapped: 32653312 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:53.735756+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 195 ms_handle_reset con 0x5584407ad800 session 0x55843efee000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155443200 unmapped: 32636928 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:54.735965+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 ms_handle_reset con 0x558441bf8800 session 0x55843cd381e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 155492352 unmapped: 32587776 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:55.736119+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157646848 unmapped: 30433280 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf9400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 ms_handle_reset con 0x558441bf9400 session 0x55843a787680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 heartbeat osd_stat(store_statfs(0x1b5f87000/0x0/0x1bfc00000, data 0x4bfc6d2/0x4d46000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 ms_handle_reset con 0x558439f7b800 session 0x55843d0aba40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:56.736293+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157671424 unmapped: 30408704 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1984749 data_alloc: 301989888 data_used: 32227328
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 ms_handle_reset con 0x5584407ad000 session 0x55843c993a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:57.736466+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157679616 unmapped: 30400512 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:58.736615+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.547788620s of 10.326631546s, submitted: 215
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 ms_handle_reset con 0x5584407ad800 session 0x55843c65f680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157720576 unmapped: 30359552 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 ms_handle_reset con 0x558441bf8800 session 0x55843c90e3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:59.736787+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157736960 unmapped: 30343168 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:00.736925+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 heartbeat osd_stat(store_statfs(0x1b5f84000/0x0/0x1bfc00000, data 0x4c025d5/0x4d4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 196 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157753344 unmapped: 30326784 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:01.737110+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157753344 unmapped: 30326784 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1992797 data_alloc: 301989888 data_used: 32243712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 197 heartbeat osd_stat(store_statfs(0x1b5f7f000/0x0/0x1bfc00000, data 0x4c04879/0x4d4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:02.737274+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157753344 unmapped: 30326784 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 197 heartbeat osd_stat(store_statfs(0x1b5f7f000/0x0/0x1bfc00000, data 0x4c04879/0x4d4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157753344 unmapped: 30326784 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157761536 unmapped: 30318592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157761536 unmapped: 30318592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:06.397128+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf9800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf9c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 197 ms_handle_reset con 0x558441bf9c00 session 0x55843cdbb680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157777920 unmapped: 30302208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1997295 data_alloc: 301989888 data_used: 32243712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:07.397295+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157777920 unmapped: 30302208 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:08.397472+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 198 ms_handle_reset con 0x5584407ad000 session 0x55843cd38780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x5584407ad800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157786112 unmapped: 30294016 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.258571625s of 10.460664749s, submitted: 73
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:09.397617+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain systemd-journald[47227]: Data hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.0 (53725 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Nov 28 10:18:34 np0005538513.localdomain systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 10:18:34 np0005538513.localdomain rsyslogd[759]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 199 ms_handle_reset con 0x5584407ad800 session 0x55843c90e780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 199 heartbeat osd_stat(store_statfs(0x1b5f75000/0x0/0x1bfc00000, data 0x4c07177/0x4d58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157827072 unmapped: 30253056 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:10.397756+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 200 ms_handle_reset con 0x55843d008800 session 0x55843d0c4d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441cafc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 200 ms_handle_reset con 0x558441cafc00 session 0x55843ced8960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157933568 unmapped: 30146560 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:11.397891+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 201 ms_handle_reset con 0x558441bf8800 session 0x55843f14d680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 157974528 unmapped: 30105600 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2020713 data_alloc: 301989888 data_used: 32256000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:12.398075+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 202 ms_handle_reset con 0x558441caf400 session 0x55844102e5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158154752 unmapped: 29925376 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:13.398356+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 202 ms_handle_reset con 0x55843d008800 session 0x55843cdd1a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158171136 unmapped: 29908992 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:14.398581+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 203 heartbeat osd_stat(store_statfs(0x1b5f62000/0x0/0x1bfc00000, data 0x4c101fc/0x4d69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d009400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 203 ms_handle_reset con 0x55843d009400 session 0x55843e6061e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158195712 unmapped: 29884416 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:15.426445+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441cafc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 204 ms_handle_reset con 0x558441cafc00 session 0x55843b5ab2c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 204 ms_handle_reset con 0x55843d008000 session 0x55843a782000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 204 ms_handle_reset con 0x558441bf8800 session 0x55843eef23c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d009400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158253056 unmapped: 29827072 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 204 ms_handle_reset con 0x55843d008800 session 0x55843c90eb40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:16.426604+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 205 ms_handle_reset con 0x55843d009400 session 0x55843cddaf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 205 heartbeat osd_stat(store_statfs(0x1b5f5a000/0x0/0x1bfc00000, data 0x4c14f58/0x4d72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158277632 unmapped: 29802496 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2039635 data_alloc: 301989888 data_used: 32256000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:17.426798+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 205 handle_osd_map epochs [205,206], i have 205, src has [1,206]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 206 ms_handle_reset con 0x558441caf400 session 0x55843cdba5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158310400 unmapped: 29769728 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:18.426958+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441cafc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 206 ms_handle_reset con 0x558441cafc00 session 0x55843cdb81e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158343168 unmapped: 29736960 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:19.427100+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.609278679s of 10.412595749s, submitted: 270
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 207 ms_handle_reset con 0x55843d008800 session 0x55843cdb8780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 207 handle_osd_map epochs [206,207], i have 207, src has [1,207]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158400512 unmapped: 29679616 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:20.427270+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d009400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 207 ms_handle_reset con 0x558441bf8800 session 0x55843ced9860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158457856 unmapped: 29622272 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 207 handle_osd_map epochs [207,208], i have 207, src has [1,208]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:21.427408+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 208 ms_handle_reset con 0x558441caf400 session 0x55843ce1d0e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441671c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 208 ms_handle_reset con 0x558441671c00 session 0x558440866b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158547968 unmapped: 29532160 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2044754 data_alloc: 301989888 data_used: 32268288
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 208 heartbeat osd_stat(store_statfs(0x1b5f4f000/0x0/0x1bfc00000, data 0x4c1dc5a/0x4d7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:22.427550+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 209 ms_handle_reset con 0x55843d009400 session 0x55843a7821e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158556160 unmapped: 29523968 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:23.427727+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 209 heartbeat osd_stat(store_statfs(0x1b5f4a000/0x0/0x1bfc00000, data 0x4c2007c/0x4d82000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 209 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 209 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 209 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 209 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158564352 unmapped: 29515776 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:24.427893+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158605312 unmapped: 29474816 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:25.428072+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158605312 unmapped: 29474816 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:26.428216+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 211 ms_handle_reset con 0x558439f7b800 session 0x55843cd3c960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 159727616 unmapped: 28352512 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2055467 data_alloc: 301989888 data_used: 32268288
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:27.428425+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 212 heartbeat osd_stat(store_statfs(0x1b5f40000/0x0/0x1bfc00000, data 0x4c26ca0/0x4d8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [0,0,2])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 212 ms_handle_reset con 0x55843e973000 session 0x55843ced90e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 212 ms_handle_reset con 0x55843ff12c00 session 0x55843eef34a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 159752192 unmapped: 28327936 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:28.428561+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 212 ms_handle_reset con 0x55843d008800 session 0x55843c994d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 159162368 unmapped: 28917760 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:29.428707+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 159162368 unmapped: 28917760 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:30.428850+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.821386337s of 11.409869194s, submitted: 232
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 159162368 unmapped: 28917760 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:31.429052+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441671c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 213 ms_handle_reset con 0x558441671c00 session 0x55843d0cd4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158859264 unmapped: 29220864 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1959174 data_alloc: 301989888 data_used: 29114368
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:32.429203+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 213 heartbeat osd_stat(store_statfs(0x1b6752000/0x0/0x1bfc00000, data 0x41c5ca6/0x432a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558439f7b800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158859264 unmapped: 29220864 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:33.429340+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 213 ms_handle_reset con 0x55843d008800 session 0x55843fca50e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 213 ms_handle_reset con 0x55843e973000 session 0x55843ca93680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 158900224 unmapped: 29179904 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:34.429475+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 213 handle_osd_map epochs [213,214], i have 213, src has [1,214]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 213 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 214 ms_handle_reset con 0x558441bf8800 session 0x55843ca84960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161021952 unmapped: 27058176 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:35.429687+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 215 heartbeat osd_stat(store_statfs(0x1b53f4000/0x0/0x1bfc00000, data 0x41cca8f/0x4339000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161030144 unmapped: 27049984 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:36.429828+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 215 ms_handle_reset con 0x558441caf400 session 0x55843c992f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 215 handle_osd_map epochs [215,216], i have 215, src has [1,216]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 216 ms_handle_reset con 0x55843ff12c00 session 0x55843cdbcf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 216 ms_handle_reset con 0x55843fb86000 session 0x55844102fa40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161218560 unmapped: 26861568 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1973883 data_alloc: 301989888 data_used: 29126656
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 216 ms_handle_reset con 0x55843ff12c00 session 0x55843d5f0960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:37.430001+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161218560 unmapped: 26861568 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:38.430181+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161218560 unmapped: 26861568 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:39.430328+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161218560 unmapped: 26861568 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:40.430491+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161218560 unmapped: 26861568 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:41.430634+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 216 handle_osd_map epochs [217,218], i have 216, src has [1,218]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.949913979s of 10.501282692s, submitted: 225
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 218 heartbeat osd_stat(store_statfs(0x1b53f6000/0x0/0x1bfc00000, data 0x41cee82/0x4338000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161292288 unmapped: 26787840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1978499 data_alloc: 301989888 data_used: 29138944
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:42.430838+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 218 heartbeat osd_stat(store_statfs(0x1b53ed000/0x0/0x1bfc00000, data 0x41d35e1/0x4340000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161292288 unmapped: 26787840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:43.431094+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161292288 unmapped: 26787840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:44.431291+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 218 heartbeat osd_stat(store_statfs(0x1b53ee000/0x0/0x1bfc00000, data 0x41d3546/0x433f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161292288 unmapped: 26787840 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:45.431465+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161030144 unmapped: 27049984 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:46.431614+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161038336 unmapped: 27041792 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1991387 data_alloc: 301989888 data_used: 29138944
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:47.431799+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161038336 unmapped: 27041792 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:48.431979+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161030144 unmapped: 27049984 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:49.432141+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 220 ms_handle_reset con 0x55843d008800 session 0x55843a786f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 220 ms_handle_reset con 0x55843e973000 session 0x55843a786b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161030144 unmapped: 27049984 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:50.432289+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 220 ms_handle_reset con 0x558441bf8800 session 0x55843a787680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 220 heartbeat osd_stat(store_statfs(0x1b53e6000/0x0/0x1bfc00000, data 0x41d7f12/0x4348000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161030144 unmapped: 27049984 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:51.432493+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 220 handle_osd_map epochs [220,221], i have 220, src has [1,221]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.923654556s of 10.138095856s, submitted: 174
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x558441bf8800 session 0x55843cdaa1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843d008800 session 0x55843cdaaf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161046528 unmapped: 27033600 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1996813 data_alloc: 301989888 data_used: 29151232
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:52.432666+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843e973000 session 0x55843cdab860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843fb86000 session 0x55843cf241e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843ff12c00 session 0x55843ca85680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162390016 unmapped: 25690112 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:53.432829+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843ff12c00 session 0x55843ca84d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843d008800 session 0x55843ca85a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843e973000 session 0x55843ca84f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161390592 unmapped: 26689536 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:54.432939+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 48
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x55843fb86000 session 0x55843cdbda40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 heartbeat osd_stat(store_statfs(0x1b4dbf000/0x0/0x1bfc00000, data 0x47fb3bb/0x496f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441bf8800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 ms_handle_reset con 0x558441bf8800 session 0x55843d0c5680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161579008 unmapped: 26501120 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:55.433082+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 heartbeat osd_stat(store_statfs(0x1b4dbf000/0x0/0x1bfc00000, data 0x47fb3bb/0x496f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:56.433172+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161587200 unmapped: 26492928 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 ms_handle_reset con 0x55843ff12c00 session 0x55843a7861e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 ms_handle_reset con 0x558441caf400 session 0x55843cdbab40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 ms_handle_reset con 0x55843fef5800 session 0x55843ca854a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:57.433341+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161619968 unmapped: 26460160 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2012976 data_alloc: 301989888 data_used: 29163520
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 ms_handle_reset con 0x55843fef5000 session 0x55843cdb8b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 ms_handle_reset con 0x55843fef4000 session 0x55843ca843c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 ms_handle_reset con 0x55843fef5000 session 0x55843e607a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:58.433511+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161636352 unmapped: 26443776 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 heartbeat osd_stat(store_statfs(0x1b53dd000/0x0/0x1bfc00000, data 0x41dc4c6/0x4350000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:59.433757+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161652736 unmapped: 26427392 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 ms_handle_reset con 0x55843fef5800 session 0x55843e759c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 ms_handle_reset con 0x55843ff12c00 session 0x55843ce1da40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:00.433933+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161669120 unmapped: 26411008 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 ms_handle_reset con 0x558441caf400 session 0x55843d11e960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 ms_handle_reset con 0x55843fef4400 session 0x55843d5f0000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 ms_handle_reset con 0x55843fef4800 session 0x55843cdbc000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:01.434090+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161685504 unmapped: 26394624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.358844757s of 10.016320229s, submitted: 164
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 ms_handle_reset con 0x55843fef5800 session 0x55843cd3c780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 heartbeat osd_stat(store_statfs(0x1b53d9000/0x0/0x1bfc00000, data 0x41de832/0x4354000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:02.434294+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 223 handle_osd_map epochs [223,224], i have 223, src has [1,224]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161701888 unmapped: 26378240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2031272 data_alloc: 301989888 data_used: 29163520
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 ms_handle_reset con 0x55843fef5000 session 0x55843cdaa3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 ms_handle_reset con 0x55843ff12c00 session 0x55843efef2c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 ms_handle_reset con 0x55843fef4c00 session 0x55843cd3cb40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 ms_handle_reset con 0x55843fef4800 session 0x55843d06fa40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:03.434463+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161775616 unmapped: 26304512 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 ms_handle_reset con 0x55843fef5800 session 0x55843d0abe00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 225 ms_handle_reset con 0x558441caf400 session 0x55843d06e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 225 ms_handle_reset con 0x55843ff12c00 session 0x55843cd381e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 225 heartbeat osd_stat(store_statfs(0x1b53cb000/0x0/0x1bfc00000, data 0x41e31b6/0x4361000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:04.434638+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161808384 unmapped: 26271744 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 225 handle_osd_map epochs [225,226], i have 225, src has [1,226]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 226 ms_handle_reset con 0x55843fef5000 session 0x55843b5112c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:05.434799+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161865728 unmapped: 26214400 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 226 ms_handle_reset con 0x55843fef5800 session 0x55843ab332c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 226 handle_osd_map epochs [226,227], i have 226, src has [1,227]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 ms_handle_reset con 0x55843fef4800 session 0x55843c65e960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 ms_handle_reset con 0x55843ff12c00 session 0x55843eef3680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:06.434990+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161939456 unmapped: 26140672 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 ms_handle_reset con 0x558441caf400 session 0x55843ca85680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb13000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 ms_handle_reset con 0x55843fb13000 session 0x55843b5abc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:07.435218+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 161964032 unmapped: 26116096 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2040395 data_alloc: 301989888 data_used: 29196288
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 ms_handle_reset con 0x55843fef4800 session 0x55843c90f860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 ms_handle_reset con 0x55843fef5800 session 0x55843cd392c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 227 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 228 ms_handle_reset con 0x55843cf49c00 session 0x55843ca845a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 228 ms_handle_reset con 0x55843ff12c00 session 0x55844084a780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:08.435383+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162070528 unmapped: 26009600 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441caf400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:09.435588+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162078720 unmapped: 26001408 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 228 heartbeat osd_stat(store_statfs(0x1b53c4000/0x0/0x1bfc00000, data 0x41e9d20/0x436a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.3 total, 600.0 interval
                                                          Cumulative writes: 16K writes, 61K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 16K writes, 5217 syncs, 3.12 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 10K writes, 36K keys, 10K commit groups, 1.0 writes per commit group, ingest: 23.73 MB, 0.04 MB/s
                                                          Interval WAL: 10K writes, 4356 syncs, 2.39 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 229 ms_handle_reset con 0x558441caf400 session 0x55843cdb8b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 229 ms_handle_reset con 0x55843cf49c00 session 0x55843cdaa000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:10.435731+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162160640 unmapped: 25919488 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 230 ms_handle_reset con 0x55843fef4800 session 0x55843cdaaf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:11.435876+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162226176 unmapped: 25853952 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.239541054s of 10.173316002s, submitted: 253
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:12.436044+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162250752 unmapped: 25829376 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2054993 data_alloc: 301989888 data_used: 29208576
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 231 handle_osd_map epochs [231,232], i have 231, src has [1,232]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 232 ms_handle_reset con 0x55843fef5800 session 0x55843d5f0d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:13.436184+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162250752 unmapped: 25829376 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 232 ms_handle_reset con 0x55843ff12c00 session 0x55843b959680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb13800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 232 heartbeat osd_stat(store_statfs(0x1b53af000/0x0/0x1bfc00000, data 0x41f2dce/0x437d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:14.436384+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162308096 unmapped: 25772032 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 233 ms_handle_reset con 0x55843fb13800 session 0x55843b959860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:15.436521+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162324480 unmapped: 25755648 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 233 handle_osd_map epochs [233,234], i have 233, src has [1,234]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 234 ms_handle_reset con 0x55843cf49c00 session 0x55843c994f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:16.436737+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162381824 unmapped: 25698304 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 235 ms_handle_reset con 0x55843fef4800 session 0x55843fca4f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:17.436956+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162439168 unmapped: 25640960 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2072721 data_alloc: 301989888 data_used: 29208576
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 236 ms_handle_reset con 0x55843fef5800 session 0x55843fca54a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:18.437147+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162422784 unmapped: 25657344 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 236 heartbeat osd_stat(store_statfs(0x1b53a2000/0x0/0x1bfc00000, data 0x41fbc9a/0x438c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 236 ms_handle_reset con 0x55843ff12c00 session 0x55843ce1da40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e732400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:19.437335+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 162471936 unmapped: 25608192 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 237 ms_handle_reset con 0x55843e732400 session 0x55843d06f4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 237 ms_handle_reset con 0x558441c9f800 session 0x55843aa84b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 237 ms_handle_reset con 0x55843cf49c00 session 0x55843b9594a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:20.437466+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163659776 unmapped: 24420352 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 237 ms_handle_reset con 0x55843fef4800 session 0x55843ce1c000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 237 ms_handle_reset con 0x55843fef5800 session 0x55844084b860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:21.437605+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 237 handle_osd_map epochs [237,238], i have 237, src has [1,238]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163725312 unmapped: 24354816 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 ms_handle_reset con 0x55843ff12c00 session 0x55843a7823c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 heartbeat osd_stat(store_statfs(0x1b53a2000/0x0/0x1bfc00000, data 0x41fdf5d/0x438c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:22.437763+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163733504 unmapped: 24346624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2072412 data_alloc: 301989888 data_used: 29229056
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.115608215s of 10.785024643s, submitted: 230
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 ms_handle_reset con 0x55843cf49c00 session 0x55843b9fbc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:23.437972+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163733504 unmapped: 24346624 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 ms_handle_reset con 0x55843fef4800 session 0x55843d06ed20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 ms_handle_reset con 0x55843fef5800 session 0x55843c995e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:24.438148+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163741696 unmapped: 24338432 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:25.438349+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163741696 unmapped: 24338432 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 ms_handle_reset con 0x55843ff12c00 session 0x55843ced9e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:26.438524+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163749888 unmapped: 24330240 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:27.438686+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163774464 unmapped: 24305664 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2078144 data_alloc: 301989888 data_used: 29241344
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 ms_handle_reset con 0x558441c9f800 session 0x55843d06f4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 heartbeat osd_stat(store_statfs(0x1b5398000/0x0/0x1bfc00000, data 0x420258c/0x4395000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:28.438879+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163774464 unmapped: 24305664 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 ms_handle_reset con 0x55843cf49c00 session 0x55844102ed20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:29.439142+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163774464 unmapped: 24305664 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 ms_handle_reset con 0x55843fef4800 session 0x55843cdbc000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 ms_handle_reset con 0x55843fef5800 session 0x55843aa84b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:30.439281+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163774464 unmapped: 24305664 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b5396000/0x0/0x1bfc00000, data 0x4202627/0x4398000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 240 ms_handle_reset con 0x55843ff12c00 session 0x55843cdbcf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9fc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:31.439437+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163815424 unmapped: 24264704 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 240 ms_handle_reset con 0x558441c9e000 session 0x55843eef21e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 240 ms_handle_reset con 0x558441c9fc00 session 0x55843cdb8b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 240 ms_handle_reset con 0x55843cf49c00 session 0x55843eef3e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:32.439648+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 240 ms_handle_reset con 0x55843fef5800 session 0x55843c65fc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163905536 unmapped: 24174592 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2097085 data_alloc: 301989888 data_used: 29253632
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 241 ms_handle_reset con 0x55843ff12c00 session 0x55843c65f860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.533895493s of 10.007489204s, submitted: 154
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 241 ms_handle_reset con 0x558441c9e800 session 0x55843cdb85a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 241 ms_handle_reset con 0x558441c9e400 session 0x55843c65ef00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:33.439865+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163954688 unmapped: 24125440 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 242 ms_handle_reset con 0x558441c9e800 session 0x55843cdab680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 242 ms_handle_reset con 0x55843cf49c00 session 0x55843ca92000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 242 ms_handle_reset con 0x55843fef4800 session 0x55843ca85a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 242 ms_handle_reset con 0x55843fef5800 session 0x55843d5f0d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:34.440094+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163971072 unmapped: 24109056 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:35.440276+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 163971072 unmapped: 24109056 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b5388000/0x0/0x1bfc00000, data 0x420960b/0x43a3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 243 ms_handle_reset con 0x55843cf49c00 session 0x55843cdda1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:36.440439+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164028416 unmapped: 24051712 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:37.440632+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164036608 unmapped: 24043520 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2102397 data_alloc: 301989888 data_used: 29261824
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 ms_handle_reset con 0x55843fef4800 session 0x55843cd38d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 ms_handle_reset con 0x558441c9e400 session 0x55843ce1da40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:38.440805+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164134912 unmapped: 23945216 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:39.440978+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164134912 unmapped: 23945216 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 ms_handle_reset con 0x558441c9e800 session 0x55843a783680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:40.441170+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ff12c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 ms_handle_reset con 0x55843ff12c00 session 0x55843cdbc780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164134912 unmapped: 23945216 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b5380000/0x0/0x1bfc00000, data 0x420fb4d/0x43ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 ms_handle_reset con 0x55843cf49c00 session 0x55843b3b05a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:41.441438+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164143104 unmapped: 23937024 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b5382000/0x0/0x1bfc00000, data 0x420fbaf/0x43ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 245 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:42.441643+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 164167680 unmapped: 23912448 heap: 188080128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2111921 data_alloc: 301989888 data_used: 29274112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 ms_handle_reset con 0x558441c9e400 session 0x55843cddb860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 ms_handle_reset con 0x55843fef4800 session 0x55843cf25e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9fc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.531710625s of 10.001019478s, submitted: 155
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 ms_handle_reset con 0x558441c9fc00 session 0x55843ca92780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 ms_handle_reset con 0x558441c9e800 session 0x55843efee960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:43.441814+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165781504 unmapped: 25976832 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 ms_handle_reset con 0x55843cf49c00 session 0x55843cdab4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 ms_handle_reset con 0x55843fef4800 session 0x55843b958b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:44.442089+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165781504 unmapped: 25976832 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 247 ms_handle_reset con 0x558441c9e400 session 0x55843cdbb2c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:45.442269+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 247 handle_osd_map epochs [247,248], i have 247, src has [1,248]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165675008 unmapped: 26083328 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9fc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 248 ms_handle_reset con 0x558441c9fc00 session 0x55843b958f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9ec00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b45c2000/0x0/0x1bfc00000, data 0x4fc9346/0x516b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 248 ms_handle_reset con 0x558441c9ec00 session 0x55843fca45a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:46.442427+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 248 handle_osd_map epochs [248,249], i have 248, src has [1,249]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165683200 unmapped: 26075136 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 249 ms_handle_reset con 0x55843cf49c00 session 0x55843d11fc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 249 ms_handle_reset con 0x55843fef4800 session 0x55843c995c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:47.442666+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165683200 unmapped: 26075136 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2238409 data_alloc: 301989888 data_used: 29274112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 249 ms_handle_reset con 0x558441c9e400 session 0x55843d11eb40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:48.442864+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165691392 unmapped: 26066944 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9fc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 249 ms_handle_reset con 0x558441c9fc00 session 0x55843ca92780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:49.443062+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165707776 unmapped: 26050560 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 250 ms_handle_reset con 0x558441670800 session 0x55843ca934a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 250 ms_handle_reset con 0x558441c9f000 session 0x55843cf25e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843cf49c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:50.443256+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 250 ms_handle_reset con 0x558441670000 session 0x55843ca92f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165715968 unmapped: 26042368 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 250 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 251 ms_handle_reset con 0x55843cf49c00 session 0x55843cdbc780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 251 ms_handle_reset con 0x558441670400 session 0x55843ca921e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:51.443480+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165789696 unmapped: 25968640 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 252 ms_handle_reset con 0x55843fef5400 session 0x55843cdab680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b41ae000/0x0/0x1bfc00000, data 0x4fd243c/0x517f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 252 ms_handle_reset con 0x558441670000 session 0x55843ca93c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 252 ms_handle_reset con 0x558441670c00 session 0x55843ca92b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:52.443696+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 252 ms_handle_reset con 0x55843fef4800 session 0x55843b959e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165756928 unmapped: 26001408 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2262338 data_alloc: 301989888 data_used: 29298688
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.905896187s of 10.033369064s, submitted: 311
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 253 ms_handle_reset con 0x558441670400 session 0x55843c65f860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:53.443909+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165781504 unmapped: 25976832 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 253 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 ms_handle_reset con 0x558441c9e400 session 0x55843fca54a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 ms_handle_reset con 0x55843fef5400 session 0x55843cdbcf00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 heartbeat osd_stat(store_statfs(0x1b41a2000/0x0/0x1bfc00000, data 0x4fd8f70/0x518b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 ms_handle_reset con 0x55843fef4800 session 0x55843cdaa960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:54.444102+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165838848 unmapped: 25919488 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 ms_handle_reset con 0x558441670400 session 0x55843b9fbe00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 ms_handle_reset con 0x558441670000 session 0x55843f14c3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:55.444291+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165691392 unmapped: 26066944 heap: 191758336 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9fc00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:56.444457+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441c9f000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 254 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169836544 unmapped: 34521088 heap: 204357632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 255 ms_handle_reset con 0x558441c9f000 session 0x55843e606000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:57.444672+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174047232 unmapped: 30310400 heap: 204357632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2718222 data_alloc: 301989888 data_used: 29298688
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 255 ms_handle_reset con 0x55843fef4800 session 0x55843aa85c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:58.444801+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169869312 unmapped: 34488320 heap: 204357632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 255 ms_handle_reset con 0x55843fef5400 session 0x55843c65e5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 255 heartbeat osd_stat(store_statfs(0x1ac19e000/0x0/0x1bfc00000, data 0xcfdb261/0xd190000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:59.444957+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165715968 unmapped: 42844160 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:00.445101+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 255 ms_handle_reset con 0x558441670000 session 0x55843cab9a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165814272 unmapped: 42745856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:01.445250+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 255 ms_handle_reset con 0x558441670400 session 0x55843c9952c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174276608 unmapped: 34283520 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:02.445417+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 ms_handle_reset con 0x55843ca2e400 session 0x55843efee000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165904384 unmapped: 42655744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4240810 data_alloc: 301989888 data_used: 29310976
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 7.835624218s of 10.028038025s, submitted: 223
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 heartbeat osd_stat(store_statfs(0x1a459b000/0x0/0x1bfc00000, data 0x14bdd51d/0x14d92000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:03.445569+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 ms_handle_reset con 0x55843ca2e400 session 0x55843cd3d860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 170123264 unmapped: 38436864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:04.445741+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165961728 unmapped: 42598400 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:05.445909+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165961728 unmapped: 42598400 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 ms_handle_reset con 0x55843d008800 session 0x55843f375c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:06.446080+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843fef4800 session 0x5584408663c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165601280 unmapped: 42958848 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 49
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:07.446295+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 heartbeat osd_stat(store_statfs(0x199d93000/0x0/0x1bfc00000, data 0x1f3df9bf/0x1f59a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5318427 data_alloc: 301989888 data_used: 29323264
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174178304 unmapped: 34381824 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:08.446460+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843fb86000 session 0x55843cd38b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x558441c9fc00 session 0x55843cf24b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x558441670c00 session 0x55844102ed20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 165838848 unmapped: 42721280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843ca2e400 session 0x55843ced9a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843d008800 session 0x55844084b2c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843fb86000 session 0x55843e606f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:09.446626+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166936576 unmapped: 41623552 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843fef4800 session 0x55843d06f0e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843e973000 session 0x55843c65f860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:10.446783+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843fef4800 session 0x55843e606960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843ca2e400 session 0x55843f14d680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166871040 unmapped: 41689088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 ms_handle_reset con 0x55843fb86000 session 0x55843ca92f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 258 ms_handle_reset con 0x55843d008800 session 0x55843a786d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:11.446925+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 258 ms_handle_reset con 0x55843ca2e400 session 0x55843ca934a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166895616 unmapped: 41664512 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 258 ms_handle_reset con 0x55843e973000 session 0x55843ca92780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 258 heartbeat osd_stat(store_statfs(0x1b4192000/0x0/0x1bfc00000, data 0x4fe1ee1/0x519c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:12.447086+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 259 ms_handle_reset con 0x55843d008800 session 0x55843c993680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2424784 data_alloc: 301989888 data_used: 29347840
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166912000 unmapped: 41648128 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:13.447255+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166912000 unmapped: 41648128 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 259 ms_handle_reset con 0x55843fef4800 session 0x55843c995c20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.652847290s of 11.161242485s, submitted: 549
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 259 ms_handle_reset con 0x55843fb86000 session 0x55843cdd1860
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 260 ms_handle_reset con 0x558441670c00 session 0x55843d11fc20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:14.447402+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166264832 unmapped: 42295296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 260 ms_handle_reset con 0x55843fb86000 session 0x55843e607e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 260 ms_handle_reset con 0x55843ca2e400 session 0x55843cdda000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:15.447558+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 260 heartbeat osd_stat(store_statfs(0x1b4188000/0x0/0x1bfc00000, data 0x4fe6787/0x51a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166264832 unmapped: 42295296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 260 heartbeat osd_stat(store_statfs(0x1b4186000/0x0/0x1bfc00000, data 0x4fe67f9/0x51a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:16.447715+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 260 handle_osd_map epochs [260,261], i have 260, src has [1,261]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843d008800 session 0x55843ced8d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166264832 unmapped: 42295296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843e973000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843e973000 session 0x55843ce66f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:17.447883+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843ca2e400 session 0x55843cf25680
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2438151 data_alloc: 301989888 data_used: 29360128
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166289408 unmapped: 42270720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843fb86000 session 0x55843d06e780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843d008800 session 0x55843b9fa5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843fef4800 session 0x55843ced92c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x558441670c00 session 0x55843eef3a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:18.448064+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843ca2e400 session 0x55843c65e1e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 heartbeat osd_stat(store_statfs(0x1b3c48000/0x0/0x1bfc00000, data 0x5521ca5/0x56e6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843d008800 session 0x55843d0cc5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166633472 unmapped: 41926656 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843fb86000 session 0x55843eef2d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:19.448257+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166666240 unmapped: 41893888 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843fef4800 session 0x55843fca54a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:20.448419+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef5400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 ms_handle_reset con 0x55843fef5400 session 0x55843b958f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 262 ms_handle_reset con 0x558441670c00 session 0x5584408665a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166690816 unmapped: 41869312 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b3c49000/0x0/0x1bfc00000, data 0x5521be1/0x56e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 262 ms_handle_reset con 0x55843ca2e400 session 0x55843c995a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 262 ms_handle_reset con 0x55843fb86000 session 0x55843f375e00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 262 ms_handle_reset con 0x55843d008800 session 0x55843ca843c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:21.448544+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 ms_handle_reset con 0x55843fef4800 session 0x55843cdab4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166821888 unmapped: 41738240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 ms_handle_reset con 0x55843ca2e400 session 0x55843c993a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:22.448703+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 heartbeat osd_stat(store_statfs(0x1b3c40000/0x0/0x1bfc00000, data 0x5526397/0x56ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 ms_handle_reset con 0x55843d008800 session 0x55843d06e5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2499590 data_alloc: 301989888 data_used: 29384704
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166838272 unmapped: 41721856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 heartbeat osd_stat(store_statfs(0x1b3c40000/0x0/0x1bfc00000, data 0x5526397/0x56ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 ms_handle_reset con 0x55843fb86000 session 0x55843ca84000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:23.448901+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166846464 unmapped: 41713664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:24.449078+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.642466545s of 10.705989838s, submitted: 295
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 264 ms_handle_reset con 0x55843fef4800 session 0x55843c9954a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166846464 unmapped: 41713664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:25.449241+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166846464 unmapped: 41713664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 264 heartbeat osd_stat(store_statfs(0x1b4174000/0x0/0x1bfc00000, data 0x4fef85e/0x51b9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 264 ms_handle_reset con 0x558441670000 session 0x55843d0cc5a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 264 ms_handle_reset con 0x558441670c00 session 0x55843ab33a40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:26.449381+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 264 handle_osd_map epochs [264,265], i have 264, src has [1,265]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166846464 unmapped: 41713664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 265 ms_handle_reset con 0x55843ca2e400 session 0x55843ca930e0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:27.449574+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 265 ms_handle_reset con 0x55843d008800 session 0x55843d06e000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2470464 data_alloc: 301989888 data_used: 29388800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166936576 unmapped: 41623552 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 265 heartbeat osd_stat(store_statfs(0x1b416f000/0x0/0x1bfc00000, data 0x4ff1b74/0x51bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 265 ms_handle_reset con 0x55843fb86000 session 0x55843fca4b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:28.449725+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 50
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 265 ms_handle_reset con 0x55843fef4800 session 0x55843d06f4a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166928384 unmapped: 41631744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:29.449886+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fef4800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 ms_handle_reset con 0x55843fef4800 session 0x55843eef32c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2e400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 ms_handle_reset con 0x55843ca2e400 session 0x55843cdaa3c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166928384 unmapped: 41631744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 heartbeat osd_stat(store_statfs(0x1b416f000/0x0/0x1bfc00000, data 0x4ff1dbc/0x51bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d008800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:30.450053+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670400
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 ms_handle_reset con 0x558441670400 session 0x55843a786d20
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2f000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166928384 unmapped: 41631744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 ms_handle_reset con 0x55843ca2f000 session 0x55843ca92f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:31.450210+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166928384 unmapped: 41631744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:32.450370+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2476158 data_alloc: 301989888 data_used: 29401088
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 heartbeat osd_stat(store_statfs(0x1b4169000/0x0/0x1bfc00000, data 0x4ff42ee/0x51c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166928384 unmapped: 41631744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:33.450550+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166928384 unmapped: 41631744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:34.450743+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166928384 unmapped: 41631744 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.196878433s of 10.512495995s, submitted: 100
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:35.450892+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166952960 unmapped: 41607168 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:36.451106+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166952960 unmapped: 41607168 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:37.451309+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2481304 data_alloc: 301989888 data_used: 29442048
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166961152 unmapped: 41598976 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 267 heartbeat osd_stat(store_statfs(0x1b4169000/0x0/0x1bfc00000, data 0x4ff620b/0x51c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:38.451514+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166961152 unmapped: 41598976 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:39.451663+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166961152 unmapped: 41598976 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:40.451821+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 41590784 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:41.452093+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 41590784 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 267 heartbeat osd_stat(store_statfs(0x1b4166000/0x0/0x1bfc00000, data 0x4ff6373/0x51c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:42.452263+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 267 heartbeat osd_stat(store_statfs(0x1b4166000/0x0/0x1bfc00000, data 0x4ff62d8/0x51c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2484300 data_alloc: 301989888 data_used: 29442048
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 41590784 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:43.452444+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168026112 unmapped: 40534016 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:44.452643+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168026112 unmapped: 40534016 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.830991745s of 10.000524521s, submitted: 52
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:45.452816+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168042496 unmapped: 40517632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:46.453009+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168042496 unmapped: 40517632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:47.453303+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2485218 data_alloc: 301989888 data_used: 29454336
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168042496 unmapped: 40517632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 268 heartbeat osd_stat(store_statfs(0x1b4168000/0x0/0x1bfc00000, data 0x4ff8434/0x51c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:48.453481+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168042496 unmapped: 40517632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:49.453659+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168042496 unmapped: 40517632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:50.453863+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168042496 unmapped: 40517632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:51.453995+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168042496 unmapped: 40517632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:52.454198+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 269 handle_osd_map epochs [269,270], i have 269, src has [1,270]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 270 heartbeat osd_stat(store_statfs(0x1b4168000/0x0/0x1bfc00000, data 0x4ff850b/0x51c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495060 data_alloc: 301989888 data_used: 29466624
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:53.454375+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:54.454550+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 270 heartbeat osd_stat(store_statfs(0x1b415e000/0x0/0x1bfc00000, data 0x4ffcaa0/0x51cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.716396332s of 10.000543594s, submitted: 100
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:55.454751+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 270 heartbeat osd_stat(store_statfs(0x1b415e000/0x0/0x1bfc00000, data 0x4ffcaa0/0x51cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:56.454894+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:57.455109+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2489282 data_alloc: 301989888 data_used: 29466624
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:58.455284+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 270 heartbeat osd_stat(store_statfs(0x1b4163000/0x0/0x1bfc00000, data 0x4ffca05/0x51cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:59.455525+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168067072 unmapped: 40493056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:00.455714+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:01.455887+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 270 handle_osd_map epochs [270,271], i have 270, src has [1,271]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b4162000/0x0/0x1bfc00000, data 0x4ffcaa0/0x51cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:02.456114+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2497148 data_alloc: 301989888 data_used: 29478912
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:03.456329+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:04.456476+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.917531967s of 10.001461983s, submitted: 28
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:05.456635+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:06.456838+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:07.457070+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b415d000/0x0/0x1bfc00000, data 0x4ffee0d/0x51d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495788 data_alloc: 301989888 data_used: 29478912
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:08.457273+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:09.457476+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b415d000/0x0/0x1bfc00000, data 0x4ffee0d/0x51d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 40484864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:10.457676+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168083456 unmapped: 40476672 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:11.457851+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168091648 unmapped: 40468480 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:12.457997+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168091648 unmapped: 40468480 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2498810 data_alloc: 301989888 data_used: 29478912
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:13.458198+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168091648 unmapped: 40468480 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b415d000/0x0/0x1bfc00000, data 0x4ffeddb/0x51d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:14.458383+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168091648 unmapped: 40468480 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.924475670s of 10.004346848s, submitted: 16
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:15.458551+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168091648 unmapped: 40468480 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:16.458725+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168099840 unmapped: 40460288 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:17.458922+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168099840 unmapped: 40460288 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2501128 data_alloc: 301989888 data_used: 29478912
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:18.459092+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168132608 unmapped: 40427520 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b4159000/0x0/0x1bfc00000, data 0x4ffefda/0x51d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:19.459267+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168132608 unmapped: 40427520 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:20.459622+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168132608 unmapped: 40427520 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:21.459770+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:22.459963+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b415a000/0x0/0x1bfc00000, data 0x4ffefd8/0x51d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2503622 data_alloc: 301989888 data_used: 29478912
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:23.460131+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b415a000/0x0/0x1bfc00000, data 0x4ffef11/0x51d3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:24.460362+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b415a000/0x0/0x1bfc00000, data 0x4ffefac/0x51d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.938243866s of 10.001822472s, submitted: 14
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:25.460554+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:26.460790+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:27.460984+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2505764 data_alloc: 301989888 data_used: 29478912
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:28.461140+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 40419328 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:29.461293+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168165376 unmapped: 40394752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b4156000/0x0/0x1bfc00000, data 0x4fff218/0x51d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:30.461421+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168165376 unmapped: 40394752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:31.461586+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168329216 unmapped: 40230912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:32.461757+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168755200 unmapped: 39804928 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2530774 data_alloc: 301989888 data_used: 29491200
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:33.461925+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 168755200 unmapped: 39804928 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:34.462101+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 272 heartbeat osd_stat(store_statfs(0x1b40fa000/0x0/0x1bfc00000, data 0x505ab27/0x5233000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169107456 unmapped: 39452672 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.760598183s of 10.000513077s, submitted: 72
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:35.462268+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169533440 unmapped: 39026688 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:36.462384+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 272 handle_osd_map epochs [272,273], i have 272, src has [1,273]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169598976 unmapped: 38961152 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:37.462565+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169238528 unmapped: 39321600 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2537744 data_alloc: 301989888 data_used: 29515776
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:38.462716+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169500672 unmapped: 39059456 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b404c000/0x0/0x1bfc00000, data 0x51076f4/0x52e2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:39.462851+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b3feb000/0x0/0x1bfc00000, data 0x5165507/0x5340000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 169648128 unmapped: 38912000 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:40.463009+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 170336256 unmapped: 38223872 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:41.463228+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b3fd5000/0x0/0x1bfc00000, data 0x517dc3c/0x5359000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 170344448 unmapped: 38215680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:42.463375+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b3f65000/0x0/0x1bfc00000, data 0x51ecd97/0x53c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172531712 unmapped: 36028416 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2558766 data_alloc: 301989888 data_used: 29515776
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:43.463543+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172924928 unmapped: 35635200 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:44.463816+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 171679744 unmapped: 36880384 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.478956223s of 10.000002861s, submitted: 118
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:45.463991+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 170967040 unmapped: 37593088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:46.464175+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172417024 unmapped: 36143104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b3ac9000/0x0/0x1bfc00000, data 0x528837f/0x5464000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:47.464400+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172433408 unmapped: 36126720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2563596 data_alloc: 301989888 data_used: 29515776
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:48.464584+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172539904 unmapped: 36020224 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:49.464753+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172859392 unmapped: 35700736 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:50.464902+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172908544 unmapped: 35651584 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:51.465059+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 172965888 unmapped: 35594240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:52.465197+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b3a3b000/0x0/0x1bfc00000, data 0x5318ffb/0x54f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 173776896 unmapped: 34783232 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2583772 data_alloc: 301989888 data_used: 29515776
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b39e0000/0x0/0x1bfc00000, data 0x537093b/0x554d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:53.465343+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 173957120 unmapped: 34603008 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:54.465502+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174071808 unmapped: 34488320 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.521921158s of 10.001026154s, submitted: 112
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:55.465673+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174219264 unmapped: 34340864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:56.465822+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174219264 unmapped: 34340864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:57.466094+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174219264 unmapped: 34340864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2585058 data_alloc: 301989888 data_used: 29515776
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:58.466315+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b394f000/0x0/0x1bfc00000, data 0x540376d/0x55de000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174219264 unmapped: 34340864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:59.466474+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174227456 unmapped: 34332672 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:00.466613+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174252032 unmapped: 34308096 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 273 handle_osd_map epochs [273,274], i have 273, src has [1,274]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:01.466792+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174260224 unmapped: 34299904 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:02.466955+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174268416 unmapped: 34291712 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2585892 data_alloc: 301989888 data_used: 29528064
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b394a000/0x0/0x1bfc00000, data 0x5405bf1/0x55e2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:03.467113+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174268416 unmapped: 34291712 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:04.467250+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174276608 unmapped: 34283520 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:05.467408+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b394a000/0x0/0x1bfc00000, data 0x5405b2a/0x55e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.182142258s of 10.486183167s, submitted: 83
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174292992 unmapped: 34267136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:06.467590+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174301184 unmapped: 34258944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 51
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:07.467847+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174309376 unmapped: 34250752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586382 data_alloc: 301989888 data_used: 29528064
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b3949000/0x0/0x1bfc00000, data 0x5405b9d/0x55e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:08.468045+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174309376 unmapped: 34250752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:09.468189+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174309376 unmapped: 34250752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:10.468368+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174309376 unmapped: 34250752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:11.468531+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174309376 unmapped: 34250752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:12.468673+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174333952 unmapped: 34226176 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2591570 data_alloc: 301989888 data_used: 29540352
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:13.468800+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 275 heartbeat osd_stat(store_statfs(0x1b3947000/0x0/0x1bfc00000, data 0x5407d9d/0x55e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174333952 unmapped: 34226176 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:14.468983+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 174333952 unmapped: 34226176 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:15.469165+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.882104874s of 10.118027687s, submitted: 52
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 175382528 unmapped: 33177600 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:16.469330+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176431104 unmapped: 32129024 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:17.469494+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b3946000/0x0/0x1bfc00000, data 0x540a113/0x55e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176431104 unmapped: 32129024 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2595864 data_alloc: 301989888 data_used: 29552640
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:18.469659+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176431104 unmapped: 32129024 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:19.469859+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b393c000/0x0/0x1bfc00000, data 0x54127c9/0x55f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176472064 unmapped: 32088064 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:20.470063+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176791552 unmapped: 31768576 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:21.470206+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176799744 unmapped: 31760384 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:22.470343+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 177758208 unmapped: 30801920 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2623084 data_alloc: 301989888 data_used: 29564928
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:23.470527+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 277 heartbeat osd_stat(store_statfs(0x1b384d000/0x0/0x1bfc00000, data 0x54fe73e/0x56df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 177758208 unmapped: 30801920 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:24.470677+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 177782784 unmapped: 30777344 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:25.470860+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176848896 unmapped: 31711232 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.005203247s of 10.441389084s, submitted: 137
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:26.471050+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176799744 unmapped: 31760384 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:27.471216+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 277 heartbeat osd_stat(store_statfs(0x1b37cb000/0x0/0x1bfc00000, data 0x5583660/0x5763000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 176898048 unmapped: 31662080 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2635068 data_alloc: 301989888 data_used: 29564928
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:28.471459+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 277 heartbeat osd_stat(store_statfs(0x1b379e000/0x0/0x1bfc00000, data 0x55aec01/0x5790000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 177086464 unmapped: 31473664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:29.471617+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 177160192 unmapped: 31399936 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:30.471787+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 277 heartbeat osd_stat(store_statfs(0x1b3765000/0x0/0x1bfc00000, data 0x55ebe31/0x57c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 178397184 unmapped: 30162944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:31.471937+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 178552832 unmapped: 30007296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:32.472101+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 278 handle_osd_map epochs [278,278], i have 278, src has [1,278]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 178561024 unmapped: 29999104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2643898 data_alloc: 301989888 data_used: 29577216
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:33.472248+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 278 heartbeat osd_stat(store_statfs(0x1b36d4000/0x0/0x1bfc00000, data 0x5677e39/0x5858000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 178561024 unmapped: 29999104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:34.472398+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 178241536 unmapped: 30318592 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:35.472590+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 278 handle_osd_map epochs [278,279], i have 278, src has [1,279]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843ca2f800
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 279 ms_handle_reset con 0x55843ca2f800 session 0x55843cf24b40
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 279 ms_handle_reset con 0x55843d008800 session 0x55843c994780
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 178798592 unmapped: 29761536 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x558441670c00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:36.472727+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.883527756s of 10.519721985s, submitted: 415
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 179855360 unmapped: 28704768 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 280 ms_handle_reset con 0x558441670c00 session 0x55843e606f00
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 52
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:37.472915+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 180207616 unmapped: 28352512 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2664662 data_alloc: 301989888 data_used: 29597696
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:38.473111+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 180207616 unmapped: 28352512 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:39.473242+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 280 heartbeat osd_stat(store_statfs(0x1b3622000/0x0/0x1bfc00000, data 0x5727c21/0x590c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 180264960 unmapped: 28295168 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:40.473377+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 280 handle_osd_map epochs [281,281], i have 280, src has [1,281]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 180609024 unmapped: 27951104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:41.473515+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 281 handle_osd_map epochs [281,281], i have 281, src has [1,281]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 281 handle_osd_map epochs [282,282], i have 281, src has [1,282]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 180641792 unmapped: 27918336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:42.473654+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 282 heartbeat osd_stat(store_statfs(0x1b3571000/0x0/0x1bfc00000, data 0x57d4613/0x59bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [0,0,0,3])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 181690368 unmapped: 26869760 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2689330 data_alloc: 301989888 data_used: 29609984
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:43.473888+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 181010432 unmapped: 27549696 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:44.474087+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 181100544 unmapped: 27459584 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:45.474300+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 282 handle_osd_map epochs [283,283], i have 282, src has [1,283]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 181207040 unmapped: 27353088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:46.474473+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.664692879s of 10.302494049s, submitted: 216
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 181215232 unmapped: 27344896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:47.474703+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 181215232 unmapped: 27344896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2697602 data_alloc: 301989888 data_used: 29634560
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:48.474875+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b34c1000/0x0/0x1bfc00000, data 0x58837be/0x5a6c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 181215232 unmapped: 27344896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:49.475112+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182304768 unmapped: 26255360 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:50.475254+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182386688 unmapped: 26173440 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:51.475419+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 284 handle_osd_map epochs [285,285], i have 284, src has [1,285]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182386688 unmapped: 26173440 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:52.475586+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 285 handle_osd_map epochs [285,286], i have 285, src has [1,286]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182804480 unmapped: 25755648 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2714154 data_alloc: 301989888 data_used: 29646848
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:53.475770+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182804480 unmapped: 25755648 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:54.475915+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 286 heartbeat osd_stat(store_statfs(0x1b2236000/0x0/0x1bfc00000, data 0x596aa8c/0x5b57000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182804480 unmapped: 25755648 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:55.476087+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182149120 unmapped: 26411008 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:56.476262+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182149120 unmapped: 26411008 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:57.476516+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.328048706s of 10.630928993s, submitted: 119
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 182272000 unmapped: 26288128 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2718842 data_alloc: 301989888 data_used: 29646848
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:58.476721+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 183697408 unmapped: 24862720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 286 heartbeat osd_stat(store_statfs(0x1b21a3000/0x0/0x1bfc00000, data 0x59fd496/0x5beb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:59.476874+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 183795712 unmapped: 24764416 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:00.477106+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 184131584 unmapped: 24428544 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 286 handle_osd_map epochs [287,287], i have 286, src has [1,287]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:01.477286+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 183115776 unmapped: 25444352 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 287 handle_osd_map epochs [288,288], i have 287, src has [1,288]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:02.477541+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 288 handle_osd_map epochs [287,288], i have 288, src has [1,288]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 288 handle_osd_map epochs [287,288], i have 288, src has [1,288]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 183115776 unmapped: 25444352 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2739926 data_alloc: 301989888 data_used: 29659136
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:03.477694+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 184164352 unmapped: 24395776 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:04.477860+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 184532992 unmapped: 24027136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 288 heartbeat osd_stat(store_statfs(0x1b20d1000/0x0/0x1bfc00000, data 0x5ac8889/0x5cbb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:05.478119+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 288 handle_osd_map epochs [289,289], i have 288, src has [1,289]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 288 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 289 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 185860096 unmapped: 22700032 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:06.478264+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 185958400 unmapped: 22601728 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:07.478493+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.673652649s of 10.199068069s, submitted: 171
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 185942016 unmapped: 22618112 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2749780 data_alloc: 301989888 data_used: 29671424
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:08.478664+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 187031552 unmapped: 21528576 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:09.478908+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 187146240 unmapped: 21413888 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:10.479136+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 289 heartbeat osd_stat(store_statfs(0x1b1f87000/0x0/0x1bfc00000, data 0x5c15832/0x5e07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [0,0,1])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 186556416 unmapped: 22003712 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:11.479317+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 186556416 unmapped: 22003712 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 289 handle_osd_map epochs [290,290], i have 289, src has [1,290]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 handle_osd_map epochs [290,290], i have 290, src has [1,290]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:12.479436+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 handle_osd_map epochs [290,290], i have 290, src has [1,290]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 186564608 unmapped: 21995520 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2771120 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:13.479630+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 187121664 unmapped: 21438464 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:14.479830+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 186916864 unmapped: 21643264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:15.479997+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 189022208 unmapped: 19537920 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:16.480208+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0d26000/0x0/0x1bfc00000, data 0x5cd537a/0x5ec8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 189333504 unmapped: 19226624 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:17.480393+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0d26000/0x0/0x1bfc00000, data 0x5cd537a/0x5ec8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0d26000/0x0/0x1bfc00000, data 0x5cd537a/0x5ec8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 189464576 unmapped: 19095552 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2769102 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:18.480525+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 189464576 unmapped: 19095552 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:19.480712+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.462128639s of 11.739068985s, submitted: 70
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 188506112 unmapped: 20054016 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:20.480824+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 188506112 unmapped: 20054016 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:21.480994+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0c98000/0x0/0x1bfc00000, data 0x5d63ba6/0x5f56000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 188678144 unmapped: 19881984 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:22.481200+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 188899328 unmapped: 19660800 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2781174 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:23.481340+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 188899328 unmapped: 19660800 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:24.481507+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 18423808 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:25.481649+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 190373888 unmapped: 18186240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:26.481784+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0bcf000/0x0/0x1bfc00000, data 0x5e2a7fa/0x601e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 189579264 unmapped: 18980864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:27.481954+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 189661184 unmapped: 18898944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2792914 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:28.482134+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0bc9000/0x0/0x1bfc00000, data 0x5e30a3a/0x6024000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 189849600 unmapped: 18710528 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:29.482305+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.066974640s of 10.319569588s, submitted: 54
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 190021632 unmapped: 18538496 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:30.482432+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 190021632 unmapped: 18538496 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:31.482599+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 18399232 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:32.482796+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191225856 unmapped: 17334272 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2805368 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:33.482941+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191225856 unmapped: 17334272 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0ab7000/0x0/0x1bfc00000, data 0x5f42bbd/0x6136000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:34.483107+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191242240 unmapped: 17317888 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:35.483255+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191242240 unmapped: 17317888 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:36.483419+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0ab2000/0x0/0x1bfc00000, data 0x5f48d91/0x613c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191373312 unmapped: 17186816 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:37.483622+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191627264 unmapped: 16932864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2805932 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:38.483752+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191627264 unmapped: 16932864 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:39.483916+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0a75000/0x0/0x1bfc00000, data 0x5f85dad/0x6179000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.019208908s of 10.242109299s, submitted: 53
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191815680 unmapped: 16744448 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:40.484090+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0a45000/0x0/0x1bfc00000, data 0x5fb58cf/0x61a9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191717376 unmapped: 16842752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:41.484256+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191717376 unmapped: 16842752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:42.484423+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191717376 unmapped: 16842752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2812330 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:43.484581+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191971328 unmapped: 16588800 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:44.484738+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191971328 unmapped: 16588800 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b09fc000/0x0/0x1bfc00000, data 0x5fff3cd/0x61f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:45.484911+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 191971328 unmapped: 16588800 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:46.485083+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192094208 unmapped: 16465920 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:47.485256+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192094208 unmapped: 16465920 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:48.485455+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2818450 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192094208 unmapped: 16465920 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:49.485653+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b09a1000/0x0/0x1bfc00000, data 0x605b1a8/0x624d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.829882622s of 10.004420280s, submitted: 35
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192315392 unmapped: 16244736 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:50.485812+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192323584 unmapped: 16236544 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:51.485983+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192323584 unmapped: 16236544 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:52.486172+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b093f000/0x0/0x1bfc00000, data 0x60bd015/0x62af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192626688 unmapped: 15933440 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:53.486355+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b093f000/0x0/0x1bfc00000, data 0x60bd015/0x62af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2825402 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 192626688 unmapped: 15933440 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:54.486523+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193372160 unmapped: 15187968 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:55.486712+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193372160 unmapped: 15187968 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:56.487086+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0917000/0x0/0x1bfc00000, data 0x60e495a/0x62d7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193372160 unmapped: 15187968 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:57.487278+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0917000/0x0/0x1bfc00000, data 0x60e495a/0x62d7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193372160 unmapped: 15187968 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:58.487444+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2826446 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0917000/0x0/0x1bfc00000, data 0x60e495a/0x62d7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193380352 unmapped: 15179776 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:59.487620+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.830650330s of 10.005453110s, submitted: 33
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193380352 unmapped: 15179776 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:00.487831+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193380352 unmapped: 15179776 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:01.488052+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193486848 unmapped: 15073280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:02.488214+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193593344 unmapped: 14966784 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:03.488373+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2834942 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193593344 unmapped: 14966784 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:04.488533+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b085f000/0x0/0x1bfc00000, data 0x619a9a1/0x638e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 193863680 unmapped: 14696448 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:05.488696+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b085f000/0x0/0x1bfc00000, data 0x619a9a1/0x638e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194658304 unmapped: 13901824 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:06.489152+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194658304 unmapped: 13901824 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:07.489403+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194928640 unmapped: 13631488 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:08.489571+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2837594 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194928640 unmapped: 13631488 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:09.489748+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194977792 unmapped: 13582336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:10.489930+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194977792 unmapped: 13582336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:11.490069+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b07c6000/0x0/0x1bfc00000, data 0x6234b7a/0x6428000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194977792 unmapped: 13582336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:12.490249+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.272965431s of 12.466406822s, submitted: 32
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194871296 unmapped: 13688832 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:13.490399+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2837760 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194871296 unmapped: 13688832 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:14.490594+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194961408 unmapped: 13598720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:15.490763+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 194961408 unmapped: 13598720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:16.490904+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0769000/0x0/0x1bfc00000, data 0x62928a0/0x6485000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:17.491110+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:18.491253+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2841512 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:19.491390+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:20.491534+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:21.491671+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0769000/0x0/0x1bfc00000, data 0x6292d4e/0x6485000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:22.491862+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:23.492004+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2841528 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0769000/0x0/0x1bfc00000, data 0x6292d4e/0x6485000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:24.492206+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.475637436s of 12.526000977s, submitted: 13
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195108864 unmapped: 13451264 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:25.492344+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195035136 unmapped: 13524992 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:26.492501+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0742000/0x0/0x1bfc00000, data 0x62b93e9/0x64ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195035136 unmapped: 13524992 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:27.492698+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195035136 unmapped: 13524992 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:28.492822+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2842204 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195035136 unmapped: 13524992 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:29.493008+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195035136 unmapped: 13524992 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:30.493207+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195035136 unmapped: 13524992 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:31.493385+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:32.493545+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195035136 unmapped: 13524992 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b071a000/0x0/0x1bfc00000, data 0x62e1a93/0x64d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:33.493695+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195174400 unmapped: 13385728 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2849628 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b071a000/0x0/0x1bfc00000, data 0x62e1a93/0x64d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:34.493897+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195174400 unmapped: 13385728 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.887657166s of 10.005189896s, submitted: 22
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:35.494124+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195330048 unmapped: 13230080 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:36.494348+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195330048 unmapped: 13230080 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b06a8000/0x0/0x1bfc00000, data 0x63539b5/0x6546000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:37.494680+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195338240 unmapped: 13221888 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:38.494865+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195338240 unmapped: 13221888 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2852476 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b068d000/0x0/0x1bfc00000, data 0x636e69b/0x6561000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:39.495011+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195338240 unmapped: 13221888 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b068d000/0x0/0x1bfc00000, data 0x636e69b/0x6561000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:40.495229+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195493888 unmapped: 13066240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:41.495451+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195493888 unmapped: 13066240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:42.495613+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195493888 unmapped: 13066240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:43.495847+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 195493888 unmapped: 13066240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2851236 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b062e000/0x0/0x1bfc00000, data 0x63ca6e5/0x65bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:44.496091+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 196542464 unmapped: 12017664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.882203102s of 10.000601768s, submitted: 24
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:45.496278+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197599232 unmapped: 10960896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:46.496475+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197599232 unmapped: 10960896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:47.496729+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197894144 unmapped: 10665984 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0605000/0x0/0x1bfc00000, data 0x63f3886/0x65e8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:48.496893+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197894144 unmapped: 10665984 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2864632 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:49.497079+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197894144 unmapped: 10665984 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:50.497243+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197607424 unmapped: 10952704 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x64293ae/0x661d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:51.497428+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197607424 unmapped: 10952704 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x64293ae/0x661d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:52.497600+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197607424 unmapped: 10952704 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:53.497779+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197738496 unmapped: 10821632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2864400 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:54.497961+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197738496 unmapped: 10821632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.919629097s of 10.000417709s, submitted: 19
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:55.498104+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197738496 unmapped: 10821632 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x64293ae/0x661d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:56.498285+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197804032 unmapped: 10756096 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:57.498496+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197804032 unmapped: 10756096 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:58.498674+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197804032 unmapped: 10756096 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2860558 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:59.498877+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197804032 unmapped: 10756096 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d2000/0x0/0x1bfc00000, data 0x64292e7/0x661c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:00.499097+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197812224 unmapped: 10747904 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:01.499305+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197812224 unmapped: 10747904 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:02.499464+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197812224 unmapped: 10747904 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x64293b0/0x661d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:03.499663+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2862454 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:04.499802+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198893568 unmapped: 9666560 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.956855774s of 10.001013756s, submitted: 10
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:05.499980+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:06.500170+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x642944b/0x661e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:07.500410+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:08.500640+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2863918 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:09.500823+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x642944b/0x661e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:10.501064+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:11.501258+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:12.501440+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:13.501583+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2863918 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x6429449/0x661e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:14.501769+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197844992 unmapped: 10715136 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:15.501971+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d0000/0x0/0x1bfc00000, data 0x6429449/0x661e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:16.502146+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.996944427s of 12.021103859s, submitted: 4
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:17.502353+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:18.502522+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2863052 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:19.502687+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:20.502833+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d2000/0x0/0x1bfc00000, data 0x64292e7/0x661c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:21.503078+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d2000/0x0/0x1bfc00000, data 0x64292e7/0x661c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:22.503273+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:23.503427+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:24.503577+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197853184 unmapped: 10706944 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:25.503764+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:26.503948+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:27.504192+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:28.504380+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:29.504583+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.940068245s of 12.971845627s, submitted: 7
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:30.504717+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:31.504918+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:32.505106+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:33.505280+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:34.505441+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:35.505604+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:36.505796+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:37.505979+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:38.506163+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:39.506335+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:40.506523+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:41.506721+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:42.506855+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:43.506996+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:44.507152+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:45.507332+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:46.507475+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:47.507639+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:48.507800+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:49.508003+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 19.994880676s of 20.000654221s, submitted: 1
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:50.508216+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:51.508386+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:52.508530+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:53.508681+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:54.508857+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:55.509082+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:56.509262+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:57.509437+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d3000/0x0/0x1bfc00000, data 0x64292b7/0x661b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:58.509620+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861672 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:59.509814+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.981195450s of 10.005669594s, submitted: 4
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:00.509989+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:01.510176+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:02.510347+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01d2000/0x0/0x1bfc00000, data 0x6429352/0x661c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:03.510538+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2863440 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:04.510740+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:05.510918+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:06.511118+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b01ba000/0x0/0x1bfc00000, data 0x644143c/0x6634000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:07.511344+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:08.511534+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2866386 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:09.511658+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.949810982s of 10.003954887s, submitted: 10
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:10.511813+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:11.511981+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:12.512165+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b014a000/0x0/0x1bfc00000, data 0x64b1a82/0x66a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:13.512352+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2872634 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:14.512525+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:15.512696+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b012c000/0x0/0x1bfc00000, data 0x64cfbcd/0x66c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:16.512827+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:17.513047+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:18.513206+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2871980 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:19.513374+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197861376 unmapped: 10698752 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.943881035s of 10.001965523s, submitted: 14
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:20.513588+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197869568 unmapped: 10690560 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b012c000/0x0/0x1bfc00000, data 0x64cfbcd/0x66c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:21.513741+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197869568 unmapped: 10690560 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:22.513936+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 197869568 unmapped: 10690560 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0117000/0x0/0x1bfc00000, data 0x64e4581/0x66d7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:23.514097+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198926336 unmapped: 9633792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2874460 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:24.514243+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198926336 unmapped: 9633792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:25.514428+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198926336 unmapped: 9633792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b00fd000/0x0/0x1bfc00000, data 0x64ff267/0x66f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:26.514550+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198926336 unmapped: 9633792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:27.514746+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198926336 unmapped: 9633792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:28.514896+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198926336 unmapped: 9633792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2874500 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:29.515042+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198926336 unmapped: 9633792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.916554451s of 10.002207756s, submitted: 19
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:30.515206+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198942720 unmapped: 9617408 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0095000/0x0/0x1bfc00000, data 0x6565877/0x6759000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:31.515377+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198942720 unmapped: 9617408 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 heartbeat osd_stat(store_statfs(0x1b0095000/0x0/0x1bfc00000, data 0x6565877/0x6759000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:32.515521+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198942720 unmapped: 9617408 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:33.515704+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199180288 unmapped: 9379840 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2882008 data_alloc: 301989888 data_used: 29683712
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:34.515857+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199180288 unmapped: 9379840 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:35.515994+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198311936 unmapped: 10248192 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:36.516146+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198311936 unmapped: 10248192 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 290 handle_osd_map epochs [291,291], i have 290, src has [1,291]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 291 heartbeat osd_stat(store_statfs(0x1b004a000/0x0/0x1bfc00000, data 0x65b1c48/0x67a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:37.516372+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 291 heartbeat osd_stat(store_statfs(0x1b004a000/0x0/0x1bfc00000, data 0x65b1c48/0x67a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198311936 unmapped: 10248192 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:38.516562+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198311936 unmapped: 10248192 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2885344 data_alloc: 301989888 data_used: 29696000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:39.516768+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198311936 unmapped: 10248192 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.903023720s of 10.003490448s, submitted: 47
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:40.516923+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 291 heartbeat osd_stat(store_statfs(0x1b0045000/0x0/0x1bfc00000, data 0x65b406f/0x67a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198311936 unmapped: 10248192 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:41.517118+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198139904 unmapped: 10420224 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:42.517317+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198139904 unmapped: 10420224 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:43.517521+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198148096 unmapped: 10412032 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2889100 data_alloc: 301989888 data_used: 29696000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:44.517718+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198148096 unmapped: 10412032 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:45.517902+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 291 heartbeat osd_stat(store_statfs(0x1afffe000/0x0/0x1bfc00000, data 0x65fbc02/0x67f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198148096 unmapped: 10412032 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:46.518090+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198156288 unmapped: 10403840 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:47.518342+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198156288 unmapped: 10403840 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _renew_subs
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 291 handle_osd_map epochs [292,292], i have 291, src has [1,292]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:48.518513+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198156288 unmapped: 10403840 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2892826 data_alloc: 301989888 data_used: 29708288
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:49.518716+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198156288 unmapped: 10403840 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 292 handle_osd_map epochs [292,292], i have 292, src has [1,292]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.911927223s of 10.000244141s, submitted: 32
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:50.518884+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199385088 unmapped: 9175040 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 292 heartbeat osd_stat(store_statfs(0x1aff96000/0x0/0x1bfc00000, data 0x6661478/0x6858000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:51.519101+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199385088 unmapped: 9175040 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:52.519293+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199385088 unmapped: 9175040 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:53.519497+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199802880 unmapped: 8757248 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2900306 data_alloc: 301989888 data_used: 29712384
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:54.519727+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199802880 unmapped: 8757248 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 292 heartbeat osd_stat(store_statfs(0x1aff4d000/0x0/0x1bfc00000, data 0x66aad96/0x68a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:55.519935+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199802880 unmapped: 8757248 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 292 handle_osd_map epochs [292,293], i have 292, src has [1,293]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:56.520096+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199802880 unmapped: 8757248 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:57.520279+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199802880 unmapped: 8757248 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:58.520450+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 heartbeat osd_stat(store_statfs(0x1aff48000/0x0/0x1bfc00000, data 0x66ad1bd/0x68a5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199802880 unmapped: 8757248 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2903076 data_alloc: 301989888 data_used: 29724672
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 heartbeat osd_stat(store_statfs(0x1aff48000/0x0/0x1bfc00000, data 0x66ad1bd/0x68a5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:59.520644+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199802880 unmapped: 8757248 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.838012695s of 10.007084846s, submitted: 48
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:00.520860+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199942144 unmapped: 8617984 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:01.521011+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199950336 unmapped: 8609792 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:02.521225+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 200089600 unmapped: 8470528 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 heartbeat osd_stat(store_statfs(0x1aff09000/0x0/0x1bfc00000, data 0x66ed992/0x68e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:03.521397+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 200089600 unmapped: 8470528 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2907864 data_alloc: 301989888 data_used: 29724672
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 heartbeat osd_stat(store_statfs(0x1aff09000/0x0/0x1bfc00000, data 0x66ed992/0x68e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:04.521599+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 200097792 unmapped: 8462336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:05.521770+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 200097792 unmapped: 8462336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:06.521970+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 200097792 unmapped: 8462336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 heartbeat osd_stat(store_statfs(0x1aff09000/0x0/0x1bfc00000, data 0x66ed992/0x68e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 handle_osd_map epochs [294,294], i have 293, src has [1,294]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 handle_osd_map epochs [294,294], i have 294, src has [1,294]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 293 handle_osd_map epochs [294,294], i have 294, src has [1,294]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:07.522150+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198516736 unmapped: 10043392 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:08.522331+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:09.522506+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:10.522698+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:11.522850+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:12.522983+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:13.523126+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 ms_handle_reset con 0x55843d3de000 session 0x55843e6072c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843fb86000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:14.523323+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 ms_handle_reset con 0x55843d3df800 session 0x55843f3743c0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: handle_auth_request added challenge on 0x55843d3de000
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:15.523498+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198565888 unmapped: 9994240 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:16.523656+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:17.523893+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:18.524118+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:19.524308+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:20.524499+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:21.524687+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:22.524864+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:23.525096+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198574080 unmapped: 9986048 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:24.525302+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:25.525457+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:26.525624+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:27.525870+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:28.526010+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:29.526176+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:30.526382+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:31.526588+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198582272 unmapped: 9977856 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:32.526736+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198590464 unmapped: 9969664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:33.526901+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198590464 unmapped: 9969664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:34.527088+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198590464 unmapped: 9969664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:35.527302+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198590464 unmapped: 9969664 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:36.527475+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198598656 unmapped: 9961472 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:37.527682+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198598656 unmapped: 9961472 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:38.527892+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198598656 unmapped: 9961472 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:39.528102+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198598656 unmapped: 9961472 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:40.528272+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:41.528440+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:42.528605+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:43.528806+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:44.528979+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:45.529175+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:46.529367+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:47.529568+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198606848 unmapped: 9953280 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:48.529725+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:49.529905+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:50.530127+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:51.530297+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:52.530422+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:53.530760+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:54.530948+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:55.531153+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198615040 unmapped: 9945088 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:56.531346+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198623232 unmapped: 9936896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:57.531590+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198623232 unmapped: 9936896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:58.531773+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198623232 unmapped: 9936896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:59.532006+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198623232 unmapped: 9936896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:00.532264+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198623232 unmapped: 9936896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:01.532463+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198623232 unmapped: 9936896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:02.532597+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198623232 unmapped: 9936896 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:03.532750+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff04000/0x0/0x1bfc00000, data 0x66efc36/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198631424 unmapped: 9928704 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2909010 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:04.532933+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198639616 unmapped: 9920512 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:05.533144+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 65.387779236s of 65.445526123s, submitted: 35
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 ms_handle_reset con 0x558441bf9800 session 0x55843ab334a0
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:06.533315+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Got map version 53
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:07.533493+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:08.533678+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:09.534148+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:10.534284+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:11.534487+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:12.534654+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:13.534871+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:14.535106+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:15.535254+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199016448 unmapped: 9543680 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:16.535431+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199024640 unmapped: 9535488 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:17.535662+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199024640 unmapped: 9535488 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:18.535864+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199024640 unmapped: 9535488 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:19.536049+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199024640 unmapped: 9535488 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:20.536227+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:21.536385+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:22.536611+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:23.536889+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:24.537091+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:25.537313+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:26.537450+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:27.537650+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199032832 unmapped: 9527296 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:28.537865+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:29.538046+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:30.538273+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:31.538462+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:32.538679+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:33.538850+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:34.539046+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:35.539221+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199041024 unmapped: 9519104 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:36.539371+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:37.539562+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:38.539752+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:39.539964+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:40.540128+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:41.540291+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:42.540447+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:43.540695+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199049216 unmapped: 9510912 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:44.540937+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199057408 unmapped: 9502720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:45.541176+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199057408 unmapped: 9502720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:46.541298+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199057408 unmapped: 9502720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:47.541528+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199057408 unmapped: 9502720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:48.541695+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199057408 unmapped: 9502720 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:49.541892+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199065600 unmapped: 9494528 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:50.542109+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199065600 unmapped: 9494528 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:51.542274+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199065600 unmapped: 9494528 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:52.542450+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:53.542569+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:54.542737+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:55.542887+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:56.543057+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:57.543240+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:58.543371+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: bluestore.MempoolThread(0x558439219b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2908354 data_alloc: 301989888 data_used: 29736960
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:59.543501+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199073792 unmapped: 9486336 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:00.543637+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 199155712 unmapped: 9404416 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: osd.5 294 heartbeat osd_stat(store_statfs(0x1aff05000/0x0/0x1bfc00000, data 0x66efe49/0x68e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,4] op hist [])
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:01.543806+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'config diff' '{prefix=config diff}'
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'config show' '{prefix=config show}'
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'counter dump' '{prefix=counter dump}'
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'counter schema' '{prefix=counter schema}'
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198787072 unmapped: 9773056 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:02.543956+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: prioritycache tune_memory target: 5709084876 mapped: 198844416 unmapped: 9715712 heap: 208560128 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: tick
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:03.544076+0000)
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[32506]: do_command 'log dump' '{prefix=log dump}'
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3138369626' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.59545 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.69557 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.49446 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1513419320' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.69575 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1517482184' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.59575 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.49458 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1322949749' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/258418876' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2618867657' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1946602031' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:34 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 23K writes, 88K keys, 23K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.01 MB/s
                                                          Cumulative WAL: 23K writes, 7977 syncs, 2.88 writes per sync, written: 0.08 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 8547 writes, 31K keys, 8547 commit groups, 1.0 writes per commit group, ingest: 31.78 MB, 0.05 MB/s
                                                          Interval WAL: 8547 writes, 3411 syncs, 2.51 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 28 10:18:34 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2135208413' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain crontab[335087]: (root) LIST (root)
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2560522477' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.69587 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.59590 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.59602 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: pgmap v819: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.59620 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.69617 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.49464 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2092821165' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3138369626' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.69629 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.59641 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3274056886' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.49476 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/97215739' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2135208413' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/265827640' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3675476587' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2560522477' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 28 10:18:35 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1773482231' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon stat"} v 0)
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3561111865' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.59659 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.69647 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.49488 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.59671 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.69668 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.49500 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1719564881' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3738043283' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1773482231' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2168774809' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1437964439' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:36 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3561111865' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/842310788' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "node ls"} v 0)
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3259287151' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3115545767' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3687197011' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.49512 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.59683 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.59692 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: pgmap v820: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.69704 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.59707 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/87497652' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.49533 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.59725 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2392063071' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/842310788' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3259287151' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/97330575' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3115545767' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3687197011' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2236414565' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3437061184' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Nov 28 10:18:37 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/367041026' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:38.038 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:38 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:38.041 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/641110482' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2259440631' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: get_auth_request con 0x55ff55c4e000 auth_method 0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_configure stats_period=5
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:12.912592+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83705856 unmapped: 442368 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 37
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:13.912750+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83714048 unmapped: 434176 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:14.912934+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83714048 unmapped: 434176 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:15.913219+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 38
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83714048 unmapped: 434176 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:16.913386+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 39
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:17.913567+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:18.913705+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:19.913862+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:20.914009+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:21.914242+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:22.914367+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:23.914521+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:24.914671+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:25.914817+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:26.915011+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:27.915167+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:28.915292+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:29.915507+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:30.915741+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:31.915986+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:32.916106+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:33.916251+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83722240 unmapped: 425984 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:34.916424+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:35.916609+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:36.916753+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:37.916880+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:38.917071+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:39.917233+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:40.917399+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:41.917581+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:42.917743+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:43.917887+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:44.918119+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:45.918255+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:46.918411+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:47.918586+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:48.918763+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:49.918894+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:50.919092+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:51.919260+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:52.919411+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:53.919598+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:54.919758+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:55.919910+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:56.920090+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:57.920271+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:58.920424+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:59.920562+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:00.920722+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:01.920899+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:02.921082+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:03.921226+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:04.921384+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:05.936583+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:06.936757+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:07.939102+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:08.939235+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:09.939389+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:10.940256+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:11.940493+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83869696 unmapped: 278528 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:12.940653+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:13.940860+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:14.941065+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:15.941253+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:16.941450+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:17.941601+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:18.941786+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:19.942143+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:20.942512+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:21.942760+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:22.943135+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:23.943464+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:24.943675+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:25.943913+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:26.944158+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:27.944422+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:28.944689+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:29.944885+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83607552 unmapped: 540672 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 771807 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:30.945142+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83607552 unmapped: 540672 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:31.945314+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 heartbeat osd_stat(store_statfs(0x1bba85000/0x0/0x1bfc00000, data 0x1524960/0x15a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83607552 unmapped: 540672 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:32.945637+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83607552 unmapped: 540672 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:33.945903+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83607552 unmapped: 540672 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:34.946068+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 83.986007690s of 84.007759094s, submitted: 6
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 41
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now 
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/2760684413
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc reconnect No active mgr available yet
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 ms_handle_reset con 0x55ff554c6000 session 0x55ff555cf680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83714048 unmapped: 434176 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:35.946235+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 42
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: get_auth_request con 0x55ff554cac00 auth_method 0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_configure stats_period=5
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83468288 unmapped: 679936 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:36.946381+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83468288 unmapped: 679936 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:37.947114+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83468288 unmapped: 679936 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 43
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:38.947275+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83468288 unmapped: 679936 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:39.947454+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83468288 unmapped: 679936 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:40.947676+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 44
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:41.947906+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:42.948080+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:43.948306+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:44.948470+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:45.948652+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:46.948907+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:47.949128+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:48.949349+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:49.949501+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:50.949693+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:51.949930+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:52.950131+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:53.950360+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:54.950514+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:55.950708+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:56.950900+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83484672 unmapped: 663552 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:57.951099+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:58.951250+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:59.951484+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:00.951669+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:01.951896+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:02.952120+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:03.952304+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5057 writes, 22K keys, 5057 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5057 writes, 683 syncs, 7.40 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 86 writes, 299 keys, 86 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s
                                                          Interval WAL: 86 writes, 39 syncs, 2.21 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:04.952467+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:05.952657+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83492864 unmapped: 655360 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:06.952827+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83501056 unmapped: 647168 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:07.952987+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83501056 unmapped: 647168 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:08.953179+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83501056 unmapped: 647168 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:09.953328+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83509248 unmapped: 638976 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:10.953490+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83509248 unmapped: 638976 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:11.953731+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83509248 unmapped: 638976 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:12.953970+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83509248 unmapped: 638976 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:13.954173+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83509248 unmapped: 638976 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:14.954338+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83509248 unmapped: 638976 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:15.954593+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:16.954797+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:17.955125+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:18.955365+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:19.955585+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:20.955802+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:21.956100+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:22.956345+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:23.956558+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83517440 unmapped: 630784 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:24.956814+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:25.956967+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:26.957227+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:27.957382+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:28.957534+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:29.958186+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:30.958408+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:31.958650+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:32.958873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:33.959100+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:34.959365+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:35.959572+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:36.959792+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:37.959971+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:38.960139+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83525632 unmapped: 622592 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:39.960310+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:40.960527+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:41.960773+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:42.960978+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:43.961212+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:44.961450+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:45.961770+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:46.961958+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:47.962112+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83542016 unmapped: 606208 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:48.962266+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83542016 unmapped: 606208 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:49.962476+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83542016 unmapped: 606208 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:50.962737+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83542016 unmapped: 606208 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:51.962984+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83542016 unmapped: 606208 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:52.963187+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83542016 unmapped: 606208 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:53.963418+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:54.963620+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:55.963831+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:56.964078+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:57.964332+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:58.964551+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:59.964782+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:00.964979+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773719 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bba82000/0x0/0x1bfc00000, data 0x1527188/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:01.965290+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:02.965516+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:03.965752+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83550208 unmapped: 598016 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:04.965936+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83558400 unmapped: 589824 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 90.139678955s of 90.163536072s, submitted: 6
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:05.966155+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83599360 unmapped: 548864 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773895 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bb682000/0x0/0x1bfc00000, data 0x15272a2/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:06.966337+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 45
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83615744 unmapped: 532480 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bb682000/0x0/0x1bfc00000, data 0x15272a2/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:07.966576+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83615744 unmapped: 532480 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:08.966912+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83615744 unmapped: 532480 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:09.967170+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83615744 unmapped: 532480 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:10.967406+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83615744 unmapped: 532480 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 773895 data_alloc: 285212672 data_used: 8060928
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bb682000/0x0/0x1bfc00000, data 0x15272a2/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:11.967709+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83615744 unmapped: 532480 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:12.967948+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:13.968153+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83533824 unmapped: 614400 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c4e000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:14.968329+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 83558400 unmapped: 589824 heap: 84148224 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.100503922s of 10.169430733s, submitted: 10
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 heartbeat osd_stat(store_statfs(0x1bb682000/0x0/0x1bfc00000, data 0x15272a2/0x15ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:15.968496+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85540864 unmapped: 15392768 heap: 100933632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 867088 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 93 ms_handle_reset con 0x55ff55c4e000 session 0x55ff555cfc20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c4ec00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:16.968762+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85565440 unmapped: 23764992 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:17.968919+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85491712 unmapped: 23838720 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55c4ec00 session 0x55ff542b5c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:18.969142+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:19.969425+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:20.969631+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:21.969897+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:22.970107+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:23.970298+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:24.970518+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:25.970687+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:26.970901+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:27.971575+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:28.972715+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:29.973773+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:30.974122+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:31.974412+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:32.975157+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:33.975794+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:34.976102+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:35.976569+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:36.977055+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:37.977487+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:38.977721+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:39.978076+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:40.978379+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:41.978648+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:42.978946+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:43.979122+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:44.983764+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:45.984100+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:46.984384+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:47.984624+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:48.984865+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:49.985089+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:50.985334+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:51.985707+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:52.986042+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:53.986192+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:54.986395+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:55.986574+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:56.986745+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:57.987128+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:58.987323+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:59.987475+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:00.987647+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:01.987915+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:02.988103+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:03.988279+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:04.988483+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:05.988691+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:06.988899+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:07.989142+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:08.989354+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:09.989633+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b9d97000/0x0/0x1bfc00000, data 0x2e0ba51/0x2e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:10.989794+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55a07c00 session 0x55ff55a3ef00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55a07800 session 0x55ff55a3e5a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff554c6000 session 0x55ff554d8960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 957150 data_alloc: 285212672 data_used: 8073216
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:11.989984+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 85532672 unmapped: 23797760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 56.441593170s of 56.586063385s, submitted: 20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55a07800 session 0x55ff54861a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:12.990198+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55a07c00 session 0x55ff54861c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 89161728 unmapped: 20168704 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c4e000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55c4e000 session 0x55ff555cf680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c4ec00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:13.990383+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55c4ec00 session 0x55ff555cf4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff554c6000 session 0x55ff555ce5a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55a07800 session 0x55ff555ce1e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 92471296 unmapped: 16859136 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55a07c00 session 0x55ff554d8780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c4e000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55c4e000 session 0x55ff5471f2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b8d46000/0x0/0x1bfc00000, data 0x3e5bac3/0x3ee8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:14.990591+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 92536832 unmapped: 16793600 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:15.990751+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 92553216 unmapped: 16777216 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1102236 data_alloc: 301989888 data_used: 12730368
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:16.990864+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 92553216 unmapped: 16777216 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b8d46000/0x0/0x1bfc00000, data 0x3e5bac3/0x3ee8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:17.991065+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 92553216 unmapped: 16777216 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b8d46000/0x0/0x1bfc00000, data 0x3e5bac3/0x3ee8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 ms_handle_reset con 0x55ff55aeb000 session 0x55ff531c0960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:18.991218+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 92700672 unmapped: 16629760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:19.991362+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 93396992 unmapped: 15933440 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:20.991524+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 95789056 unmapped: 13541376 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1134230 data_alloc: 301989888 data_used: 16506880
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b8d21000/0x0/0x1bfc00000, data 0x3e7fae6/0x3f0d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:21.991714+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 97648640 unmapped: 11681792 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b8d21000/0x0/0x1bfc00000, data 0x3e7fae6/0x3f0d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:22.991844+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 99893248 unmapped: 9437184 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:23.992073+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 99893248 unmapped: 9437184 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.104839325s of 12.453378677s, submitted: 72
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:24.992188+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 99926016 unmapped: 9404416 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:25.992309+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 95 ms_handle_reset con 0x55ff55a07800 session 0x55ff542b54a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b8d20000/0x0/0x1bfc00000, data 0x3e80526/0x3f0e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 99991552 unmapped: 9338880 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1171443 data_alloc: 301989888 data_used: 20721664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:26.992470+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 99991552 unmapped: 9338880 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 heartbeat osd_stat(store_statfs(0x1b8d1b000/0x0/0x1bfc00000, data 0x3e828e4/0x3f12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:27.992623+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c4e000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 100081664 unmapped: 9248768 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 ms_handle_reset con 0x55ff55a07c00 session 0x55ff542b50e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:28.992771+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 ms_handle_reset con 0x55ff554c6000 session 0x55ff54322f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 100130816 unmapped: 9199616 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:29.992892+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 100130816 unmapped: 9199616 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:30.993057+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 100139008 unmapped: 9191424 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1175213 data_alloc: 301989888 data_used: 20746240
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:31.993422+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 heartbeat osd_stat(store_statfs(0x1b8d18000/0x0/0x1bfc00000, data 0x3e842b6/0x3f15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,3,4,5] op hist [0,0,0,1,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 107036672 unmapped: 2293760 heap: 109330432 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 heartbeat osd_stat(store_statfs(0x1b6ff9000/0x0/0x1bfc00000, data 0x4a042b6/0x4a95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,2,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:32.993570+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 ms_handle_reset con 0x55ff55aeac00 session 0x55ff56c7cd20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 ms_handle_reset con 0x55ff55aea800 session 0x55ff56c7cf00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 ms_handle_reset con 0x55ff554c6000 session 0x55ff56c7d0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 111665152 unmapped: 1867776 heap: 113532928 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:33.993700+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 110288896 unmapped: 3244032 heap: 113532928 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 ms_handle_reset con 0x55ff55a07800 session 0x55ff56c7d2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.291957855s of 10.034324646s, submitted: 195
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 96 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55c4e000 session 0x55ff55a3f2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:35.003151+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07c00 session 0x55ff56c7d680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aebc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 109895680 unmapped: 3637248 heap: 113532928 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:36.003265+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55aeb800 session 0x55ff542b4960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 104620032 unmapped: 8912896 heap: 113532928 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1029314 data_alloc: 301989888 data_used: 12767232
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55aebc00 session 0x55ff56c7da40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b7cdb000/0x0/0x1bfc00000, data 0x2e364e8/0x2ec7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,1,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff554c6000 session 0x55ff5455e960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07800 session 0x55ff554ce780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07c00 session 0x55ff52adf680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c4e000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55c4e000 session 0x55ff52adf2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff554c6000 session 0x55ff52ade000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:37.003434+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103923712 unmapped: 19628032 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:38.003605+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103923712 unmapped: 19628032 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b83d0000/0x0/0x1bfc00000, data 0x362c4d5/0x36bd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:39.003730+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103956480 unmapped: 19595264 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:40.003984+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07800 session 0x55ff52adf0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103088128 unmapped: 20463616 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:41.004212+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103145472 unmapped: 20406272 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086977 data_alloc: 301989888 data_used: 13389824
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:42.004817+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103145472 unmapped: 20406272 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:43.004981+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103817216 unmapped: 19734528 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:44.005138+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103817216 unmapped: 19734528 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b83d0000/0x0/0x1bfc00000, data 0x362c4e5/0x36be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:45.005310+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103800832 unmapped: 19750912 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:46.005451+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103800832 unmapped: 19750912 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1115457 data_alloc: 301989888 data_used: 17371136
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:47.005649+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b83d0000/0x0/0x1bfc00000, data 0x362c4e5/0x36be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103800832 unmapped: 19750912 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:48.005885+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103800832 unmapped: 19750912 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:49.006068+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103800832 unmapped: 19750912 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:50.006273+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b83d0000/0x0/0x1bfc00000, data 0x362c4e5/0x36be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103800832 unmapped: 19750912 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:51.006411+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aebc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55aebc00 session 0x55ff522d3680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54497000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff54497000 session 0x55ff522d3a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff5354a000 session 0x55ff536863c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b83d0000/0x0/0x1bfc00000, data 0x362c4e5/0x36be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.901083946s of 16.385768890s, submitted: 133
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff5354a000 session 0x55ff53687e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54497000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 103800832 unmapped: 19750912 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1115413 data_alloc: 301989888 data_used: 17371136
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff54497000 session 0x55ff53687c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:52.006633+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 112361472 unmapped: 11190272 heap: 123551744 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff554c6000 session 0x55ff52adb860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b7767000/0x0/0x1bfc00000, data 0x428d4e5/0x431f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07800 session 0x55ff52ada960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:53.006767+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aebc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55aebc00 session 0x55ff53663c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aebc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55aebc00 session 0x55ff536625a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff5354a000 session 0x55ff53662960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54497000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff54497000 session 0x55ff53662000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6b53000/0x0/0x1bfc00000, data 0x4e9a4f4/0x4f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 114671616 unmapped: 12558336 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:54.006906+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff554c6000 session 0x55ff545203c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 111419392 unmapped: 15810560 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07800 session 0x55ff557e10e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:55.007093+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07800 session 0x55ff557e0d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 111452160 unmapped: 15777792 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff5354a000 session 0x55ff557e0f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54497000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:56.007238+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 111460352 unmapped: 15769600 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1328479 data_alloc: 301989888 data_used: 17604608
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:57.007395+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 111468544 unmapped: 15761408 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:58.007557+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6acb000/0x0/0x1bfc00000, data 0x4f29504/0x4fbd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 111534080 unmapped: 15695872 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:59.007791+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 111542272 unmapped: 15687680 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:00.007992+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 112836608 unmapped: 14393344 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:01.008180+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 113295360 unmapped: 13934592 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1346839 data_alloc: 301989888 data_used: 20951040
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:02.008370+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116137984 unmapped: 11091968 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6ab2000/0x0/0x1bfc00000, data 0x4f48504/0x4fdc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:03.008503+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.824758530s of 11.739228249s, submitted: 188
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116432896 unmapped: 10797056 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:04.008648+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116498432 unmapped: 10731520 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:05.008781+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff55a07c00 session 0x55ff522d30e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116506624 unmapped: 10723328 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:06.008890+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 ms_handle_reset con 0x55ff54748400 session 0x55ff545c2780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116514816 unmapped: 10715136 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1363499 data_alloc: 301989888 data_used: 23285760
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:07.009000+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 ms_handle_reset con 0x55ff54748800 session 0x55ff545c2960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 ms_handle_reset con 0x55ff54748400 session 0x55ff542405a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 ms_handle_reset con 0x55ff5354a000 session 0x55ff545c30e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122634240 unmapped: 4595712 heap: 127229952 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:08.009121+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 ms_handle_reset con 0x55ff55a07800 session 0x55ff54241a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126255104 unmapped: 16670720 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 heartbeat osd_stat(store_statfs(0x1b5662000/0x0/0x1bfc00000, data 0x6395924/0x642c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:09.009225+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 99 ms_handle_reset con 0x55ff55a07c00 session 0x55ff557e01e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126197760 unmapped: 16728064 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:10.009366+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 ms_handle_reset con 0x55ff54497000 session 0x55ff5455f0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 ms_handle_reset con 0x55ff554c6000 session 0x55ff554cef00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126230528 unmapped: 16695296 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:11.009630+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126238720 unmapped: 16687104 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 heartbeat osd_stat(store_statfs(0x1b5659000/0x0/0x1bfc00000, data 0x639a0ae/0x6433000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1566770 data_alloc: 301989888 data_used: 26906624
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:12.009880+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 120053760 unmapped: 22872064 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:13.010066+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.169121742s of 10.003950119s, submitted: 210
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122699776 unmapped: 20226048 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 heartbeat osd_stat(store_statfs(0x1b48d9000/0x0/0x1bfc00000, data 0x710e0ae/0x71a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:14.010174+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54497000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119488512 unmapped: 23437312 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54497000 session 0x55ff54570000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:15.010312+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116662272 unmapped: 26263552 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:16.010470+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116662272 unmapped: 26263552 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1448617 data_alloc: 301989888 data_used: 18837504
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:17.010669+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116662272 unmapped: 26263552 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:18.010858+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116662272 unmapped: 26263552 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:19.010996+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff5354a000 session 0x55ff52725a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b5eb3000/0x0/0x1bfc00000, data 0x5b3b332/0x5bd4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54748400 session 0x55ff52adb0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff55a07800 session 0x55ff545c32c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff5354a000 session 0x55ff536863c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff55a07800 session 0x55ff545741e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54497000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 116629504 unmapped: 26296320 heap: 142925824 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54497000 session 0x55ff557e1680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54748400 session 0x55ff53663a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff554c6000 session 0x55ff56c7d860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff554c6000 session 0x55ff54322d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:20.011177+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 143540224 unmapped: 18595840 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff5354a000 session 0x55ff543505a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:21.011316+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 134160384 unmapped: 27975680 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1729589 data_alloc: 318767104 data_used: 35397632
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:22.011529+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54894800 session 0x55ff5471dc20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54895400 session 0x55ff56c7d0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54497000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54497000 session 0x55ff557e1a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 130744320 unmapped: 31391744 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b4119000/0x0/0x1bfc00000, data 0x78d9375/0x7975000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [0,1,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 ms_handle_reset con 0x55ff54748400 session 0x55ff542401e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:23.011671+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.528486252s of 10.387819290s, submitted: 206
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 130867200 unmapped: 31268864 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 102 ms_handle_reset con 0x55ff54894800 session 0x55ff545701e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:24.011802+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 115490816 unmapped: 46645248 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:25.011935+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 117989376 unmapped: 44146688 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:26.012130+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 117989376 unmapped: 44146688 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1356047 data_alloc: 301989888 data_used: 20090880
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:27.012318+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 117989376 unmapped: 44146688 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 102 heartbeat osd_stat(store_statfs(0x1b6e5b000/0x0/0x1bfc00000, data 0x4b97768/0x4c33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:28.012488+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 117989376 unmapped: 44146688 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:29.012688+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 102 heartbeat osd_stat(store_statfs(0x1b6e5b000/0x0/0x1bfc00000, data 0x4b97768/0x4c33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 117989376 unmapped: 44146688 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:30.012832+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119054336 unmapped: 43081728 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:31.012956+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119054336 unmapped: 43081728 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1360249 data_alloc: 301989888 data_used: 20103168
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:32.013095+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119054336 unmapped: 43081728 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:33.013300+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119054336 unmapped: 43081728 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:34.013442+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.589120865s of 10.782855034s, submitted: 66
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b6e56000/0x0/0x1bfc00000, data 0x4b99a0c/0x4c37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 117735424 unmapped: 44400640 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:35.014142+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 117932032 unmapped: 44204032 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:36.014324+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119390208 unmapped: 42745856 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b6e4c000/0x0/0x1bfc00000, data 0x4ba1a0c/0x4c3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1370985 data_alloc: 301989888 data_used: 20623360
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:37.014581+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121061376 unmapped: 41074688 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 ms_handle_reset con 0x55ff54895400 session 0x55ff5471c780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 ms_handle_reset con 0x55ff554c6000 session 0x55ff554d8000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 ms_handle_reset con 0x55ff55a07800 session 0x55ff574330e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:38.014771+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 ms_handle_reset con 0x55ff54748400 session 0x55ff536674a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 ms_handle_reset con 0x55ff54894800 session 0x55ff5455fa40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 118980608 unmapped: 43155456 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 ms_handle_reset con 0x55ff5354a000 session 0x55ff52adf680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:39.014912+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 ms_handle_reset con 0x55ff54895400 session 0x55ff53683680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119078912 unmapped: 43057152 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 104 ms_handle_reset con 0x55ff554c6000 session 0x55ff54574b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:40.015089+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 104 ms_handle_reset con 0x55ff54748400 session 0x55ff52adf860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 104 ms_handle_reset con 0x55ff5354a000 session 0x55ff54241860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126386176 unmapped: 35749888 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 104 ms_handle_reset con 0x55ff54894800 session 0x55ff54574780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 104 ms_handle_reset con 0x55ff54895400 session 0x55ff57433680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55a07c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:41.015224+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126410752 unmapped: 35725312 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1633302 data_alloc: 301989888 data_used: 25989120
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 heartbeat osd_stat(store_statfs(0x1b5354000/0x0/0x1bfc00000, data 0x66971ec/0x6739000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 ms_handle_reset con 0x55ff55a07c00 session 0x55ff545741e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:42.015489+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 ms_handle_reset con 0x55ff5489b000 session 0x55ff52adeb40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 ms_handle_reset con 0x55ff5354a000 session 0x55ff545754a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 ms_handle_reset con 0x55ff54895400 session 0x55ff53687c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55519c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 ms_handle_reset con 0x55ff55519c00 session 0x55ff53686f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126713856 unmapped: 35422208 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:43.015615+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 ms_handle_reset con 0x55ff5551b000 session 0x55ff53663c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 119250944 unmapped: 42885120 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:44.015760+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121724928 unmapped: 40411136 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:45.015925+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.124994278s of 10.832698822s, submitted: 176
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b6ca8000/0x0/0x1bfc00000, data 0x49415a6/0x49e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 118898688 unmapped: 43237376 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:46.016107+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff5354a000 session 0x55ff536621e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff54895400 session 0x55ff545205a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff5489b000 session 0x55ff545212c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55519c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff55519c00 session 0x55ff54520d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 118898688 unmapped: 43237376 heap: 162136064 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c95400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1395325 data_alloc: 301989888 data_used: 20275200
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff54c95400 session 0x55ff53666f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff5551b000 session 0x55ff522d3c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff5354a000 session 0x55ff522d2d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff54895400 session 0x55ff522d23c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:47.016272+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff5489b000 session 0x55ff5368f4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55519c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff55519c00 session 0x55ff5368e3c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 134307840 unmapped: 31604736 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 ms_handle_reset con 0x55ff5354a000 session 0x55ff52ad9e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:48.016452+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b6003000/0x0/0x1bfc00000, data 0x59e58bc/0x5a8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126992384 unmapped: 38920192 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:49.016593+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 108 ms_handle_reset con 0x55ff54895400 session 0x55ff52724b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 127041536 unmapped: 38871040 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 108 ms_handle_reset con 0x55ff5551b000 session 0x55ff5471da40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:50.016766+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 109 ms_handle_reset con 0x55ff5489b000 session 0x55ff52ade3c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 127393792 unmapped: 38518784 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:51.016915+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 109 handle_osd_map epochs [109,110], i have 109, src has [1,110]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 110 ms_handle_reset con 0x55ff57812c00 session 0x55ff56c7d4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 110 ms_handle_reset con 0x55ff57812800 session 0x55ff56c7de00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125550592 unmapped: 40361984 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1480629 data_alloc: 301989888 data_used: 20971520
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b5d52000/0x0/0x1bfc00000, data 0x4bb1bdb/0x4c64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:52.017125+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 111 ms_handle_reset con 0x55ff5354a000 session 0x55ff522d2d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123133952 unmapped: 42778624 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:53.017309+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 112 ms_handle_reset con 0x55ff54895400 session 0x55ff53666f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 112 ms_handle_reset con 0x55ff57812000 session 0x55ff53662960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 112 handle_osd_map epochs [111,111], i have 112, src has [1,111]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 112 ms_handle_reset con 0x55ff57812400 session 0x55ff52adf0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123232256 unmapped: 42680320 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:54.017476+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 113 ms_handle_reset con 0x55ff57812400 session 0x55ff545205a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125173760 unmapped: 40738816 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 113 ms_handle_reset con 0x55ff5354a000 session 0x55ff54240000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 113 handle_osd_map epochs [113,114], i have 113, src has [1,114]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:55.017642+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.221845627s of 10.494216919s, submitted: 357
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124215296 unmapped: 41697280 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:56.017786+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x45f633d/0x46af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 115 ms_handle_reset con 0x55ff54895400 session 0x55ff545212c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124723200 unmapped: 41189376 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1441773 data_alloc: 301989888 data_used: 20439040
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:57.017983+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 115 handle_osd_map epochs [115,116], i have 115, src has [1,116]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 116 ms_handle_reset con 0x55ff57812000 session 0x55ff53686f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124846080 unmapped: 41066496 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:58.018082+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 117 ms_handle_reset con 0x55ff57812800 session 0x55ff52adf860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124674048 unmapped: 41238528 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:59.018228+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 117 handle_osd_map epochs [117,118], i have 117, src has [1,118]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 118 ms_handle_reset con 0x55ff5354a000 session 0x55ff57433e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124682240 unmapped: 41230336 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 118 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:00.018370+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 119 heartbeat osd_stat(store_statfs(0x1b6fd2000/0x0/0x1bfc00000, data 0x45fd9d6/0x46b9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125730816 unmapped: 40181760 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:01.018474+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 120 ms_handle_reset con 0x55ff54895400 session 0x55ff53683c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125968384 unmapped: 39944192 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1445029 data_alloc: 301989888 data_used: 20459520
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:02.018680+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125943808 unmapped: 39968768 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 121 ms_handle_reset con 0x55ff57812000 session 0x55ff536870e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:03.018869+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125927424 unmapped: 39985152 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:04.019198+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125927424 unmapped: 39985152 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:05.019341+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 122 heartbeat osd_stat(store_statfs(0x1b6fae000/0x0/0x1bfc00000, data 0x4621d46/0x46dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125976576 unmapped: 39936000 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:06.019511+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125976576 unmapped: 39936000 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1447324 data_alloc: 301989888 data_used: 20455424
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:07.019708+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.820149422s of 11.799381256s, submitted: 302
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125861888 unmapped: 40050688 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:08.019873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 122 heartbeat osd_stat(store_statfs(0x1b6fa3000/0x0/0x1bfc00000, data 0x462e026/0x46eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125911040 unmapped: 40001536 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:09.020043+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125911040 unmapped: 40001536 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:10.020272+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125976576 unmapped: 39936000 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:11.020448+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125976576 unmapped: 39936000 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1450758 data_alloc: 301989888 data_used: 20467712
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:12.020823+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125976576 unmapped: 39936000 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:13.021192+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 124 ms_handle_reset con 0x55ff57812400 session 0x55ff52ad8780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 124 heartbeat osd_stat(store_statfs(0x1b6f98000/0x0/0x1bfc00000, data 0x46326ce/0x46f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125198336 unmapped: 40714240 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:14.021494+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125198336 unmapped: 40714240 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:15.021659+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 124 heartbeat osd_stat(store_statfs(0x1b6f95000/0x0/0x1bfc00000, data 0x46356ce/0x46f8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125214720 unmapped: 40697856 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:16.021810+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 124 heartbeat osd_stat(store_statfs(0x1b6f95000/0x0/0x1bfc00000, data 0x46356ce/0x46f8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125239296 unmapped: 40673280 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1458093 data_alloc: 301989888 data_used: 20467712
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:17.021965+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.224763870s of 10.313405037s, submitted: 31
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125239296 unmapped: 40673280 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:18.022122+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 124 handle_osd_map epochs [124,125], i have 124, src has [1,125]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 ms_handle_reset con 0x55ff5489b000 session 0x55ff536605a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 heartbeat osd_stat(store_statfs(0x1b6f91000/0x0/0x1bfc00000, data 0x4637abd/0x46fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 ms_handle_reset con 0x55ff54748400 session 0x55ff54574960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 ms_handle_reset con 0x55ff54894800 session 0x55ff536672c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125353984 unmapped: 40558592 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:19.022328+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 ms_handle_reset con 0x55ff5354a000 session 0x55ff522d3c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122093568 unmapped: 43819008 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:20.023083+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122093568 unmapped: 43819008 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 heartbeat osd_stat(store_statfs(0x1b877b000/0x0/0x1bfc00000, data 0x2e50a77/0x2f12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:21.023252+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 heartbeat osd_stat(store_statfs(0x1b877b000/0x0/0x1bfc00000, data 0x2e50a77/0x2f12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122093568 unmapped: 43819008 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1222533 data_alloc: 301989888 data_used: 12926976
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:22.023483+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122093568 unmapped: 43819008 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:23.023646+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 heartbeat osd_stat(store_statfs(0x1b877b000/0x0/0x1bfc00000, data 0x2e50a77/0x2f12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122093568 unmapped: 43819008 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:24.023882+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122093568 unmapped: 43819008 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:25.024230+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122134528 unmapped: 43778048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:26.024382+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b877b000/0x0/0x1bfc00000, data 0x2e50a77/0x2f12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122134528 unmapped: 43778048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1226559 data_alloc: 301989888 data_used: 12939264
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:27.024555+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122134528 unmapped: 43778048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:28.024710+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122134528 unmapped: 43778048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:29.024887+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122134528 unmapped: 43778048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:30.025068+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122134528 unmapped: 43778048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:31.025222+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122134528 unmapped: 43778048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1226559 data_alloc: 301989888 data_used: 12939264
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:32.025418+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:33.025622+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:34.025787+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:35.025956+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:36.026116+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1226559 data_alloc: 301989888 data_used: 12939264
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:37.026278+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:38.026491+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:39.026671+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:40.026931+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:41.027132+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1226559 data_alloc: 301989888 data_used: 12939264
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:42.027389+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:43.027543+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:44.027736+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:45.027930+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:46.028112+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1226559 data_alloc: 301989888 data_used: 12939264
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:47.028265+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:48.029158+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:49.029316+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:50.029473+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:51.029658+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:52.029870+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1226559 data_alloc: 301989888 data_used: 12939264
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:53.030056+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121896960 unmapped: 44015616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:54.030237+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b8777000/0x0/0x1bfc00000, data 0x2e52d1b/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.924610138s of 37.276969910s, submitted: 109
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121913344 unmapped: 43999232 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:55.030411+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 126 handle_osd_map epochs [126,127], i have 126, src has [1,127]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 121946112 unmapped: 43966464 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:56.030584+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 127 ms_handle_reset con 0x55ff54895400 session 0x55ff57432780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 127 ms_handle_reset con 0x55ff57812000 session 0x55ff52adb0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 127 ms_handle_reset con 0x55ff5354a000 session 0x55ff554d90e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122036224 unmapped: 43876352 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:57.030770+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1233182 data_alloc: 301989888 data_used: 12951552
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:58.030990+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122036224 unmapped: 43876352 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 128 ms_handle_reset con 0x55ff54748400 session 0x55ff554d81e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 128 ms_handle_reset con 0x55ff54894800 session 0x55ff555ceb40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 128 ms_handle_reset con 0x55ff54895400 session 0x55ff555cef00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:59.031112+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122044416 unmapped: 43868160 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 128 heartbeat osd_stat(store_statfs(0x1b876b000/0x0/0x1bfc00000, data 0x2e578ca/0x2f21000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:00.031312+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122052608 unmapped: 43859968 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 129 ms_handle_reset con 0x55ff57812400 session 0x55ff555ce5a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b8767000/0x0/0x1bfc00000, data 0x2e59ccc/0x2f24000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:01.139995+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122150912 unmapped: 43761664 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1241093 data_alloc: 301989888 data_used: 12951552
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:02.140279+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122175488 unmapped: 43737088 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 130 ms_handle_reset con 0x55ff5354a000 session 0x55ff522d3680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:03.140435+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122241024 unmapped: 43671552 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 130 heartbeat osd_stat(store_statfs(0x1b8767000/0x0/0x1bfc00000, data 0x2e5bcbb/0x2f26000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:04.140630+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122241024 unmapped: 43671552 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:05.140805+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122249216 unmapped: 43663360 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.951623917s of 10.519275665s, submitted: 143
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 131 heartbeat osd_stat(store_statfs(0x1b8767000/0x0/0x1bfc00000, data 0x2e5bcbb/0x2f26000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: get_auth_request con 0x55ff530c6800 auth_method 0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets getting new tickets!
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:06.141038+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _finish_auth 0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:06.141993+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122257408 unmapped: 43655168 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:07.141197+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1250447 data_alloc: 301989888 data_used: 12963840
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122265600 unmapped: 43646976 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 133 handle_osd_map epochs [132,133], i have 133, src has [1,133]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 133 ms_handle_reset con 0x55ff54748400 session 0x55ff52adba40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:08.141365+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122298368 unmapped: 43614208 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 134 ms_handle_reset con 0x55ff54895400 session 0x55ff52ada960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 134 ms_handle_reset con 0x55ff54894800 session 0x55ff52adbc20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:09.141545+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122331136 unmapped: 43581440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:10.141632+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122388480 unmapped: 43524096 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 135 heartbeat osd_stat(store_statfs(0x1b874c000/0x0/0x1bfc00000, data 0x2e6703f/0x2f3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:11.141839+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122413056 unmapped: 43499520 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57812c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 135 ms_handle_reset con 0x55ff57812c00 session 0x55ff54241860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:12.142043+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1325612 data_alloc: 301989888 data_used: 12967936
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122470400 unmapped: 43442176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:13.142215+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 135 heartbeat osd_stat(store_statfs(0x1b774f000/0x0/0x1bfc00000, data 0x3e67054/0x3f3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122503168 unmapped: 43409408 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 135 ms_handle_reset con 0x55ff5354a000 session 0x55ff544c72c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54748400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:14.142391+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 122150912 unmapped: 43761664 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 135 heartbeat osd_stat(store_statfs(0x1b674f000/0x0/0x1bfc00000, data 0x4e67056/0x4f3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 136 ms_handle_reset con 0x55ff54748400 session 0x55ff5484d860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 136 ms_handle_reset con 0x55ff5551b000 session 0x55ff555cf2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:15.142523+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123232256 unmapped: 42680320 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.933844566s of 10.517407417s, submitted: 134
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:16.142680+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123240448 unmapped: 42672128 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 136 ms_handle_reset con 0x55ff54895400 session 0x55ff555ce780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:17.142831+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1605592 data_alloc: 301989888 data_used: 12992512
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 137 heartbeat osd_stat(store_statfs(0x1b574a000/0x0/0x1bfc00000, data 0x5e695f2/0x5f43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123273216 unmapped: 42639360 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 137 ms_handle_reset con 0x55ff54894800 session 0x55ff53667e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:18.142998+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123297792 unmapped: 42614784 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 138 ms_handle_reset con 0x55ff5354a000 session 0x55ff5368f4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:19.143698+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123297792 unmapped: 42614784 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:20.143882+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123314176 unmapped: 42598400 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:21.144085+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123355136 unmapped: 42557440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:22.144379+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1291043 data_alloc: 301989888 data_used: 13004800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123355136 unmapped: 42557440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 139 heartbeat osd_stat(store_statfs(0x1b8741000/0x0/0x1bfc00000, data 0x2e700b5/0x2f4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:23.144574+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123355136 unmapped: 42557440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:24.144730+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123355136 unmapped: 42557440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:25.144830+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123355136 unmapped: 42557440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:26.145000+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.912479401s of 10.317738533s, submitted: 128
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123355136 unmapped: 42557440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 141 handle_osd_map epochs [140,141], i have 141, src has [1,141]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:27.145194+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1298526 data_alloc: 301989888 data_used: 13004800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123387904 unmapped: 42524672 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 141 heartbeat osd_stat(store_statfs(0x1b8738000/0x0/0x1bfc00000, data 0x2e74779/0x2f55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:28.145353+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123404288 unmapped: 42508288 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:29.146355+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123412480 unmapped: 42500096 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 142 ms_handle_reset con 0x55ff54894800 session 0x55ff56c7c960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:30.146501+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 143 ms_handle_reset con 0x55ff54895400 session 0x55ff536605a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 123412480 unmapped: 42500096 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:31.146657+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124461056 unmapped: 41451520 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:32.146855+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1307794 data_alloc: 301989888 data_used: 13021184
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124461056 unmapped: 41451520 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 144 handle_osd_map epochs [143,144], i have 144, src has [1,144]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 144 handle_osd_map epochs [143,144], i have 144, src has [1,144]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:33.147244+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124461056 unmapped: 41451520 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 144 heartbeat osd_stat(store_statfs(0x1b8730000/0x0/0x1bfc00000, data 0x2e7b287/0x2f5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:34.147424+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124461056 unmapped: 41451520 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:35.147556+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124477440 unmapped: 41435136 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:36.147720+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124510208 unmapped: 41402368 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:37.147885+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.712349892s of 10.926544189s, submitted: 68
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1312882 data_alloc: 301989888 data_used: 13041664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124518400 unmapped: 41394176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:38.148081+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124518400 unmapped: 41394176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 146 ms_handle_reset con 0x55ff5551b000 session 0x55ff53661e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 146 heartbeat osd_stat(store_statfs(0x1b8725000/0x0/0x1bfc00000, data 0x2e7f8fa/0x2f67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:39.148277+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124551168 unmapped: 41361408 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:40.148445+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124567552 unmapped: 41345024 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:41.148617+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124567552 unmapped: 41345024 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:42.148847+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1317654 data_alloc: 301989888 data_used: 13058048
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124567552 unmapped: 41345024 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 147 heartbeat osd_stat(store_statfs(0x1b8723000/0x0/0x1bfc00000, data 0x2e81cfb/0x2f6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:43.148983+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124567552 unmapped: 41345024 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:44.149131+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124567552 unmapped: 41345024 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:45.149275+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:46.150256+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:47.150486+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1320656 data_alloc: 301989888 data_used: 13058048
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b871f000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:48.150682+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b871f000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:49.151927+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:50.152122+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:51.152285+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:52.152524+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1320656 data_alloc: 301989888 data_used: 13058048
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b871f000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124592128 unmapped: 41320448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:53.152795+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124600320 unmapped: 41312256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:54.153109+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124600320 unmapped: 41312256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:55.157595+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b871f000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124600320 unmapped: 41312256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:56.157783+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124600320 unmapped: 41312256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:57.157958+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1320656 data_alloc: 301989888 data_used: 13058048
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57813000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.150384903s of 20.357698441s, submitted: 62
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 ms_handle_reset con 0x55ff57813000 session 0x55ff53660960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124600320 unmapped: 41312256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:58.158129+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124600320 unmapped: 41312256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 ms_handle_reset con 0x55ff5354a000 session 0x55ff55a3e5a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 ms_handle_reset con 0x55ff54894800 session 0x55ff52ad9e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:59.158354+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124633088 unmapped: 41279488 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:00.158524+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b8720000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124633088 unmapped: 41279488 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:01.158798+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:02.159042+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1321613 data_alloc: 301989888 data_used: 13058048
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:03.159218+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:04.159432+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b8720000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:05.159584+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:06.159746+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b8720000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:07.159901+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1321613 data_alloc: 301989888 data_used: 13058048
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:08.160108+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b8720000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:09.160316+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:10.160468+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:11.160667+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b8720000/0x0/0x1bfc00000, data 0x2e83f9f/0x2f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:12.161084+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1321613 data_alloc: 301989888 data_used: 13058048
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124641280 unmapped: 41271296 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:13.161388+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.916543961s of 16.002923965s, submitted: 21
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:14.161566+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 148 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 149 ms_handle_reset con 0x55ff54895400 session 0x55ff52ad8f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124665856 unmapped: 41246720 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:15.161737+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Nov 28 10:18:38 np0005538513.localdomain rsyslogd[759]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 150 ms_handle_reset con 0x55ff5551b000 session 0x55ff53663860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:16.161950+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:17.162104+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1331577 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 150 heartbeat osd_stat(store_statfs(0x1b8715000/0x0/0x1bfc00000, data 0x2e8876f/0x2f76000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:18.162347+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:19.162560+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:20.162765+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:21.162921+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:22.163116+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332725 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:23.163303+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:24.163485+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:25.163651+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:26.163840+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:27.164042+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332725 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:28.164252+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:29.164440+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:30.165122+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:31.165314+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:32.165567+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332725 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:33.165742+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:34.165957+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:35.166138+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:36.166305+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:37.166462+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332725 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:38.166642+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:39.166890+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 41263104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:40.167105+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:41.167272+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:42.167506+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332725 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:43.167689+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:44.167868+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:45.168095+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:46.168267+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:47.168441+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332725 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:48.168593+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 41254912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:49.168792+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57813400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 35.905910492s of 36.205245972s, submitted: 88
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff57813400 session 0x55ff555cf2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124674048 unmapped: 41238528 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:50.168972+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8713000/0x0/0x1bfc00000, data 0x2e8aa23/0x2f7b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124674048 unmapped: 41238528 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:51.169115+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff5354a000 session 0x55ff5471cb40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff54894800 session 0x55ff54241860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124674048 unmapped: 41238528 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:52.169349+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1333682 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff54895400 session 0x55ff5471f0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124682240 unmapped: 41230336 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:53.169517+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8313000/0x0/0x1bfc00000, data 0x2e8aa23/0x2f7b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff5551b000 session 0x55ff5368e3c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57813800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124715008 unmapped: 41197568 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:54.169665+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff57813800 session 0x55ff56c7d4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124747776 unmapped: 41164800 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:55.169848+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124747776 unmapped: 41164800 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:56.170012+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:57.170195+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1335519 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff5354a000 session 0x55ff57432960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124772352 unmapped: 41140224 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:58.170364+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124772352 unmapped: 41140224 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:59.170519+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff54894800 session 0x55ff5471c1e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8313000/0x0/0x1bfc00000, data 0x2e8aa75/0x2f7b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:00.170893+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.163437843s of 10.401829720s, submitted: 58
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff54895400 session 0x55ff5471fe00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff5551b000 session 0x55ff557e1e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:01.171080+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff57813c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff57813c00 session 0x55ff5471c960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:02.171258+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 ms_handle_reset con 0x55ff5354a000 session 0x55ff522d2780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8314000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1339199 data_alloc: 301989888 data_used: 13070336
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8314000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:03.171433+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:04.216092+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:05.216298+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b8314000/0x0/0x1bfc00000, data 0x2e8aa13/0x2f7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124755968 unmapped: 41156608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:06.216482+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 ms_handle_reset con 0x55ff54894800 session 0x55ff5471e780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:07.216688+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1345217 data_alloc: 301989888 data_used: 13082624
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 ms_handle_reset con 0x55ff54895400 session 0x55ff5471f2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:08.216870+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:09.217124+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 ms_handle_reset con 0x55ff5551b000 session 0x55ff554d94a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 heartbeat osd_stat(store_statfs(0x1b830e000/0x0/0x1bfc00000, data 0x2e8ce33/0x2f7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:10.217362+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:11.217553+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:12.234153+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1344161 data_alloc: 301989888 data_used: 13082624
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 heartbeat osd_stat(store_statfs(0x1b830f000/0x0/0x1bfc00000, data 0x2e8ce33/0x2f7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124764160 unmapped: 41148416 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:13.234340+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124772352 unmapped: 41140224 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:14.234544+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124772352 unmapped: 41140224 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 heartbeat osd_stat(store_statfs(0x1b830f000/0x0/0x1bfc00000, data 0x2e8ce33/0x2f7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:15.234732+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.355293274s of 15.478096962s, submitted: 29
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124772352 unmapped: 41140224 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:16.234891+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124780544 unmapped: 41132032 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:17.235057+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1351636 data_alloc: 301989888 data_used: 13094912
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124788736 unmapped: 41123840 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:18.235213+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 153 ms_handle_reset con 0x55ff55aea800 session 0x55ff554d8000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124788736 unmapped: 41123840 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:19.235382+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 154 heartbeat osd_stat(store_statfs(0x1b8308000/0x0/0x1bfc00000, data 0x2e8f263/0x2f85000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124813312 unmapped: 41099264 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:20.235527+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 154 heartbeat osd_stat(store_statfs(0x1b8303000/0x0/0x1bfc00000, data 0x2e91675/0x2f89000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 154 ms_handle_reset con 0x55ff5354a000 session 0x55ff56c7cd20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 154 ms_handle_reset con 0x55ff54894800 session 0x55ff56c7cb40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124829696 unmapped: 41082880 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:21.235760+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124829696 unmapped: 41082880 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:22.235983+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1352326 data_alloc: 301989888 data_used: 13094912
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124829696 unmapped: 41082880 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:23.236141+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124829696 unmapped: 41082880 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:24.236284+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124846080 unmapped: 41066496 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:25.236413+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 155 heartbeat osd_stat(store_statfs(0x1b8307000/0x0/0x1bfc00000, data 0x2e91603/0x2f87000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124846080 unmapped: 41066496 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:26.236602+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.906539917s of 11.069557190s, submitted: 58
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 124854272 unmapped: 41058304 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:27.236733+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 155 ms_handle_reset con 0x55ff54895400 session 0x55ff557e1e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1355648 data_alloc: 301989888 data_used: 13107200
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 155 ms_handle_reset con 0x55ff5551b000 session 0x55ff557e03c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125272064 unmapped: 40640512 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:28.236998+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125272064 unmapped: 40640512 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:29.237214+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 155 ms_handle_reset con 0x55ff55aea000 session 0x55ff522d23c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 155 heartbeat osd_stat(store_statfs(0x1b7a18000/0x0/0x1bfc00000, data 0x377e8a7/0x3876000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125304832 unmapped: 40607744 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:30.237396+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125304832 unmapped: 40607744 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:31.237611+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 156 heartbeat osd_stat(store_statfs(0x1b82fd000/0x0/0x1bfc00000, data 0x2e95c6d/0x2f90000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 156 handle_osd_map epochs [156,157], i have 156, src has [1,157]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 157 ms_handle_reset con 0x55ff5354a000 session 0x55ff54574000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125313024 unmapped: 40599552 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:32.237876+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 158 handle_osd_map epochs [157,158], i have 158, src has [1,158]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 158 ms_handle_reset con 0x55ff54894800 session 0x55ff5471cb40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1376252 data_alloc: 301989888 data_used: 13119488
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 158 ms_handle_reset con 0x55ff54895400 session 0x55ff5471c1e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125362176 unmapped: 40550400 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:33.238078+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 158 ms_handle_reset con 0x55ff5551b000 session 0x55ff554cfa40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125370368 unmapped: 40542208 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:34.238253+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 159 ms_handle_reset con 0x55ff54735400 session 0x55ff54241860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125378560 unmapped: 40534016 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:35.238446+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125337600 unmapped: 40574976 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 160 heartbeat osd_stat(store_statfs(0x1b82eb000/0x0/0x1bfc00000, data 0x2e9f04a/0x2fa2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:36.238602+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 160 heartbeat osd_stat(store_statfs(0x1b82eb000/0x0/0x1bfc00000, data 0x2e9f04a/0x2fa2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.785191536s of 10.088568687s, submitted: 99
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 160 ms_handle_reset con 0x55ff5354a000 session 0x55ff52ad9e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125337600 unmapped: 40574976 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:37.238785+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1385206 data_alloc: 301989888 data_used: 13119488
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125337600 unmapped: 40574976 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:38.238869+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 160 ms_handle_reset con 0x55ff54735400 session 0x55ff52ad8f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 160 ms_handle_reset con 0x55ff54895400 session 0x55ff5368e3c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125337600 unmapped: 40574976 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:39.239156+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 161 ms_handle_reset con 0x55ff5551b000 session 0x55ff543514a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 161 ms_handle_reset con 0x55ff54894800 session 0x55ff55a3e3c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 161 ms_handle_reset con 0x55ff5354a000 session 0x55ff53661a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125345792 unmapped: 40566784 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:40.239295+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125353984 unmapped: 40558592 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:41.239457+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 162 heartbeat osd_stat(store_statfs(0x1b82e4000/0x0/0x1bfc00000, data 0x2ea381a/0x2faa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:42.240118+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125370368 unmapped: 40542208 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 162 handle_osd_map epochs [162,163], i have 162, src has [1,163]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 163 ms_handle_reset con 0x55ff54735400 session 0x55ff545752c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1395201 data_alloc: 301989888 data_used: 13148160
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:43.240270+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125370368 unmapped: 40542208 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:44.241213+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125370368 unmapped: 40542208 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 163 handle_osd_map epochs [163,164], i have 163, src has [1,164]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 164 handle_osd_map epochs [164,165], i have 164, src has [1,165]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:45.241418+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125419520 unmapped: 40493056 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 165 ms_handle_reset con 0x55ff54894800 session 0x55ff522d2f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:46.241599+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125435904 unmapped: 40476672 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:47.241764+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125435904 unmapped: 40476672 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 165 heartbeat osd_stat(store_statfs(0x1b82d8000/0x0/0x1bfc00000, data 0x2ea9dea/0x2fb4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.325178146s of 10.849090576s, submitted: 140
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 165 ms_handle_reset con 0x55ff5551b000 session 0x55ff54351680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1405510 data_alloc: 301989888 data_used: 13160448
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:48.241923+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125468672 unmapped: 40443904 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 165 handle_osd_map epochs [165,166], i have 165, src has [1,166]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff548fb000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff548fa400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 166 ms_handle_reset con 0x55ff548fa400 session 0x55ff5430b4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 166 heartbeat osd_stat(store_statfs(0x1b82d2000/0x0/0x1bfc00000, data 0x2eac2b5/0x2fbb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:49.242128+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125493248 unmapped: 40419328 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 167 ms_handle_reset con 0x55ff548fb000 session 0x55ff55a3f4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 167 ms_handle_reset con 0x55ff54895400 session 0x55ff5471e5a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 167 handle_osd_map epochs [167,168], i have 167, src has [1,168]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:50.242287+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125542400 unmapped: 40370176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 168 handle_osd_map epochs [168,169], i have 168, src has [1,169]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:51.242420+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125542400 unmapped: 40370176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 169 handle_osd_map epochs [168,169], i have 169, src has [1,169]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 169 ms_handle_reset con 0x55ff5354a000 session 0x55ff57432780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 169 ms_handle_reset con 0x55ff54735400 session 0x55ff544c7680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:52.242592+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125542400 unmapped: 40370176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 169 ms_handle_reset con 0x55ff54894800 session 0x55ff557e1c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1420167 data_alloc: 301989888 data_used: 13189120
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:53.242763+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125542400 unmapped: 40370176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 169 ms_handle_reset con 0x55ff54735400 session 0x55ff557e1680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 170 ms_handle_reset con 0x55ff5354a000 session 0x55ff52adb0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 170 ms_handle_reset con 0x55ff54894800 session 0x55ff5455f680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:54.242965+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125607936 unmapped: 40304640 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 170 heartbeat osd_stat(store_statfs(0x1b82c2000/0x0/0x1bfc00000, data 0x2eb51c6/0x2fcb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 171 handle_osd_map epochs [171,172], i have 171, src has [1,172]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:55.243089+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125632512 unmapped: 40280064 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 172 ms_handle_reset con 0x55ff54895400 session 0x55ff54322d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff548fb000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 172 heartbeat osd_stat(store_statfs(0x1b82b7000/0x0/0x1bfc00000, data 0x2eb98ce/0x2fd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 172 handle_osd_map epochs [172,173], i have 172, src has [1,173]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 172 ms_handle_reset con 0x55ff548fb000 session 0x55ff54322000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:56.243206+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 125689856 unmapped: 40222720 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 ms_handle_reset con 0x55ff54735400 session 0x55ff52724780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 ms_handle_reset con 0x55ff54894800 session 0x55ff545c2780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 ms_handle_reset con 0x55ff5354a000 session 0x55ff542b52c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54895400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:57.243389+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126648320 unmapped: 39264256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 ms_handle_reset con 0x55ff5551b000 session 0x55ff527254a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.194399834s of 10.010549545s, submitted: 238
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1603667 data_alloc: 301989888 data_used: 13201408
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff548fa800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:58.243577+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 174 ms_handle_reset con 0x55ff548fa800 session 0x55ff52725a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 126705664 unmapped: 39206912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff548fa800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:59.243765+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135127040 unmapped: 30785536 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 175 heartbeat osd_stat(store_statfs(0x1b64ee000/0x0/0x1bfc00000, data 0x4c7f49a/0x4d9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 175 ms_handle_reset con 0x55ff5354a000 session 0x55ff542401e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 176 handle_osd_map epochs [176,177], i have 176, src has [1,177]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:00.243886+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135307264 unmapped: 30605312 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 177 ms_handle_reset con 0x55ff54735400 session 0x55ff5471c1e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 177 ms_handle_reset con 0x55ff5551b000 session 0x55ff545701e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:01.244043+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 128131072 unmapped: 37781504 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 177 handle_osd_map epochs [177,178], i have 177, src has [1,178]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 ms_handle_reset con 0x55ff54894800 session 0x55ff5471f4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:02.244221+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 136577024 unmapped: 29335552 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 ms_handle_reset con 0x55ff548fa800 session 0x55ff545c32c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 heartbeat osd_stat(store_statfs(0x1b3eac000/0x0/0x1bfc00000, data 0x72b8f40/0x73df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2004216 data_alloc: 301989888 data_used: 13230080
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 heartbeat osd_stat(store_statfs(0x1b3eac000/0x0/0x1bfc00000, data 0x72b8f40/0x73df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:03.244405+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 179 ms_handle_reset con 0x55ff5354a400 session 0x55ff5471e780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 136577024 unmapped: 29335552 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:04.244559+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 128229376 unmapped: 37683200 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 179 ms_handle_reset con 0x55ff5354a000 session 0x55ff545c30e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:05.244816+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 128335872 unmapped: 37576704 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54735400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 ms_handle_reset con 0x55ff54735400 session 0x55ff545c2960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5551b000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 ms_handle_reset con 0x55ff5551b000 session 0x55ff54323680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c9000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54935400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55688000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:06.245040+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135856128 unmapped: 30056448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 ms_handle_reset con 0x55ff554c9000 session 0x55ff54323e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 ms_handle_reset con 0x55ff54935400 session 0x55ff542b50e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 heartbeat osd_stat(store_statfs(0x1b16a4000/0x0/0x1bfc00000, data 0x9abd682/0x9be9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:07.245222+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129630208 unmapped: 36282368 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 180 handle_osd_map epochs [180,181], i have 180, src has [1,181]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 181 ms_handle_reset con 0x55ff55688000 session 0x55ff554d90e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.978422165s of 10.070631027s, submitted: 174
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 181 ms_handle_reset con 0x55ff5354a000 session 0x55ff543225a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2394556 data_alloc: 301989888 data_used: 13246464
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 181 ms_handle_reset con 0x55ff54894800 session 0x55ff545c32c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:08.245379+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 138092544 unmapped: 27820032 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 181 heartbeat osd_stat(store_statfs(0x1b0214000/0x0/0x1bfc00000, data 0xaf48b04/0xb078000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:09.245541+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129728512 unmapped: 36184064 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55688c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:10.245697+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 182 ms_handle_reset con 0x55ff55688c00 session 0x55ff5471c1e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129843200 unmapped: 36069376 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:11.245873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 138305536 unmapped: 27607040 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 182 handle_osd_map epochs [182,183], i have 182, src has [1,183]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:12.246104+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 183 ms_handle_reset con 0x55ff54894800 session 0x55ff574325a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54935400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129925120 unmapped: 35987456 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 183 ms_handle_reset con 0x55ff5354a000 session 0x55ff52724780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 183 ms_handle_reset con 0x55ff54935400 session 0x55ff56da81e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2532555 data_alloc: 301989888 data_used: 13271040
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55689000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 184 ms_handle_reset con 0x55ff55689000 session 0x55ff527254a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:13.246263+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55689400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 130007040 unmapped: 35905536 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 184 heartbeat osd_stat(store_statfs(0x1aeccd000/0x0/0x1bfc00000, data 0xc49120d/0xc5c0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55688000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 184 ms_handle_reset con 0x55ff55689400 session 0x55ff52725a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:14.246415+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129720320 unmapped: 36192256 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 185 ms_handle_reset con 0x55ff55688000 session 0x55ff56da85a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:15.246572+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129753088 unmapped: 36159488 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55689400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 186 ms_handle_reset con 0x55ff55689400 session 0x55ff542b52c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:16.246739+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 138231808 unmapped: 27680768 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:17.246870+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129966080 unmapped: 35946496 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.199987411s of 10.006847382s, submitted: 134
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2874107 data_alloc: 301989888 data_used: 13262848
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:18.247054+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129916928 unmapped: 35995648 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 186 heartbeat osd_stat(store_statfs(0x1aba83000/0x0/0x1bfc00000, data 0xf6d8cd3/0xf80b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:19.247196+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 129957888 unmapped: 35954688 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:20.247322+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 138436608 unmapped: 27475968 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:21.247501+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 46
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 130416640 unmapped: 35495936 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:22.247708+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 130523136 unmapped: 35389440 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff5354a000 session 0x55ff53686f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3264893 data_alloc: 301989888 data_used: 13279232
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:23.247897+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 139083776 unmapped: 26828800 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:24.248052+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 139214848 unmapped: 26697728 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 heartbeat osd_stat(store_statfs(0x1a6a78000/0x0/0x1bfc00000, data 0x146e104a/0x14816000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:25.248171+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 139313152 unmapped: 26599424 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:26.248690+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 131104768 unmapped: 34807808 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:27.248889+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 139771904 unmapped: 26140672 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3925257 data_alloc: 301989888 data_used: 13279232
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:28.249097+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.299388885s of 10.604496956s, submitted: 77
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 131383296 unmapped: 34529280 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 heartbeat osd_stat(store_statfs(0x1a226f000/0x0/0x1bfc00000, data 0x18ee88bb/0x1901f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:29.249258+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff54894800 session 0x55ff557e1c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54935400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff54935400 session 0x55ff557e05a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 139837440 unmapped: 26075136 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54935400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff54935400 session 0x55ff52adb0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff5354a000 session 0x55ff57433e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:30.249392+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff54894800 session 0x55ff544c7680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55688000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff55688000 session 0x55ff53683c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55689400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff55689400 session 0x55ff54575a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 134094848 unmapped: 31817728 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55689400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff55689400 session 0x55ff554d9860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5354a000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 ms_handle_reset con 0x55ff5354a000 session 0x55ff522d3a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:31.249546+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 133922816 unmapped: 31989760 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 heartbeat osd_stat(store_statfs(0x1a0bb3000/0x0/0x1bfc00000, data 0x1a5a4a43/0x1a6db000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:32.249732+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.69740 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.59755 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1340007337' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/998719553' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3437061184' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2354334774' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/367041026' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2033329145' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2676165965' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3830129038' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/641110482' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2982477570' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2259440631' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/4237691921' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3050153649' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/3319289045' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 133988352 unmapped: 31924224 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 47
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4268082 data_alloc: 301989888 data_used: 13279232
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:33.249877+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 134283264 unmapped: 31629312 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 heartbeat osd_stat(store_statfs(0x19ebae000/0x0/0x1bfc00000, data 0x1c5aa08c/0x1c6e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:34.250055+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 134414336 unmapped: 31498240 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:35.250250+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 134438912 unmapped: 31473664 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:36.250448+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 134512640 unmapped: 31399936 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:37.250644+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135667712 unmapped: 30244864 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4601726 data_alloc: 301989888 data_used: 13275136
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:38.250802+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135798784 unmapped: 30113792 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 heartbeat osd_stat(store_statfs(0x19b207000/0x0/0x1bfc00000, data 0x1edaebea/0x1eee7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.339087486s of 10.388599396s, submitted: 80
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:39.250971+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 187 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135913472 unmapped: 29999104 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 188 ms_handle_reset con 0x55ff54895400 session 0x55ff542b5c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:40.251119+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135962624 unmapped: 29949952 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54894800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 188 heartbeat osd_stat(store_statfs(0x1999fd000/0x0/0x1bfc00000, data 0x205b5100/0x206ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:41.251309+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 136011776 unmapped: 29900800 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54935400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 189 ms_handle_reset con 0x55ff54935400 session 0x55ff52725860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55688000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55689000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:42.251490+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 189 ms_handle_reset con 0x55ff54894800 session 0x55ff543232c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135921664 unmapped: 29990912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1720576 data_alloc: 301989888 data_used: 13295616
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:43.251654+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 135921664 unmapped: 29990912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:44.251827+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 136609792 unmapped: 29302784 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:45.251969+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 136626176 unmapped: 29286400 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 189 heartbeat osd_stat(store_statfs(0x1b61cb000/0x0/0x1bfc00000, data 0x3de7409/0x3f23000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:46.252139+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137101312 unmapped: 28811264 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:47.252322+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137101312 unmapped: 28811264 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1766573 data_alloc: 301989888 data_used: 18698240
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:48.252485+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 190 handle_osd_map epochs [190,191], i have 190, src has [1,191]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137150464 unmapped: 28762112 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:49.252641+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 191 handle_osd_map epochs [190,192], i have 191, src has [1,192]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.160500526s of 10.798882484s, submitted: 145
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137166848 unmapped: 28745728 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 192 heartbeat osd_stat(store_statfs(0x1b61be000/0x0/0x1bfc00000, data 0x3df0173/0x3f2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:50.252796+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137289728 unmapped: 28622848 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54526400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 193 ms_handle_reset con 0x55ff54526400 session 0x55ff574332c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:51.252949+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 193 heartbeat osd_stat(store_statfs(0x1b61b0000/0x0/0x1bfc00000, data 0x3dfb698/0x3f3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137338880 unmapped: 28573696 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554ca400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 194 ms_handle_reset con 0x55ff554ca400 session 0x55ff56c7d2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:52.253131+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137338880 unmapped: 28573696 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1784243 data_alloc: 301989888 data_used: 18710528
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:53.253249+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137355264 unmapped: 28557312 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 195 ms_handle_reset con 0x55ff554cbc00 session 0x55ff555ce5a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:54.253377+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 137388032 unmapped: 28524544 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 196 handle_osd_map epochs [194,196], i have 196, src has [1,196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:55.254200+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146046976 unmapped: 19865600 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:56.254394+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 143712256 unmapped: 22200320 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:57.254561+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 196 heartbeat osd_stat(store_statfs(0x1b4d20000/0x0/0x1bfc00000, data 0x5286bb8/0x53cd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145793024 unmapped: 20119552 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1972653 data_alloc: 301989888 data_used: 20262912
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:58.254702+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145833984 unmapped: 20078592 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:59.254824+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145334272 unmapped: 20578304 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.738337517s of 10.652186394s, submitted: 289
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:00.254966+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145342464 unmapped: 20570112 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:01.255128+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145358848 unmapped: 20553728 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:02.255349+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145358848 unmapped: 20553728 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1967691 data_alloc: 301989888 data_used: 20275200
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 197 heartbeat osd_stat(store_statfs(0x1b4838000/0x0/0x1bfc00000, data 0x536db41/0x54b6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:03.255493+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145506304 unmapped: 20406272 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:04.255620+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145506304 unmapped: 20406272 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:05.255759+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145506304 unmapped: 20406272 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:06.255860+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145522688 unmapped: 20389888 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:07.256086+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cdc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145522688 unmapped: 20389888 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1979819 data_alloc: 301989888 data_used: 20279296
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:08.256253+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 198 ms_handle_reset con 0x55ff554cdc00 session 0x55ff555ceb40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145588224 unmapped: 20324352 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 198 heartbeat osd_stat(store_statfs(0x1b4817000/0x0/0x1bfc00000, data 0x538b1bd/0x54d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5559ac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:09.256392+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489bc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 199 ms_handle_reset con 0x55ff5489bc00 session 0x55ff56f94b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145686528 unmapped: 20226048 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 199 ms_handle_reset con 0x55ff5559ac00 session 0x55ff542b4b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54526400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.946992874s of 10.000387192s, submitted: 169
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:10.256537+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 200 ms_handle_reset con 0x55ff54526400 session 0x55ff554cfa40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489bc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554ca400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 200 ms_handle_reset con 0x55ff554ca400 session 0x55ff54241860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145604608 unmapped: 20307968 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:11.256730+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 200 heartbeat osd_stat(store_statfs(0x1b4808000/0x0/0x1bfc00000, data 0x5395ee4/0x54e6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146685952 unmapped: 19226624 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 201 ms_handle_reset con 0x55ff5489bc00 session 0x55ff557e0b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cdc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 201 ms_handle_reset con 0x55ff554cdc00 session 0x55ff542b4f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 201 heartbeat osd_stat(store_statfs(0x1b4808000/0x0/0x1bfc00000, data 0x5395ee4/0x54e6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:12.256893+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145924096 unmapped: 19988480 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 202 ms_handle_reset con 0x55ff554cbc00 session 0x55ff54571e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2000538 data_alloc: 301989888 data_used: 20303872
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:13.257083+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54526400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145924096 unmapped: 19988480 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 202 ms_handle_reset con 0x55ff54526400 session 0x55ff53661860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489bc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 202 ms_handle_reset con 0x55ff5489bc00 session 0x55ff53661680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 202 heartbeat osd_stat(store_statfs(0x1b47f5000/0x0/0x1bfc00000, data 0x53a46e5/0x54f9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:14.257220+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145989632 unmapped: 19922944 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554ca400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5559ac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 203 ms_handle_reset con 0x55ff554ca400 session 0x55ff542405a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:15.257361+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 203 handle_osd_map epochs [203,204], i have 203, src has [1,204]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 204 ms_handle_reset con 0x55ff5559ac00 session 0x55ff5471f2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146022400 unmapped: 19890176 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54526400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:16.257535+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 204 ms_handle_reset con 0x55ff54526400 session 0x55ff55a3e780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146038784 unmapped: 19873792 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 205 heartbeat osd_stat(store_statfs(0x1b47df000/0x0/0x1bfc00000, data 0x53b4008/0x550e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:17.257678+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489bc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 205 heartbeat osd_stat(store_statfs(0x1b47df000/0x0/0x1bfc00000, data 0x53b4008/0x550e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 206 ms_handle_reset con 0x55ff5489bc00 session 0x55ff544c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 147120128 unmapped: 18792448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2018669 data_alloc: 301989888 data_used: 20316160
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:18.257852+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554ca400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 206 ms_handle_reset con 0x55ff554ca400 session 0x55ff52ad9c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 147120128 unmapped: 18792448 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:19.258045+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 147136512 unmapped: 18776064 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 206 ms_handle_reset con 0x55ff554cbc00 session 0x55ff55a3f2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.400495529s of 10.005897522s, submitted: 211
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:20.258180+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 147144704 unmapped: 18767872 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56ecd400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 207 ms_handle_reset con 0x55ff56ecd400 session 0x55ff53682000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 207 handle_osd_map epochs [206,207], i have 207, src has [1,207]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54526400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 207 ms_handle_reset con 0x55ff54526400 session 0x55ff531c1860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:21.258313+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489bc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 207 ms_handle_reset con 0x55ff5489bc00 session 0x55ff57432d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cb800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 208 ms_handle_reset con 0x55ff554cb800 session 0x55ff5455e960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 147161088 unmapped: 18751488 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:22.258490+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 209 ms_handle_reset con 0x55ff554cbc00 session 0x55ff5471f0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 147185664 unmapped: 18726912 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:23.258693+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2032055 data_alloc: 301989888 data_used: 20316160
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554ca400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 209 heartbeat osd_stat(store_statfs(0x1b579a000/0x0/0x1bfc00000, data 0x53d09da/0x5531000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 148217856 unmapped: 17694720 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 210 ms_handle_reset con 0x55ff554ca400 session 0x55ff5484de00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:24.258858+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54526400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 148291584 unmapped: 17620992 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:25.259004+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 211 ms_handle_reset con 0x55ff54526400 session 0x55ff56f95a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489bc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 211 ms_handle_reset con 0x55ff5489bc00 session 0x55ff56da9e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 148340736 unmapped: 17571840 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:26.259174+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 211 handle_osd_map epochs [211,212], i have 211, src has [1,212]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 148357120 unmapped: 17555456 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:27.259325+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 148447232 unmapped: 17465344 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 ms_handle_reset con 0x55ff55688000 session 0x55ff56da83c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 ms_handle_reset con 0x55ff55689000 session 0x55ff56da8960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:28.259480+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2033543 data_alloc: 301989888 data_used: 20332544
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554ca400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 142311424 unmapped: 23601152 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 ms_handle_reset con 0x55ff554ca400 session 0x55ff54323c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:29.259637+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 heartbeat osd_stat(store_statfs(0x1b71f9000/0x0/0x1bfc00000, data 0x2fbfb04/0x3120000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 142311424 unmapped: 23601152 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:30.259815+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 142311424 unmapped: 23601152 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.371277809s of 10.878194809s, submitted: 212
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:31.260206+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144424960 unmapped: 21487616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:32.260402+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 213 ms_handle_reset con 0x55ff55aea800 session 0x55ff55a3fa40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144392192 unmapped: 21520384 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:33.260602+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1717267 data_alloc: 301989888 data_used: 13426688
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aebc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 213 ms_handle_reset con 0x55ff55aebc00 session 0x55ff55a3fc20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144424960 unmapped: 21487616 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 213 heartbeat osd_stat(store_statfs(0x1b6a1f000/0x0/0x1bfc00000, data 0x2faafd7/0x310d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:34.260740+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144367616 unmapped: 21544960 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:35.260910+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144392192 unmapped: 21520384 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:36.261095+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 215 heartbeat osd_stat(store_statfs(0x1b69ff000/0x0/0x1bfc00000, data 0x2fcabde/0x312e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144400384 unmapped: 21512192 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 215 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:37.261247+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144506880 unmapped: 21405696 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:38.261451+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1732854 data_alloc: 301989888 data_used: 13426688
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144506880 unmapped: 21405696 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:39.261620+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b69db000/0x0/0x1bfc00000, data 0x2fe9e57/0x3150000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144531456 unmapped: 21381120 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:40.261767+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144580608 unmapped: 21331968 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:41.261921+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.597072601s of 10.189755440s, submitted: 217
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 216 handle_osd_map epochs [217,218], i have 216, src has [1,218]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b69b9000/0x0/0x1bfc00000, data 0x300e507/0x3175000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 143589376 unmapped: 22323200 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:42.262156+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 143605760 unmapped: 22306816 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:43.262293+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1740448 data_alloc: 301989888 data_used: 13438976
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 143646720 unmapped: 22265856 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:44.262509+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 143671296 unmapped: 22241280 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b699d000/0x0/0x1bfc00000, data 0x3027398/0x3191000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:45.262708+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 143671296 unmapped: 22241280 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b6995000/0x0/0x1bfc00000, data 0x302f2df/0x3199000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:46.262846+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 142999552 unmapped: 22913024 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:47.263057+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144179200 unmapped: 21733376 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:48.263242+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1751130 data_alloc: 301989888 data_used: 13451264
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144179200 unmapped: 21733376 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 ms_handle_reset con 0x55ff55aeb400 session 0x55ff52ad8000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 heartbeat osd_stat(store_statfs(0x1b696f000/0x0/0x1bfc00000, data 0x3052f7e/0x31be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:49.263427+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 ms_handle_reset con 0x55ff55aeac00 session 0x55ff52ad9860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c96400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 ms_handle_reset con 0x55ff54c96400 session 0x55ff53686b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144187392 unmapped: 21725184 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 heartbeat osd_stat(store_statfs(0x1b6968000/0x0/0x1bfc00000, data 0x3058f4e/0x31c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:50.263592+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c96400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 ms_handle_reset con 0x55ff54c96400 session 0x55ff53660780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144187392 unmapped: 21725184 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:51.263745+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.103576660s of 10.469866753s, submitted: 193
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 ms_handle_reset con 0x55ff55aea800 session 0x55ff52adfe00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144203776 unmapped: 21708800 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 ms_handle_reset con 0x55ff55aeac00 session 0x55ff54350f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:52.263930+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 144211968 unmapped: 21700608 heap: 165912576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 ms_handle_reset con 0x55ff55aeb400 session 0x55ff5471dc20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:53.264087+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1768353 data_alloc: 301989888 data_used: 13467648
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aebc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 ms_handle_reset con 0x55ff55aebc00 session 0x55ff52adfc20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145285120 unmapped: 24829952 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:54.264270+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 heartbeat osd_stat(store_statfs(0x1b5a1f000/0x0/0x1bfc00000, data 0x3f9fc49/0x410f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 48
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145432576 unmapped: 24682496 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:55.264423+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145530880 unmapped: 24584192 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:56.264798+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c96400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 222 ms_handle_reset con 0x55ff54c96400 session 0x55ff52ada3c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145416192 unmapped: 24698880 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:57.264966+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145416192 unmapped: 24698880 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:58.265153+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1785783 data_alloc: 301989888 data_used: 13479936
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145416192 unmapped: 24698880 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:59.265348+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 222 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 145424384 unmapped: 24690688 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b6911000/0x0/0x1bfc00000, data 0x30a57f0/0x321c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:00.265505+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b6911000/0x0/0x1bfc00000, data 0x30a57f0/0x321c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146472960 unmapped: 23642112 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:01.265700+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 223 ms_handle_reset con 0x55ff55aea800 session 0x55ff557e0b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.081674576s of 10.037590981s, submitted: 238
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146472960 unmapped: 23642112 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:02.265962+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146505728 unmapped: 23609344 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 224 ms_handle_reset con 0x55ff55aeac00 session 0x55ff531c1a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:03.266123+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 224 ms_handle_reset con 0x55ff55aeb400 session 0x55ff554cef00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1797447 data_alloc: 301989888 data_used: 13479936
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146530304 unmapped: 23584768 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 225 heartbeat osd_stat(store_statfs(0x1b68f9000/0x0/0x1bfc00000, data 0x30ba894/0x3233000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:04.266253+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 14K writes, 57K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 14K writes, 4566 syncs, 3.17 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 9397 writes, 34K keys, 9397 commit groups, 1.0 writes per commit group, ingest: 26.84 MB, 0.04 MB/s
                                                          Interval WAL: 9397 writes, 3883 syncs, 2.42 writes per sync, written: 0.03 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146546688 unmapped: 23568384 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:05.266435+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 226 heartbeat osd_stat(store_statfs(0x1b68f4000/0x0/0x1bfc00000, data 0x30bccb6/0x3237000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c5dc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 227 ms_handle_reset con 0x55ff55c5dc00 session 0x55ff54574960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146546688 unmapped: 23568384 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:06.266600+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c96400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 227 ms_handle_reset con 0x55ff54c96400 session 0x55ff557e0f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146546688 unmapped: 23568384 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 227 ms_handle_reset con 0x55ff55aeac00 session 0x55ff54323860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:07.266774+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 227 ms_handle_reset con 0x55ff55aeb400 session 0x55ff5471c960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 227 handle_osd_map epochs [227,228], i have 227, src has [1,228]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 228 ms_handle_reset con 0x55ff55aea800 session 0x55ff557e05a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146587648 unmapped: 23527424 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 228 heartbeat osd_stat(store_statfs(0x1b68ef000/0x0/0x1bfc00000, data 0x30c1667/0x323c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:08.266916+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1807509 data_alloc: 301989888 data_used: 13496320
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146587648 unmapped: 23527424 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:09.267094+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c5cc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 229 ms_handle_reset con 0x55ff55c5cc00 session 0x55ff545743c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 229 heartbeat osd_stat(store_statfs(0x1b68ec000/0x0/0x1bfc00000, data 0x30c3ac3/0x3241000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146595840 unmapped: 23519232 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c96400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 229 ms_handle_reset con 0x55ff54c96400 session 0x55ff545745a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:10.267225+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 230 ms_handle_reset con 0x55ff55aea800 session 0x55ff522d2780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146620416 unmapped: 23494656 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:11.267430+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 230 handle_osd_map epochs [230,231], i have 230, src has [1,231]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146620416 unmapped: 23494656 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.730031967s of 10.223917007s, submitted: 121
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 231 handle_osd_map epochs [230,231], i have 231, src has [1,231]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:12.267618+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146694144 unmapped: 23420928 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:13.267771+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1822059 data_alloc: 301989888 data_used: 13496320
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146694144 unmapped: 23420928 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 232 heartbeat osd_stat(store_statfs(0x1b68cb000/0x0/0x1bfc00000, data 0x30df42d/0x3261000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 232 ms_handle_reset con 0x55ff55aeac00 session 0x55ff531c14a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:14.267906+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146694144 unmapped: 23420928 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:15.268059+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 233 heartbeat osd_stat(store_statfs(0x1b68cb000/0x0/0x1bfc00000, data 0x30df42d/0x3261000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146718720 unmapped: 23396352 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 233 heartbeat osd_stat(store_statfs(0x1b68b9000/0x0/0x1bfc00000, data 0x30ef4f6/0x3274000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:16.268221+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 235 ms_handle_reset con 0x55ff55aeb400 session 0x55ff57432b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146808832 unmapped: 23306240 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:17.268377+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146841600 unmapped: 23273472 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:18.268552+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1834587 data_alloc: 301989888 data_used: 13500416
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146849792 unmapped: 23265280 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55c5c000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 236 ms_handle_reset con 0x55ff55c5c000 session 0x55ff544c72c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:19.268719+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 236 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c96400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 237 ms_handle_reset con 0x55ff54c96400 session 0x55ff5455fc20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146857984 unmapped: 23257088 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:20.268867+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146874368 unmapped: 23240704 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:21.269049+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 237 heartbeat osd_stat(store_statfs(0x1b687b000/0x0/0x1bfc00000, data 0x31253df/0x32b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 238 ms_handle_reset con 0x55ff55aea800 session 0x55ff557e1a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146907136 unmapped: 23207936 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 238 heartbeat osd_stat(store_statfs(0x1b6878000/0x0/0x1bfc00000, data 0x312763c/0x32b5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:22.269280+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.305340767s of 10.727466583s, submitted: 134
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146907136 unmapped: 23207936 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:23.269464+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1849095 data_alloc: 301989888 data_used: 13524992
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146907136 unmapped: 23207936 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:24.269604+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 146907136 unmapped: 23207936 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:25.269793+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 147963904 unmapped: 22151168 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:26.269993+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 238 ms_handle_reset con 0x55ff55aeac00 session 0x55ff53662000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149028864 unmapped: 21086208 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:27.270247+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 239 heartbeat osd_stat(store_statfs(0x1b6859000/0x0/0x1bfc00000, data 0x3141d5b/0x32d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149037056 unmapped: 21078016 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:28.270456+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1858551 data_alloc: 301989888 data_used: 13537280
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149159936 unmapped: 20955136 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:29.270660+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149159936 unmapped: 20955136 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 239 heartbeat osd_stat(store_statfs(0x1b6842000/0x0/0x1bfc00000, data 0x3157b5a/0x32eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:30.270800+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 239 ms_handle_reset con 0x55ff55aeb400 session 0x55ff545754a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3667633455' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149184512 unmapped: 20930560 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:31.271053+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b6841000/0x0/0x1bfc00000, data 0x3157bcb/0x32ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff5489bc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 240 ms_handle_reset con 0x55ff5489bc00 session 0x55ff557e1c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149323776 unmapped: 20791296 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff54c96400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:32.271219+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 240 ms_handle_reset con 0x55ff55aea800 session 0x55ff536632c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.629171371s of 10.007884979s, submitted: 121
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 241 ms_handle_reset con 0x55ff55aeac00 session 0x55ff53663e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 241 heartbeat osd_stat(store_statfs(0x1b641d000/0x0/0x1bfc00000, data 0x31793f5/0x3310000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149364736 unmapped: 20750336 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:33.271429+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1874959 data_alloc: 301989888 data_used: 13565952
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 241 heartbeat osd_stat(store_statfs(0x1b641d000/0x0/0x1bfc00000, data 0x31793f5/0x3310000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 242 ms_handle_reset con 0x55ff55aeb400 session 0x55ff5484da40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 242 ms_handle_reset con 0x55ff554cbc00 session 0x55ff53666000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 242 ms_handle_reset con 0x55ff54c96400 session 0x55ff52ad92c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 242 ms_handle_reset con 0x55ff554cbc00 session 0x55ff557e12c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149463040 unmapped: 20652032 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:34.271600+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149463040 unmapped: 20652032 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:35.271813+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 242 handle_osd_map epochs [242,243], i have 242, src has [1,243]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149479424 unmapped: 20635648 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 243 handle_osd_map epochs [242,243], i have 243, src has [1,243]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:36.271968+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 243 ms_handle_reset con 0x55ff55aea800 session 0x55ff56c7d680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b63f8000/0x0/0x1bfc00000, data 0x319bcc3/0x3335000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 243 handle_osd_map epochs [243,244], i have 243, src has [1,244]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149479424 unmapped: 20635648 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:37.272187+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 244 handle_osd_map epochs [244,245], i have 244, src has [1,245]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 ms_handle_reset con 0x55ff55aeac00 session 0x55ff531c0b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149569536 unmapped: 20545536 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 ms_handle_reset con 0x55ff55aeb400 session 0x55ff5430a5a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:38.272341+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1883910 data_alloc: 301989888 data_used: 13578240
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149569536 unmapped: 20545536 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cb800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 ms_handle_reset con 0x55ff554cb800 session 0x55ff55a3f4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:39.272537+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 ms_handle_reset con 0x55ff554cbc00 session 0x55ff544c63c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b63da000/0x0/0x1bfc00000, data 0x31b9ea8/0x3354000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149594112 unmapped: 20520960 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:40.272762+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 ms_handle_reset con 0x55ff55aea800 session 0x55ff54323e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149594112 unmapped: 20520960 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:41.273129+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 149651456 unmapped: 20463616 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:42.273414+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 ms_handle_reset con 0x55ff55aeac00 session 0x55ff522d2d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.327192307s of 10.033041000s, submitted: 186
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 ms_handle_reset con 0x55ff55aeb400 session 0x55ff5455f4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 150798336 unmapped: 19316736 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56ecc400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:43.273612+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1965502 data_alloc: 301989888 data_used: 13594624
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 ms_handle_reset con 0x55ff56ecc400 session 0x55ff5484c960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 ms_handle_reset con 0x55ff554cbc00 session 0x55ff54575860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151199744 unmapped: 18915328 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:44.273756+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 246 handle_osd_map epochs [246,247], i have 246, src has [1,247]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 ms_handle_reset con 0x55ff55aea800 session 0x55ff55a3f860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151273472 unmapped: 18841600 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 heartbeat osd_stat(store_statfs(0x1b5bfe000/0x0/0x1bfc00000, data 0x398e379/0x3b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:45.273953+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 heartbeat osd_stat(store_statfs(0x1b5bfe000/0x0/0x1bfc00000, data 0x398e379/0x3b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 247 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151298048 unmapped: 18817024 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:46.274088+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 248 ms_handle_reset con 0x55ff55aeac00 session 0x55ff5430b2c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 248 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 249 ms_handle_reset con 0x55ff55aeb400 session 0x55ff536870e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56ecc000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 249 ms_handle_reset con 0x55ff56ecc000 session 0x55ff56da92c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151494656 unmapped: 18620416 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:47.274248+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 249 heartbeat osd_stat(store_statfs(0x1b5bc8000/0x0/0x1bfc00000, data 0x39c0868/0x3b63000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 249 heartbeat osd_stat(store_statfs(0x1b5bc4000/0x0/0x1bfc00000, data 0x39c2b28/0x3b66000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151592960 unmapped: 18522112 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:48.274427+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1973386 data_alloc: 301989888 data_used: 13606912
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151412736 unmapped: 18702336 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:49.274632+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 250 ms_handle_reset con 0x55ff55aea800 session 0x55ff54520d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 250 ms_handle_reset con 0x55ff554cbc00 session 0x55ff554d8960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151437312 unmapped: 18677760 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 250 heartbeat osd_stat(store_statfs(0x1b5bac000/0x0/0x1bfc00000, data 0x39dcbb3/0x3b81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:50.274791+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 250 ms_handle_reset con 0x55ff55aeac00 session 0x55ff554cef00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 250 handle_osd_map epochs [250,251], i have 250, src has [1,251]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 251 ms_handle_reset con 0x55ff55aeb400 session 0x55ff554d8d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151470080 unmapped: 18644992 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:51.274987+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 251 handle_osd_map epochs [251,252], i have 251, src has [1,252]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56ecc800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 252 ms_handle_reset con 0x55ff56ecc800 session 0x55ff554ce000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151502848 unmapped: 18612224 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:52.275269+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 252 handle_osd_map epochs [252,253], i have 252, src has [1,253]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.911566734s of 10.014744759s, submitted: 321
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151519232 unmapped: 18595840 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:53.275454+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1994531 data_alloc: 301989888 data_used: 13639680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 253 heartbeat osd_stat(store_statfs(0x1b5b96000/0x0/0x1bfc00000, data 0x39ed12d/0x3b95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 254 ms_handle_reset con 0x55ff554cbc00 session 0x55ff52adba40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151527424 unmapped: 18587648 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:54.275599+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aea800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeac00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 254 ms_handle_reset con 0x55ff55aeac00 session 0x55ff557e1860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 254 ms_handle_reset con 0x55ff55aea800 session 0x55ff53660960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 151560192 unmapped: 18554880 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:55.275756+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56ecc800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 152608768 unmapped: 17506304 heap: 170115072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:56.275908+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56eccc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 254 handle_osd_map epochs [254,255], i have 254, src has [1,255]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 255 ms_handle_reset con 0x55ff56eccc00 session 0x55ff5484d4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 156622848 unmapped: 26099712 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:57.276093+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 152682496 unmapped: 30040064 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:58.276286+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2800447 data_alloc: 301989888 data_used: 13672448
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 152862720 unmapped: 29859840 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:59.276520+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 255 heartbeat osd_stat(store_statfs(0x1adb78000/0x0/0x1bfc00000, data 0xba051b3/0xbbb6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,0,0,1,0,3,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 157114368 unmapped: 25608192 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:00.276681+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166625280 unmapped: 16097280 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:01.276857+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 255 heartbeat osd_stat(store_statfs(0x1a8766000/0x0/0x1bfc00000, data 0x10e16e5a/0x10fc8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,0,1,0,0,2])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 255 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 154157056 unmapped: 28565504 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:02.277071+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.065317154s of 10.057737350s, submitted: 357
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 154222592 unmapped: 28499968 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:03.277219+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56ecd000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4403353 data_alloc: 301989888 data_used: 13684736
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 256 ms_handle_reset con 0x55ff56ecd000 session 0x55ff52ade000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162832384 unmapped: 19890176 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:04.277348+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 256 heartbeat osd_stat(store_statfs(0x19d35b000/0x0/0x1bfc00000, data 0x1c22173c/0x1c3d3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 167526400 unmapped: 15196160 heap: 182722560 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:05.277528+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 256 handle_osd_map epochs [256,257], i have 256, src has [1,257]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165240832 unmapped: 21684224 heap: 186925056 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:06.277667+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff554cbc00 session 0x55ff54570780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 49
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 heartbeat osd_stat(store_statfs(0x197721000/0x0/0x1bfc00000, data 0x21e54b66/0x2200c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,0,0,3,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162725888 unmapped: 24199168 heap: 186925056 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:07.277829+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166887424 unmapped: 28442624 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:08.277992+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff56ecc800 session 0x55ff554ce780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5964970 data_alloc: 301989888 data_used: 13697024
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff55aeb400 session 0x55ff554cf0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff53042c00 session 0x55ff5471e780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff53042800 session 0x55ff56da8780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 158720000 unmapped: 36610048 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:09.278142+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff53042c00 session 0x55ff536821e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 158031872 unmapped: 37298176 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:10.278283+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff554cbc00 session 0x55ff5368f680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 heartbeat osd_stat(store_statfs(0x1b42f4000/0x0/0x1bfc00000, data 0x3a8580e/0x3c3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff55aeb400 session 0x55ff54570b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff56ecc800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 ms_handle_reset con 0x55ff554c6000 session 0x55ff5484d680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 258 ms_handle_reset con 0x55ff56ecc800 session 0x55ff544c6b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 158097408 unmapped: 37232640 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:11.278444+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 258 ms_handle_reset con 0x55ff53042c00 session 0x55ff548614a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 258 ms_handle_reset con 0x55ff554c6000 session 0x55ff52725a40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 158130176 unmapped: 37199872 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:12.278617+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.707076550s of 10.138974190s, submitted: 759
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 158146560 unmapped: 37183488 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:13.278770+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 259 ms_handle_reset con 0x55ff554cbc00 session 0x55ff52ad8f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff55aeb400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2198016 data_alloc: 301989888 data_used: 13721600
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff51ec8400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 259 ms_handle_reset con 0x55ff51ec8400 session 0x55ff54520f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 259 ms_handle_reset con 0x55ff55aeb400 session 0x55ff55a3f4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 158187520 unmapped: 37142528 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:14.278917+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff51ec8400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 260 ms_handle_reset con 0x55ff51ec8400 session 0x55ff55f70000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159244288 unmapped: 36085760 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:15.279065+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 260 heartbeat osd_stat(store_statfs(0x1b5684000/0x0/0x1bfc00000, data 0x3af546e/0x3caa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159244288 unmapped: 36085760 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:16.279224+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159358976 unmapped: 35971072 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:17.279403+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 ms_handle_reset con 0x55ff53042c00 session 0x55ff55f703c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 ms_handle_reset con 0x55ff554c6000 session 0x55ff55f70780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:18.279569+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159842304 unmapped: 35487744 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 ms_handle_reset con 0x55ff554cbc00 session 0x55ff55f71c20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2329654 data_alloc: 301989888 data_used: 13737984
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:19.279757+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159817728 unmapped: 35512320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff51ec9400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 ms_handle_reset con 0x55ff51ec9400 session 0x55ff55f70780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:20.279935+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159817728 unmapped: 35512320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff51ec8400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 ms_handle_reset con 0x55ff51ec8400 session 0x55ff55f703c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 261 handle_osd_map epochs [261,262], i have 261, src has [1,262]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:21.280073+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159907840 unmapped: 35422208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 262 ms_handle_reset con 0x55ff53042c00 session 0x55ff5471da40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 ms_handle_reset con 0x55ff554c6000 session 0x55ff52ad8f00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 handle_osd_map epochs [262,263], i have 263, src has [1,263]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 heartbeat osd_stat(store_statfs(0x1b4631000/0x0/0x1bfc00000, data 0x4b421f5/0x4cfc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:22.280247+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159956992 unmapped: 35373056 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 heartbeat osd_stat(store_statfs(0x1b4631000/0x0/0x1bfc00000, data 0x4b421f5/0x4cfc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 ms_handle_reset con 0x55ff554cbc00 session 0x55ff554ce780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.137942314s of 10.003144264s, submitted: 219
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff52aa2400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 ms_handle_reset con 0x55ff52aa2400 session 0x55ff54570780
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:23.280415+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159973376 unmapped: 35356672 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff51ec8400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2229604 data_alloc: 301989888 data_used: 13766656
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:24.280549+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159989760 unmapped: 35340288 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 263 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 264 ms_handle_reset con 0x55ff51ec8400 session 0x55ff522d23c0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:25.280674+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159563776 unmapped: 35766272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff52aa2400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 264 ms_handle_reset con 0x55ff52aa2400 session 0x55ff554ced20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:26.280946+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 159571968 unmapped: 35758080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 264 heartbeat osd_stat(store_statfs(0x1b55fe000/0x0/0x1bfc00000, data 0x3b73a00/0x3d2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 265 ms_handle_reset con 0x55ff53042c00 session 0x55ff542405a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:27.281088+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 160776192 unmapped: 34553856 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 265 ms_handle_reset con 0x55ff554c6000 session 0x55ff554d8960
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:28.281236+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 265 ms_handle_reset con 0x55ff554cbc00 session 0x55ff531c0b40
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 161013760 unmapped: 34316288 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff51ec8400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2242315 data_alloc: 301989888 data_used: 13774848
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 265 handle_osd_map epochs [265,266], i have 265, src has [1,266]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 50
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 ms_handle_reset con 0x55ff51ec8400 session 0x55ff54575e00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:29.281431+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 161054720 unmapped: 34275328 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff52aa2400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 ms_handle_reset con 0x55ff52aa2400 session 0x55ff52adb0e0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:30.281600+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162332672 unmapped: 32997376 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff53042c00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 ms_handle_reset con 0x55ff53042c00 session 0x55ff5471d4a0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 heartbeat osd_stat(store_statfs(0x1b557c000/0x0/0x1bfc00000, data 0x3bf1a52/0x3db2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:31.281758+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162504704 unmapped: 32825344 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:32.282007+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162504704 unmapped: 32825344 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:33.282226+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162521088 unmapped: 32808960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2256214 data_alloc: 301989888 data_used: 13783040
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:34.282418+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162562048 unmapped: 32768000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.641526222s of 11.394283295s, submitted: 193
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 heartbeat osd_stat(store_statfs(0x1b5557000/0x0/0x1bfc00000, data 0x3c16a95/0x3dd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:35.282573+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 162496512 unmapped: 32833536 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 heartbeat osd_stat(store_statfs(0x1b5557000/0x0/0x1bfc00000, data 0x3c16a95/0x3dd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:36.282784+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 163577856 unmapped: 31752192 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:37.282916+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 163782656 unmapped: 31547392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 267 heartbeat osd_stat(store_statfs(0x1b54ef000/0x0/0x1bfc00000, data 0x3c7c72f/0x3e3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:38.283120+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 163823616 unmapped: 31506432 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2270270 data_alloc: 301989888 data_used: 13795328
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:39.283324+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 163823616 unmapped: 31506432 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:40.283490+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 164134912 unmapped: 31195136 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:41.283660+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165281792 unmapped: 30048256 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 267 heartbeat osd_stat(store_statfs(0x1b5445000/0x0/0x1bfc00000, data 0x3d25bfa/0x3ee9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:42.283857+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165298176 unmapped: 30031872 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:43.284050+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165257216 unmapped: 30072832 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2271442 data_alloc: 301989888 data_used: 13795328
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:44.284191+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165363712 unmapped: 29966336 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.585178375s of 10.127472878s, submitted: 125
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:45.284366+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 29908992 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 268 heartbeat osd_stat(store_statfs(0x1b53c5000/0x0/0x1bfc00000, data 0x3da4f3a/0x3f69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:46.284522+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165150720 unmapped: 30179328 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:47.284718+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 165150720 unmapped: 30179328 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 268 heartbeat osd_stat(store_statfs(0x1b53b6000/0x0/0x1bfc00000, data 0x3db1b28/0x3f77000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:48.284859+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166215680 unmapped: 29114368 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2285040 data_alloc: 301989888 data_used: 13807616
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 268 heartbeat osd_stat(store_statfs(0x1b536b000/0x0/0x1bfc00000, data 0x3dfe2c1/0x3fc3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:49.285069+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166371328 unmapped: 28958720 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:50.285257+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166371328 unmapped: 28958720 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:51.285459+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166469632 unmapped: 28860416 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:52.285763+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 27828224 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:53.285942+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166461440 unmapped: 28868608 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2293902 data_alloc: 301989888 data_used: 13819904
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:54.286136+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166608896 unmapped: 28721152 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.570891380s of 10.024077415s, submitted: 156
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 270 heartbeat osd_stat(store_statfs(0x1b52f8000/0x0/0x1bfc00000, data 0x3e6f56b/0x4036000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:55.286328+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166625280 unmapped: 28704768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:56.286490+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166789120 unmapped: 28540928 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:57.286644+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 166789120 unmapped: 28540928 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:58.286851+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 167968768 unmapped: 27361280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2311652 data_alloc: 301989888 data_used: 13819904
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:59.287092+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 167968768 unmapped: 27361280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:00.287220+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 168075264 unmapped: 27254784 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 270 heartbeat osd_stat(store_statfs(0x1b40ae000/0x0/0x1bfc00000, data 0x3f13fee/0x40df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:01.287405+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 168189952 unmapped: 27140096 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b40ae000/0x0/0x1bfc00000, data 0x3f13fee/0x40df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:02.287642+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 168361984 unmapped: 26968064 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:03.287811+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 168361984 unmapped: 26968064 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:04.287973+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 168607744 unmapped: 26722304 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2332628 data_alloc: 301989888 data_used: 13832192
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.458841324s of 10.280069351s, submitted: 98
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:05.288123+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 169656320 unmapped: 25673728 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:06.288290+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b4036000/0x0/0x1bfc00000, data 0x3f8850a/0x4158000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 169656320 unmapped: 25673728 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:07.288457+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 169664512 unmapped: 25665536 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:08.288658+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 169664512 unmapped: 25665536 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:09.288836+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 169664512 unmapped: 25665536 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2329580 data_alloc: 301989888 data_used: 13832192
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:10.288976+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 170082304 unmapped: 25247744 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:11.289170+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b3fa7000/0x0/0x1bfc00000, data 0x4015168/0x41e6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 172081152 unmapped: 23248896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:12.289372+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 171991040 unmapped: 23339008 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:13.289523+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b3f26000/0x0/0x1bfc00000, data 0x4098215/0x4268000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 171999232 unmapped: 23330816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:14.289718+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 172015616 unmapped: 23314432 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2348278 data_alloc: 301989888 data_used: 13832192
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.651040077s of 10.120527267s, submitted: 116
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:15.289873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b3f0f000/0x0/0x1bfc00000, data 0x40b1e83/0x427f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 172023808 unmapped: 23306240 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b3ecd000/0x0/0x1bfc00000, data 0x40f2bb7/0x42c0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:16.290082+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 172285952 unmapped: 23044096 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b3ecd000/0x0/0x1bfc00000, data 0x40f2bb7/0x42c0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:17.290233+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b3ecd000/0x0/0x1bfc00000, data 0x40f2bb7/0x42c0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 174383104 unmapped: 20946944 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b2d04000/0x0/0x1bfc00000, data 0x411de55/0x42ea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:18.290407+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 174399488 unmapped: 20930560 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:19.290532+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 174579712 unmapped: 20750336 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2352182 data_alloc: 301989888 data_used: 13832192
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:20.290685+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 174702592 unmapped: 20627456 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:21.290826+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 174718976 unmapped: 20611072 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:22.291056+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 174284800 unmapped: 21045248 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:23.291248+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 174284800 unmapped: 21045248 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b2c44000/0x0/0x1bfc00000, data 0x41e08f7/0x43aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:24.291458+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 175333376 unmapped: 19996672 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2366950 data_alloc: 301989888 data_used: 13832192
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.478492737s of 10.008082390s, submitted: 111
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:25.291603+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 175448064 unmapped: 19881984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:26.291803+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 175448064 unmapped: 19881984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:27.291985+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 175456256 unmapped: 19873792 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:28.292143+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b2ba4000/0x0/0x1bfc00000, data 0x427daaf/0x4448000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 175702016 unmapped: 19628032 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:29.292302+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 175702016 unmapped: 19628032 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2381336 data_alloc: 301989888 data_used: 13832192
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:30.292458+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 272 heartbeat osd_stat(store_statfs(0x1b2b25000/0x0/0x1bfc00000, data 0x42fa942/0x44c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177938432 unmapped: 17391616 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:31.292622+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178200576 unmapped: 17129472 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:32.292822+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178200576 unmapped: 17129472 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:33.292982+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178200576 unmapped: 17129472 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:34.293132+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178208768 unmapped: 17121280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2382488 data_alloc: 301989888 data_used: 13844480
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:35.293289+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 272 heartbeat osd_stat(store_statfs(0x1b2afd000/0x0/0x1bfc00000, data 0x4324440/0x44f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.234015465s of 10.705220222s, submitted: 129
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178208768 unmapped: 17121280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:36.293471+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178208768 unmapped: 17121280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:37.293641+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177471488 unmapped: 17858560 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:38.293811+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177471488 unmapped: 17858560 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:39.294004+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177471488 unmapped: 17858560 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2386614 data_alloc: 301989888 data_used: 13856768
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:40.294217+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b2af7000/0x0/0x1bfc00000, data 0x43267d5/0x44f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177471488 unmapped: 17858560 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:41.294354+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177487872 unmapped: 17842176 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:42.294565+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177487872 unmapped: 17842176 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:43.294729+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b2af7000/0x0/0x1bfc00000, data 0x4326936/0x44f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177487872 unmapped: 17842176 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b2af7000/0x0/0x1bfc00000, data 0x4326936/0x44f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:44.294914+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177496064 unmapped: 17833984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2386924 data_alloc: 301989888 data_used: 13856768
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:45.295112+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.771763802s of 10.012133598s, submitted: 57
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177496064 unmapped: 17833984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:46.295267+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177496064 unmapped: 17833984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b2afa000/0x0/0x1bfc00000, data 0x43269f7/0x44f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:47.295532+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177496064 unmapped: 17833984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:48.295763+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177504256 unmapped: 17825792 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:49.296009+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b26f9000/0x0/0x1bfc00000, data 0x4326afa/0x44f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177504256 unmapped: 17825792 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2388434 data_alloc: 301989888 data_used: 13856768
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:50.296215+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b26f8000/0x0/0x1bfc00000, data 0x4326b95/0x44f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177504256 unmapped: 17825792 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:51.296394+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177504256 unmapped: 17825792 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:52.296639+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177504256 unmapped: 17825792 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:53.296816+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b26f7000/0x0/0x1bfc00000, data 0x4326cf1/0x44f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177512448 unmapped: 17817600 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:54.297069+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177512448 unmapped: 17817600 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2391426 data_alloc: 301989888 data_used: 13856768
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:55.297233+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.128740311s of 10.274168015s, submitted: 29
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177528832 unmapped: 17801216 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:56.297392+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554c6000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177553408 unmapped: 17776640 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:57.297540+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 177561600 unmapped: 17768448 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b2698000/0x0/0x1bfc00000, data 0x43843f1/0x4555000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:58.297710+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178733056 unmapped: 16596992 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:59.297854+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178839552 unmapped: 16490496 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2415710 data_alloc: 301989888 data_used: 13856768
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:00.297997+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b2614000/0x0/0x1bfc00000, data 0x440a431/0x45d9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178937856 unmapped: 16392192 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:01.298086+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b25e9000/0x0/0x1bfc00000, data 0x4435c87/0x4604000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 273 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 178315264 unmapped: 17014784 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:02.298276+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 179380224 unmapped: 15949824 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:03.298470+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 179380224 unmapped: 15949824 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:04.298604+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 179937280 unmapped: 15392768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2417786 data_alloc: 301989888 data_used: 13869056
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:05.298750+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b2542000/0x0/0x1bfc00000, data 0x44db946/0x46ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.243766785s of 10.016719818s, submitted: 178
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 180068352 unmapped: 15261696 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:06.298944+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 180068352 unmapped: 15261696 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 51
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:07.299105+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 181256192 unmapped: 14073856 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:08.299274+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b24bb000/0x0/0x1bfc00000, data 0x45646e6/0x4733000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 181256192 unmapped: 14073856 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b24bb000/0x0/0x1bfc00000, data 0x45646e6/0x4733000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:09.299407+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b24af000/0x0/0x1bfc00000, data 0x45700b0/0x473f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2437336 data_alloc: 301989888 data_used: 13869056
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 181280768 unmapped: 14049280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:10.299561+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 180453376 unmapped: 14876672 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:11.299707+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 180453376 unmapped: 14876672 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:12.299909+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 181567488 unmapped: 13762560 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:13.300082+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 275 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 181084160 unmapped: 14245888 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:14.300188+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2444540 data_alloc: 301989888 data_used: 13881344
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 181190656 unmapped: 14139392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:15.300349+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 275 heartbeat osd_stat(store_statfs(0x1b2392000/0x0/0x1bfc00000, data 0x468a781/0x485c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 275 handle_osd_map epochs [275,276], i have 275, src has [1,276]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.402547836s of 10.062718391s, submitted: 168
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 181313536 unmapped: 14016512 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:16.300528+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182525952 unmapped: 12804096 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:17.300676+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182525952 unmapped: 12804096 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:18.300848+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182542336 unmapped: 12787712 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:19.300989+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2455304 data_alloc: 301989888 data_used: 13893632
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182657024 unmapped: 12673024 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:20.301124+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182657024 unmapped: 12673024 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:21.301288+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 276 heartbeat osd_stat(store_statfs(0x1b22ee000/0x0/0x1bfc00000, data 0x472e2a1/0x4900000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 276 handle_osd_map epochs [276,277], i have 276, src has [1,277]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182665216 unmapped: 12664832 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:22.301486+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b22e9000/0x0/0x1bfc00000, data 0x4730565/0x4904000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182665216 unmapped: 12664832 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:23.301650+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182665216 unmapped: 12664832 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:24.301824+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b22e8000/0x0/0x1bfc00000, data 0x47305cc/0x4905000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2460136 data_alloc: 301989888 data_used: 13905920
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182673408 unmapped: 12656640 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:25.301996+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b22e8000/0x0/0x1bfc00000, data 0x473056b/0x4904000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182673408 unmapped: 12656640 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.967475891s of 10.258387566s, submitted: 70
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:26.302257+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182673408 unmapped: 12656640 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:27.302396+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182673408 unmapped: 12656640 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:28.302561+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b22e8000/0x0/0x1bfc00000, data 0x47306f7/0x4904000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182697984 unmapped: 12632064 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:29.302718+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2460974 data_alloc: 301989888 data_used: 13905920
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182706176 unmapped: 12623872 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:30.302891+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 277 handle_osd_map epochs [277,278], i have 277, src has [1,278]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182722560 unmapped: 12607488 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:31.303092+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 278 heartbeat osd_stat(store_statfs(0x1b22e5000/0x0/0x1bfc00000, data 0x4732dee/0x4909000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182738944 unmapped: 12591104 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:32.303264+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 278 heartbeat osd_stat(store_statfs(0x1b22e5000/0x0/0x1bfc00000, data 0x4732dee/0x4909000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182583296 unmapped: 12746752 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:33.303415+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff554cbc00
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182591488 unmapped: 12738560 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:34.303557+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2469952 data_alloc: 301989888 data_used: 13922304
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 182607872 unmapped: 12722176 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:35.303710+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 278 heartbeat osd_stat(store_statfs(0x1b22e4000/0x0/0x1bfc00000, data 0x4732e78/0x490a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 278 handle_osd_map epochs [279,279], i have 278, src has [1,279]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 279 ms_handle_reset con 0x55ff554cbc00 session 0x55ff522d2d20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff52aa2000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183050240 unmapped: 12279808 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:36.303844+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 279 handle_osd_map epochs [279,280], i have 279, src has [1,280]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.870776176s of 10.462518692s, submitted: 387
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 280 heartbeat osd_stat(store_statfs(0x1b22de000/0x0/0x1bfc00000, data 0x4735224/0x490e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183066624 unmapped: 12263424 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 280 ms_handle_reset con 0x55ff52aa2000 session 0x55ff56546000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 52
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:37.303977+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 184115200 unmapped: 11214848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:38.304118+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 184115200 unmapped: 11214848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:39.304263+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2474866 data_alloc: 301989888 data_used: 13934592
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183066624 unmapped: 12263424 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:40.304388+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183074816 unmapped: 12255232 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 280 handle_osd_map epochs [281,281], i have 280, src has [1,281]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:41.304516+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 281 heartbeat osd_stat(store_statfs(0x1b22de000/0x0/0x1bfc00000, data 0x473798f/0x4910000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 281 handle_osd_map epochs [281,282], i have 281, src has [1,282]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183107584 unmapped: 12222464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:42.304794+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183107584 unmapped: 12222464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:43.304944+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 282 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183107584 unmapped: 12222464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:44.305262+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2482208 data_alloc: 301989888 data_used: 13950976
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183107584 unmapped: 12222464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:45.305398+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 282 handle_osd_map epochs [282,283], i have 282, src has [1,283]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 283 heartbeat osd_stat(store_statfs(0x1b22d6000/0x0/0x1bfc00000, data 0x473c22f/0x4918000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183132160 unmapped: 12197888 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:46.305543+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 283 handle_osd_map epochs [283,284], i have 283, src has [1,284]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.410529137s of 10.057308197s, submitted: 200
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183140352 unmapped: 12189696 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:47.305713+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183140352 unmapped: 12189696 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:48.305891+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183140352 unmapped: 12189696 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:49.306118+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2494044 data_alloc: 301989888 data_used: 13975552
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183140352 unmapped: 12189696 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 284 heartbeat osd_stat(store_statfs(0x1b22cc000/0x0/0x1bfc00000, data 0x4740a91/0x4921000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:50.306263+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183140352 unmapped: 12189696 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:51.306439+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183156736 unmapped: 12173312 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 284 handle_osd_map epochs [285,285], i have 284, src has [1,285]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:52.306622+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 285 handle_osd_map epochs [286,286], i have 285, src has [1,286]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 286 heartbeat osd_stat(store_statfs(0x1b22ca000/0x0/0x1bfc00000, data 0x4740ac6/0x4920000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:53.306773+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 286 heartbeat osd_stat(store_statfs(0x1b22c0000/0x0/0x1bfc00000, data 0x474528d/0x4929000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:54.306921+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2499414 data_alloc: 301989888 data_used: 13987840
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:55.307144+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:56.307309+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 286 heartbeat osd_stat(store_statfs(0x1b22c3000/0x0/0x1bfc00000, data 0x4745258/0x4929000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:57.307454+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.749951363s of 10.961165428s, submitted: 92
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:58.307599+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 286 heartbeat osd_stat(store_statfs(0x1b22c2000/0x0/0x1bfc00000, data 0x47452f3/0x492a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:59.307768+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2498988 data_alloc: 301989888 data_used: 13987840
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183197696 unmapped: 12132352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:00.308000+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 286 heartbeat osd_stat(store_statfs(0x1b22c3000/0x0/0x1bfc00000, data 0x47452c3/0x4929000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 286 handle_osd_map epochs [287,287], i have 286, src has [1,287]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 286 handle_osd_map epochs [287,287], i have 287, src has [1,287]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183214080 unmapped: 12115968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:01.308240+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 287 handle_osd_map epochs [288,288], i have 287, src has [1,288]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183230464 unmapped: 12099584 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:02.308455+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 288 handle_osd_map epochs [287,288], i have 288, src has [1,288]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183230464 unmapped: 12099584 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:03.308630+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183230464 unmapped: 12099584 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:04.308771+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 288 heartbeat osd_stat(store_statfs(0x1b22b9000/0x0/0x1bfc00000, data 0x4749add/0x4933000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 288 heartbeat osd_stat(store_statfs(0x1b22b9000/0x0/0x1bfc00000, data 0x4749add/0x4933000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2508686 data_alloc: 301989888 data_used: 14012416
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 183230464 unmapped: 12099584 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:05.308960+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 288 handle_osd_map epochs [288,289], i have 288, src has [1,289]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 289 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 289 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185344000 unmapped: 9986048 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:06.309155+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185344000 unmapped: 9986048 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:07.309339+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185344000 unmapped: 9986048 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:08.309544+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.524138451s of 10.851389885s, submitted: 115
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185352192 unmapped: 9977856 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:09.309665+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2509780 data_alloc: 301989888 data_used: 14016512
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185352192 unmapped: 9977856 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:10.309801+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 289 heartbeat osd_stat(store_statfs(0x1b22b8000/0x0/0x1bfc00000, data 0x474c0c1/0x4935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:11.309959+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185360384 unmapped: 9969664 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 289 handle_osd_map epochs [290,290], i have 289, src has [1,290]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:12.310192+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185368576 unmapped: 9961472 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 handle_osd_map epochs [290,290], i have 290, src has [1,290]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 handle_osd_map epochs [290,290], i have 290, src has [1,290]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:13.310380+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185376768 unmapped: 9953280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:14.310567+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185376768 unmapped: 9953280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2515396 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:15.310713+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185376768 unmapped: 9953280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b2000/0x0/0x1bfc00000, data 0x474e50f/0x4939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:16.310871+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185376768 unmapped: 9953280 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:17.311057+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185384960 unmapped: 9945088 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:18.311225+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185384960 unmapped: 9945088 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:19.311382+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185384960 unmapped: 9945088 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.846098900s of 11.009695053s, submitted: 48
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2515194 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:20.311546+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185393152 unmapped: 9936896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:21.311716+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185393152 unmapped: 9936896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b3000/0x0/0x1bfc00000, data 0x474e5df/0x4939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:22.311920+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185393152 unmapped: 9936896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:23.312078+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185401344 unmapped: 9928704 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b3000/0x0/0x1bfc00000, data 0x474e645/0x493a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:24.312264+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185409536 unmapped: 9920512 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2516658 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:25.312462+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:26.312674+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b3000/0x0/0x1bfc00000, data 0x474e6aa/0x493a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:27.312863+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:28.313046+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:29.313195+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.272034645s of 10.403295517s, submitted: 30
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:30.313344+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2516518 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:31.313508+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b4000/0x0/0x1bfc00000, data 0x474e712/0x4939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:32.313754+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185417728 unmapped: 9912320 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:33.313933+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:34.314108+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:35.314357+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2516052 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 ms_handle_reset con 0x55ff554c6400 session 0x55ff555cf860
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff52aa2000
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:36.314535+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b5000/0x0/0x1bfc00000, data 0x474e7d9/0x4938000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:37.314714+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:38.314863+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b5000/0x0/0x1bfc00000, data 0x474e7a6/0x4938000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:39.315004+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:40.315300+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2515652 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185425920 unmapped: 9904128 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.917982101s of 11.008728981s, submitted: 19
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:41.315445+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:42.315658+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b3000/0x0/0x1bfc00000, data 0x474e905/0x4939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:43.315872+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:44.316114+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:45.316328+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2518076 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:46.316510+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:47.316663+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b5000/0x0/0x1bfc00000, data 0x474e902/0x4939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:48.316825+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 9871360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:49.316985+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185237504 unmapped: 10092544 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:50.317109+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2517420 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185237504 unmapped: 10092544 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:51.317302+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185237504 unmapped: 10092544 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:52.317504+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185237504 unmapped: 10092544 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.497300148s of 11.614668846s, submitted: 23
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b2000/0x0/0x1bfc00000, data 0x474e9c7/0x493a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:53.317665+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185245696 unmapped: 10084352 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:54.317840+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185253888 unmapped: 10076160 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:55.318068+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2519962 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185253888 unmapped: 10076160 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:56.318244+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185253888 unmapped: 10076160 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:57.318396+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:58.318607+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b4000/0x0/0x1bfc00000, data 0x474ea01/0x493a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:59.318839+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:00.318992+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2519914 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:01.319199+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:02.319461+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b4000/0x0/0x1bfc00000, data 0x474ea54/0x493a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.027684212s of 10.087369919s, submitted: 11
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:03.319660+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:04.319833+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:05.319999+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2519738 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:06.320182+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:07.320363+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b3000/0x0/0x1bfc00000, data 0x474e9d2/0x493a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:08.320564+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:09.320720+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:10.320892+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2519914 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:11.321121+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:12.321314+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b3000/0x0/0x1bfc00000, data 0x474e9d2/0x493a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [1,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:13.321461+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:14.321654+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.415963173s of 12.484015465s, submitted: 14
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:15.321837+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2518614 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:16.321984+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:17.322149+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b8000/0x0/0x1bfc00000, data 0x474e99e/0x4936000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:18.322315+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:19.322738+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:20.322927+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2517218 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:21.323076+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:22.323308+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b8000/0x0/0x1bfc00000, data 0x474e99e/0x4936000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:23.323497+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:24.323673+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:25.323873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2517218 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.423652649s of 10.456267357s, submitted: 6
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:26.324075+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:27.324340+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:28.324528+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185294848 unmapped: 10035200 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:29.324713+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185262080 unmapped: 10067968 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b6000/0x0/0x1bfc00000, data 0x474ead4/0x4938000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:30.324865+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2520210 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:31.325007+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b6000/0x0/0x1bfc00000, data 0x474ead4/0x4938000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:32.325265+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185270272 unmapped: 10059776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:33.325398+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:34.325571+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:35.325723+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2523522 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:36.325941+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185286656 unmapped: 10043392 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.474199295s of 11.511046410s, submitted: 7
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:37.326109+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b2000/0x0/0x1bfc00000, data 0x474ec95/0x493b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:38.326283+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:39.326477+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:40.326716+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2525290 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:41.326902+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b2000/0x0/0x1bfc00000, data 0x474ec95/0x493b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:42.327176+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:43.327377+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:44.327600+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185311232 unmapped: 10018816 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:45.327828+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2524986 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185319424 unmapped: 10010624 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:46.327994+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185319424 unmapped: 10010624 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b22b3000/0x0/0x1bfc00000, data 0x474ec95/0x493b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:47.328191+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185319424 unmapped: 10010624 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:48.328364+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185319424 unmapped: 10010624 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:49.328556+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185319424 unmapped: 10010624 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:50.328729+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.189265251s of 13.201974869s, submitted: 2
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2531262 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185327616 unmapped: 10002432 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:51.328910+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185327616 unmapped: 10002432 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b228c000/0x0/0x1bfc00000, data 0x4774cf8/0x4962000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:52.329118+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185327616 unmapped: 10002432 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:53.329322+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185556992 unmapped: 9773056 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:54.329512+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185556992 unmapped: 9773056 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:55.329678+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2542196 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186785792 unmapped: 8544256 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:56.329818+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187006976 unmapped: 8323072 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b2224000/0x0/0x1bfc00000, data 0x47dc010/0x49c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:57.329981+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185556992 unmapped: 9773056 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:58.330175+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185556992 unmapped: 9773056 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:59.330378+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185556992 unmapped: 9773056 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:00.330513+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2541776 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185769984 unmapped: 9560064 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b21c9000/0x0/0x1bfc00000, data 0x48381a0/0x4a25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:01.330674+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185769984 unmapped: 9560064 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:02.330873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b21c9000/0x0/0x1bfc00000, data 0x48381a0/0x4a25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185925632 unmapped: 9404416 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b21c9000/0x0/0x1bfc00000, data 0x48381a0/0x4a25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:03.331076+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.834505081s of 13.023785591s, submitted: 44
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185933824 unmapped: 9396224 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b21b7000/0x0/0x1bfc00000, data 0x484a8ea/0x4a37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,0,0,1])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:04.331273+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185942016 unmapped: 9388032 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:05.331452+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2542224 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185942016 unmapped: 9388032 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:06.331627+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185942016 unmapped: 9388032 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:07.331834+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 185942016 unmapped: 9388032 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:08.332065+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186023936 unmapped: 9306112 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:09.332221+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186023936 unmapped: 9306112 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b2178000/0x0/0x1bfc00000, data 0x4889391/0x4a76000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:10.332392+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2543472 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186097664 unmapped: 9232384 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:11.332530+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186097664 unmapped: 9232384 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:12.332712+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186097664 unmapped: 9232384 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b2160000/0x0/0x1bfc00000, data 0x48a1050/0x4a8e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:13.332849+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.298421860s of 10.042017937s, submitted: 12
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186277888 unmapped: 9052160 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:14.333070+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186507264 unmapped: 8822784 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:15.333204+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2547760 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186662912 unmapped: 8667136 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b2105000/0x0/0x1bfc00000, data 0x48fc473/0x4ae9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:16.333373+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186662912 unmapped: 8667136 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:17.333537+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186712064 unmapped: 8617984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:18.333661+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 7561216 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:19.333841+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 7561216 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b20d7000/0x0/0x1bfc00000, data 0x492a3c3/0x4b17000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:20.334063+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2551728 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187957248 unmapped: 7372800 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:21.334221+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187957248 unmapped: 7372800 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:22.334397+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187957248 unmapped: 7372800 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:23.334566+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186712064 unmapped: 8617984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:24.334790+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.703664780s of 10.883479118s, submitted: 37
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186712064 unmapped: 8617984 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x4985a99/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:25.334939+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2557024 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 186900480 unmapped: 8429568 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:26.335107+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187187200 unmapped: 8142848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:27.335262+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187195392 unmapped: 8134656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:28.335412+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b2028000/0x0/0x1bfc00000, data 0x49dcc02/0x4bc6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187195392 unmapped: 8134656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:29.335552+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187195392 unmapped: 8134656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:30.335682+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2552402 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187195392 unmapped: 8134656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:31.335879+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187195392 unmapped: 8134656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:32.336088+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187203584 unmapped: 8126464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:33.336212+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187211776 unmapped: 8118272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:34.336378+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b2029000/0x0/0x1bfc00000, data 0x49dd0af/0x4bc5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187211776 unmapped: 8118272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.753453255s of 10.886549950s, submitted: 23
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:35.336571+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2557250 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187318272 unmapped: 8011776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:36.336747+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187318272 unmapped: 8011776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:37.336921+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ffa000/0x0/0x1bfc00000, data 0x4a0c933/0x4bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187318272 unmapped: 8011776 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:38.337134+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187432960 unmapped: 7897088 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:39.337311+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187432960 unmapped: 7897088 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:40.337493+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2557250 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:41.337665+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:42.337908+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:43.338124+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ffa000/0x0/0x1bfc00000, data 0x4a0c933/0x4bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:44.338305+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ffa000/0x0/0x1bfc00000, data 0x4a0c933/0x4bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:45.338506+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ffa000/0x0/0x1bfc00000, data 0x4a0c933/0x4bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2557250 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:46.338715+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:47.338873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187441152 unmapped: 7888896 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.868812561s of 12.892441750s, submitted: 7
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:48.339077+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187449344 unmapped: 7880704 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1fde000/0x0/0x1bfc00000, data 0x4a276b4/0x4c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:49.339217+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187449344 unmapped: 7880704 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1fde000/0x0/0x1bfc00000, data 0x4a276b4/0x4c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:50.339374+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1fd7000/0x0/0x1bfc00000, data 0x4a2e396/0x4c17000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2558762 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187449344 unmapped: 7880704 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:51.339514+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187457536 unmapped: 7872512 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:52.339780+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187457536 unmapped: 7872512 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:53.339975+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187457536 unmapped: 7872512 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:54.340168+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187457536 unmapped: 7872512 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:55.340300+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2569782 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187777024 unmapped: 7553024 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1b5d000/0x0/0x1bfc00000, data 0x4aa631e/0x4c91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:56.340500+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187850752 unmapped: 7479296 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:57.340696+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187850752 unmapped: 7479296 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:58.340895+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187850752 unmapped: 7479296 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.830674171s of 10.960289955s, submitted: 26
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:59.341059+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 7561216 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:00.341202+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2570058 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 7561216 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:01.341374+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1b23000/0x0/0x1bfc00000, data 0x4ae0a36/0x4ccb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 7561216 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:02.341588+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 7561216 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:03.341760+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1b23000/0x0/0x1bfc00000, data 0x4ae0a36/0x4ccb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187981824 unmapped: 7348224 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:04.341952+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 187998208 unmapped: 7331840 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:05.342131+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2573238 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188006400 unmapped: 7323648 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:06.342350+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188014592 unmapped: 7315456 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:07.342518+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188014592 unmapped: 7315456 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:08.342706+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab5000/0x0/0x1bfc00000, data 0x4b4f0bf/0x4d39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:09.342851+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:10.342962+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2571540 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.968422890s of 12.208720207s, submitted: 26
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:11.343146+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:12.343328+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:13.343489+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab6000/0x0/0x1bfc00000, data 0x4b4f0ee/0x4d38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:14.343620+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:15.343781+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2569984 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:16.343974+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:17.344156+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:18.344358+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab7000/0x0/0x1bfc00000, data 0x4b4f11d/0x4d37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:19.344538+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188121088 unmapped: 7208960 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:20.344675+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2569984 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab7000/0x0/0x1bfc00000, data 0x4b4f11d/0x4d37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:21.344835+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:22.345089+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.688426971s of 11.736072540s, submitted: 9
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:23.345285+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:24.345428+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:25.345638+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2571560 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:26.345901+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab6000/0x0/0x1bfc00000, data 0x4b4f1b8/0x4d38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:27.346123+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab6000/0x0/0x1bfc00000, data 0x4b4f1b8/0x4d38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188129280 unmapped: 7200768 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:28.346256+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188137472 unmapped: 7192576 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:29.346431+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188137472 unmapped: 7192576 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:30.346558+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab6000/0x0/0x1bfc00000, data 0x4b4f1b8/0x4d38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2573152 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab5000/0x0/0x1bfc00000, data 0x4b4f253/0x4d39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188137472 unmapped: 7192576 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:31.346715+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab5000/0x0/0x1bfc00000, data 0x4b4f253/0x4d39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188137472 unmapped: 7192576 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:32.346978+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188145664 unmapped: 7184384 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:33.347158+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab7000/0x0/0x1bfc00000, data 0x4b4f1e7/0x4d37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188145664 unmapped: 7184384 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:34.347308+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188145664 unmapped: 7184384 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.249052048s of 12.292942047s, submitted: 9
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:35.351102+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b1ab7000/0x0/0x1bfc00000, data 0x4b4f1e7/0x4d37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2571788 data_alloc: 301989888 data_used: 14028800
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188145664 unmapped: 7184384 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 290 handle_osd_map epochs [290,291], i have 290, src has [1,291]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:36.351298+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:37.351506+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:38.351762+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 291 heartbeat osd_stat(store_statfs(0x1b1ab2000/0x0/0x1bfc00000, data 0x4b516d8/0x4d3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:39.352090+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:40.352305+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2576166 data_alloc: 301989888 data_used: 14041088
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:41.352519+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:42.352782+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:43.352971+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 291 heartbeat osd_stat(store_statfs(0x1b1ab2000/0x0/0x1bfc00000, data 0x4b516d8/0x4d3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188162048 unmapped: 7168000 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:44.353189+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188170240 unmapped: 7159808 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:45.353390+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2576686 data_alloc: 301989888 data_used: 14041088
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188170240 unmapped: 7159808 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:46.353540+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 291 handle_osd_map epochs [291,292], i have 291, src has [1,292]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.550593376s of 11.688674927s, submitted: 38
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188178432 unmapped: 7151616 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:47.353683+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188178432 unmapped: 7151616 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:48.353842+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 292 heartbeat osd_stat(store_statfs(0x1b1aae000/0x0/0x1bfc00000, data 0x4b53a17/0x4d40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 292 heartbeat osd_stat(store_statfs(0x1b1aae000/0x0/0x1bfc00000, data 0x4b53a17/0x4d40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188178432 unmapped: 7151616 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:49.353993+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188178432 unmapped: 7151616 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:50.354157+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2581424 data_alloc: 301989888 data_used: 14053376
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188178432 unmapped: 7151616 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:51.354317+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188178432 unmapped: 7151616 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:52.354495+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 292 heartbeat osd_stat(store_statfs(0x1b1aad000/0x0/0x1bfc00000, data 0x4b53ab2/0x4d41000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188186624 unmapped: 7143424 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:53.354679+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 292 heartbeat osd_stat(store_statfs(0x1b1aae000/0x0/0x1bfc00000, data 0x4b53ae1/0x4d40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188186624 unmapped: 7143424 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:54.354890+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188186624 unmapped: 7143424 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:55.355065+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2580236 data_alloc: 301989888 data_used: 14053376
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188194816 unmapped: 7135232 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:56.355228+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _renew_subs
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 292 handle_osd_map epochs [293,293], i have 292, src has [1,293]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:57.355426+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 293 heartbeat osd_stat(store_statfs(0x1b1aaf000/0x0/0x1bfc00000, data 0x4b53b10/0x4d3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:58.355627+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:59.355823+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.136222839s of 13.244468689s, submitted: 61
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:00.356070+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584246 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 293 heartbeat osd_stat(store_statfs(0x1b1aaa000/0x0/0x1bfc00000, data 0x4b55f37/0x4d43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:01.356213+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:02.356456+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:03.356634+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 293 heartbeat osd_stat(store_statfs(0x1b1aaa000/0x0/0x1bfc00000, data 0x4b55f37/0x4d43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:04.356831+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:05.357101+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584246 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188203008 unmapped: 7127040 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:06.357252+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 293 handle_osd_map epochs [293,294], i have 293, src has [1,294]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:07.357526+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:08.357689+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:09.357856+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:10.357977+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:11.358083+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:12.358274+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:13.358405+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 ms_handle_reset con 0x55ff554c7000 session 0x55ff54861680
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: handle_auth_request added challenge on 0x55ff51ec8400
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:14.358581+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:15.358753+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:16.358895+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:17.359081+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:18.359249+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:19.359433+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:20.359610+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:21.359807+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:22.360100+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:23.360281+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:24.360477+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:25.360624+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:26.360850+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:27.361078+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:28.361253+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:29.361423+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:30.361553+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:31.361720+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 7110656 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:32.361927+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:33.362185+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:34.362380+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:35.362541+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:36.362755+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:37.362906+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:38.363086+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:39.363314+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 7102464 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:40.363482+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:41.363641+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:42.363921+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:43.364119+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:44.364286+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:45.364438+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:46.364634+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:47.364801+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:48.364949+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:49.365123+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:50.365307+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:51.368510+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:52.369338+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:53.369513+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:54.369680+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:55.369873+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188235776 unmapped: 7094272 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:56.370124+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:57.370319+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:58.370493+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:59.370657+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:00.370847+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587072 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:01.371056+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa6000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:02.371260+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:03.371463+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 7086080 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:04.371570+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 7077888 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:05.371711+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 65.370056152s of 65.444564819s, submitted: 34
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 ms_handle_reset con 0x55ff554c6000 session 0x55ff56c7cd20
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:06.371901+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188530688 unmapped: 6799360 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Got map version 53
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:07.372113+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:08.372349+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:09.376236+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:10.376409+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:11.376607+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:12.376844+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:13.377089+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:14.377287+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:15.377500+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:16.377772+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:17.377971+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:18.378152+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:19.378321+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:20.378514+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:21.378744+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:22.378987+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:23.379197+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:24.379438+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:25.379615+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:26.379789+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:27.379949+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:28.380140+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:29.380274+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:30.380434+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:31.380568+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:32.380742+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:33.380882+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:34.381168+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188538880 unmapped: 6791168 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:35.381300+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188547072 unmapped: 6782976 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:36.381440+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188547072 unmapped: 6782976 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:37.381617+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:38.381803+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:39.381975+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:40.382125+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:41.382250+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:42.382459+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:43.382610+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:44.382750+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:45.382931+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:46.383111+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188563456 unmapped: 6766592 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:47.383288+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 6758400 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:48.383507+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 6758400 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:49.383684+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 6758400 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:50.383853+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 6758400 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:51.383999+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 6758400 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:52.384210+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 6758400 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:53.384391+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:54.384560+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:55.384742+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:56.384868+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:57.385002+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:58.385143+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:59.385274+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:00.385406+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:01.385554+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:02.385716+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:03.385859+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:04.385995+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188579840 unmapped: 6750208 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 23K writes, 88K keys, 23K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.01 MB/s
                                                          Cumulative WAL: 23K writes, 7977 syncs, 2.88 writes per sync, written: 0.08 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 8547 writes, 31K keys, 8547 commit groups, 1.0 writes per commit group, ingest: 31.78 MB, 0.05 MB/s
                                                          Interval WAL: 8547 writes, 3411 syncs, 2.51 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:05.386084+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 6758400 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'config diff' '{prefix=config diff}'
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'config show' '{prefix=config show}'
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'counter dump' '{prefix=counter dump}'
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:06.386244+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1aa7000/0x0/0x1bfc00000, data 0x4b581db/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: bluestore.MempoolThread(0x55ff50edbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586368 data_alloc: 301989888 data_used: 14065664
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'counter schema' '{prefix=counter schema}'
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188628992 unmapped: 6701056 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:07.386405+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: prioritycache tune_memory target: 5709084876 mapped: 188211200 unmapped: 7118848 heap: 195330048 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: tick
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_tickets
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:08.386539+0000)
Nov 28 10:18:38 np0005538513.localdomain ceph-osd[31557]: do_command 'log dump' '{prefix=log dump}'
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Nov 28 10:18:38 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1087429981' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3300551025' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1467414868' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3956784888' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:39 np0005538513.localdomain ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.3 total, 600.0 interval
                                                          Cumulative writes: 23K writes, 89K keys, 23K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 23K writes, 8020 syncs, 2.95 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 7418 writes, 27K keys, 7418 commit groups, 1.0 writes per commit group, ingest: 29.62 MB, 0.05 MB/s
                                                          Interval WAL: 7418 writes, 2803 syncs, 2.65 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: pgmap v821: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3667633455' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/4176004123' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1087429981' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1301512143' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1212911426' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1939566436' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3300551025' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2340776439' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1467414868' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3245672563' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1500572328' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1135811599' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3956784888' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2059175748' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1176939805' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Nov 28 10:18:39 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3622627622' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 10:18:40 np0005538513.localdomain podman[238687]: time="2025-11-28T10:18:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:18:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:18:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 28 10:18:40 np0005538513.localdomain podman[238687]: @ - - [28/Nov/2025:10:18:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19281 "" "Go-http-client/1.1"
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 10:18:40 np0005538513.localdomain podman[335756]: 2025-11-28 10:18:40.194582364 +0000 UTC m=+0.168353562 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd utilization"} v 0)
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/12388310' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain podman[335756]: 2025-11-28 10:18:40.235323535 +0000 UTC m=+0.209094703 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 10:18:40 np0005538513.localdomain podman[335757]: 2025-11-28 10:18:40.275126236 +0000 UTC m=+0.246193460 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6)
Nov 28 10:18:40 np0005538513.localdomain podman[335757]: 2025-11-28 10:18:40.311244145 +0000 UTC m=+0.282311359 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: tmp-crun.twuXw8.mount: Deactivated successfully.
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 10:18:40 np0005538513.localdomain podman[335803]: 2025-11-28 10:18:40.333458142 +0000 UTC m=+0.132656547 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:18:40 np0005538513.localdomain podman[335803]: 2025-11-28 10:18:40.371357546 +0000 UTC m=+0.170555961 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 10:18:40 np0005538513.localdomain sudo[335867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:18:40 np0005538513.localdomain sudo[335867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:40 np0005538513.localdomain sudo[335867]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:40 np0005538513.localdomain sudo[335889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 10:18:40 np0005538513.localdomain sudo[335889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1176939805' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1832128950' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/4055277002' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3622627622' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2432312567' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2924469151' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1527008646' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3302747548' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/12388310' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1580882676' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1816962051' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:40 np0005538513.localdomain systemd[1]: Starting Hostname Service...
Nov 28 10:18:41 np0005538513.localdomain systemd[1]: Started Hostname Service.
Nov 28 10:18:41 np0005538513.localdomain podman[336083]: 2025-11-28 10:18:41.335156619 +0000 UTC m=+0.073397983 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12)
Nov 28 10:18:41 np0005538513.localdomain podman[336083]: 2025-11-28 10:18:41.450177789 +0000 UTC m=+0.188419213 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: from='client.49641 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: pgmap v822: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: from='client.49653 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: from='client.69884 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: from='client.49659 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1502712426' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: from='client.69890 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1699203364' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "quorum_status"} v 0)
Nov 28 10:18:41 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2583937855' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain sudo[335889]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain sudo[336296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:18:42 np0005538513.localdomain sudo[336296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:42 np0005538513.localdomain sudo[336296]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:42 np0005538513.localdomain sudo[336314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:18:42 np0005538513.localdomain sudo[336314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "versions"} v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2050263147' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.49665 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.69902 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.49671 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.59887 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.69914 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.59896 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.49680 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.59908 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.69926 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.59914 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.49692 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.59920 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2583937855' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/4004329074' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2050263147' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/2192705607' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3570554908' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Nov 28 10:18:42 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2928277428' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:42 np0005538513.localdomain sudo[336314]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:43 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:43.041 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:43 np0005538513.localdomain sudo[336469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:18:43 np0005538513.localdomain sudo[336469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:43 np0005538513.localdomain sudo[336469]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/233060911' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.69944 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.49704 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: pgmap v823: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.59938 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.69959 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.49719 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.59959 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.69977 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2928277428' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/4269415353' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2018896911' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/233060911' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/327747030' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/663276669' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config dump"} v 0)
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1533518817' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2196629395' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.59974 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.69998 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.59992 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3969960704' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1533518817' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/313248130' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3322486298' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:44 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2196629395' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df"} v 0)
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3575363919' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "fs dump"} v 0)
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2831850694' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: pgmap v824: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: from='client.49779 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: from='client.70061 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/4007421546' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3575363919' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1868358059' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/3429851138' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/2831850694' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:45 np0005538513.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 28 10:18:45 np0005538513.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 28 10:18:45 np0005538513.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 28 10:18:45 np0005538513.localdomain kernel: cfg80211: failed to load regulatory.db
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "fs ls"} v 0)
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/754854177' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: from='client.60064 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1835945986' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2490676851' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/754854177' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/84128671' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/1333046251' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mds stat"} v 0)
Nov 28 10:18:46 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4095040716' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.144174) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127144273, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 981, "num_deletes": 253, "total_data_size": 792297, "memory_usage": 811096, "flush_reason": "Manual Compaction"}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127151493, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 781055, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42976, "largest_seqno": 43956, "table_properties": {"data_size": 775772, "index_size": 2498, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14794, "raw_average_key_size": 21, "raw_value_size": 763937, "raw_average_value_size": 1113, "num_data_blocks": 102, "num_entries": 686, "num_filter_entries": 686, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764325092, "oldest_key_time": 1764325092, "file_creation_time": 1764325127, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 7359 microseconds, and 3397 cpu microseconds.
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.151541) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 781055 bytes OK
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.151563) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.153599) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.153621) EVENT_LOG_v1 {"time_micros": 1764325127153614, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.153642) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 786773, prev total WAL file size 786773, number of live WAL files 2.
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.154106) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353438' seq:72057594037927935, type:22 .. '6B760031383032' seq:0, type:0; will stop at (end)
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(762KB)], [78(18MB)]
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127154142, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 20115741, "oldest_snapshot_seqno": -1}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 14866 keys, 19020550 bytes, temperature: kUnknown
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127240175, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 19020550, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18935986, "index_size": 46319, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37189, "raw_key_size": 399681, "raw_average_key_size": 26, "raw_value_size": 18684236, "raw_average_value_size": 1256, "num_data_blocks": 1708, "num_entries": 14866, "num_filter_entries": 14866, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764325127, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.240468) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 19020550 bytes
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.242322) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 233.6 rd, 220.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 18.4 +0.0 blob) out(18.1 +0.0 blob), read-write-amplify(50.1) write-amplify(24.4) OK, records in: 15394, records dropped: 528 output_compression: NoCompression
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.242349) EVENT_LOG_v1 {"time_micros": 1764325127242337, "job": 48, "event": "compaction_finished", "compaction_time_micros": 86124, "compaction_time_cpu_micros": 35795, "output_level": 6, "num_output_files": 1, "total_output_size": 19020550, "num_input_records": 15394, "num_output_records": 14866, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127242578, "job": 48, "event": "table_file_deletion", "file_number": 80}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127245253, "job": 48, "event": "table_file_deletion", "file_number": 78}
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.154056) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.245373) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.245379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.245380) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.245382) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:18:47.245383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump"} v 0)
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1031978832' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: pgmap v825: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.49812 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.70112 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2998493452' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/4095040716' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1639533449' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.60094 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1031978832' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/1419418075' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.49830 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:47 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2593816835' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 28 10:18:48 np0005538513.localdomain nova_compute[279673]: 2025-11-28 10:18:48.044 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:18:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:18:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:18:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:18:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:18:48 np0005538513.localdomain openstack_network_exporter[240658]: ERROR   10:18:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:18:48 np0005538513.localdomain openstack_network_exporter[240658]: 
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3549249674' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: from='client.70136 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3549249674' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: pgmap v826: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/352639843' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/603246220' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: from='client.49842 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:48 np0005538513.localdomain ceph-mon[292954]: from='client.60118 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd dump"} v 0)
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1496163689' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3399191420' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.70151 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.49848 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.107:0/2136366162' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.70163 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/1496163689' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.60136 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.108:0/319471767' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Nov 28 10:18:49 np0005538513.localdomain ceph-mon[292954]: from='client.? 172.18.0.106:0/3399191420' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch